SEO stands for Search Engine Optimization. Over time, search engines have improved: their algorithms have evolved and "learned" to better recognize the meaning behind each search, not just the keywords it contains. As a result, search engine optimization has gone through a metamorphosis and eventually become optimization for humans.
At its core, SEO is the process of improving a website's visibility in search engines through "organic" (algorithmic) results, which involves optimizing the site's HTML code, structure, and text. Advertising the site and building backlinks are other SEO strategies. The earlier (or higher on the results page) and the more often a site appears in the results, the more visitors it is likely to receive from the search engine, giving it a stronger presence on the Internet. Optimization can target various types of search, such as searches in a specific area or region, as well as image, video, or news searches.
To reach the section, log in to your control panel >
A canonical tag (aka "rel canonical") is a way of telling search engines that a specific URL represents the master copy of a page. Using the canonical tag prevents problems caused by identical or "duplicate" content appearing on multiple URLs. Practically speaking, the canonical tag tells search engines which version of a URL you want to appear in search results.
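For example, if the same product page is reachable at several URLs, you could point them all to one master copy with a tag like this in each page's `<head>` (the URL below is a hypothetical example, not a real store address):

```html
<!-- Placed in the <head> of every duplicate version of the page -->
<link rel="canonical" href="https://yourstore.cloudcart.net/product/blue-t-shirt" />
```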
Deindex Filtered & Sorted Pages
Set a noindex directive for any parameter-based page that doesn't add SEO value. This tag prevents search engines from indexing the page. URLs with a "noindex" tag are also likely to be crawled less frequently, and if the tag remains in place for a long time, Google will eventually stop following the page's links.
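A noindex directive is typically set with a robots meta tag in the page's `<head>`, for example:

```html
<!-- Tells search engine crawlers not to include this page in their index -->
<meta name="robots" content="noindex" />
```

The same directive can also be delivered as an `X-Robots-Tag` HTTP response header, which is useful for non-HTML resources.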
A good XML sitemap acts as a roadmap of your website, leading Google to all of your important pages. XML sitemaps are good for SEO because they allow Google to quickly find your essential pages.
The Sitemap is also automatically generated when you create your store in the CloudCart system. The Sitemap is updated automatically every time you add or delete pages from your store.
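For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this (the URLs are hypothetical examples, not the file CloudCart actually generates):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to find -->
  <url>
    <loc>https://yourstore.cloudcart.net/</loc>
  </url>
  <url>
    <loc>https://yourstore.cloudcart.net/product/blue-t-shirt</loc>
  </url>
</urlset>
```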
Manage your Robots.txt file
Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users.
Robots.txt is a fundamental section of your SEO. You can control which parts of your online store are viewed and read by search engine bots by changing the contents of the robots.txt file. This file is automatically generated when you create a store on CloudCart.
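A typical robots.txt for an online store might look like the sketch below (the disallowed paths are hypothetical examples, not CloudCart's actual defaults):

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of pages with no SEO value
Disallow: /cart
Disallow: /checkout
# Point crawlers to the sitemap
Sitemap: https://yourstore.cloudcart.net/sitemap.xml
```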
Be careful when editing the robots.txt file, as changes may prevent search engines from indexing some or all of your store's pages!
Social media sharing
In this section, you can control the social media buttons in your store.
RSS stands for "really simple syndication" or, depending on who you ask, "rich site summary." At its heart, RSS is just a set of simple text files with basic, regularly updated information: news pieces, articles, that sort of thing. That stripped-down content is usually plugged into a "feed reader," an interface that quickly converts the RSS text files into a stream of the latest updates from around the web.
RSS feeds let customers keep track of news from their favourite sites, in this case everything you post on your online store! Subscribing to the store's RSS feed frees customers from having to check for new content themselves; instead, their feed reader monitors your store and notifies them of updates.
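Under the hood, an RSS feed is an XML document. A minimal RSS 2.0 feed for a store might look like this (the titles and URLs are hypothetical examples, not the feed CloudCart actually generates):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <!-- Describes the feed as a whole -->
    <title>Your Store</title>
    <link>http://yourstore.cloudcart.net/</link>
    <description>Latest products and news from Your Store</description>
    <!-- One <item> per update, e.g. a newly added product -->
    <item>
      <title>New product: Blue T-Shirt</title>
      <link>http://yourstore.cloudcart.net/product/blue-t-shirt</link>
    </item>
  </channel>
</rss>
```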
Here you can:
- Choose the number of products and updates that will appear in your RSS feed. Enter the number in the blank box you see above.
- Check out the RSS feed link on your online store and inform your customers about it. The link always starts with http://yourstore.cloudcart.net/
You can copy the link by double-clicking on it.