Search engine optimization (SEO)


Machine-readable sites
- make the bots happy

Search engine optimization (SEO) is the virtual display window for your website, e-commerce store or web platform: it ensures you get noticed and attract visitors. Technical SEO is a cornerstone of CMS and eCommerce software.

To drive traffic, sites need to be machine-readable for search engine crawlers, AI agents and other bots. This is done by giving machines context about what the site is about in a way they understand, and by indicating the structure of the site and which pages contain information that should be crawled. Beyond that, results on search engine result pages and page shares on social media need curated information and have to look good.

Performance optimization
- set the foundation right

The foundation of good SEO, especially for Google, is a high-performance and well-functioning website. Cradle software, a server-side rendered system, is built for a low time to first byte (TTFB), even without CDN off-loading or aggressive cache services connected, which simplifies shipping solutions with excellent Core Web Vitals metrics.

  • 10x performance
    Our benchmark tests show that our application handles 10x more requests than comparable PHP and JavaScript stacks.
  • Image handling and scaling
    The system supports WebP, JPEG, PNG and GIF images and always allows adding alt text. Images are scaled down with image filters in the templating language or via the filename. More on filters
  • JS-free functionality
    Cradle websites are built to work without JavaScript as well, since serving plain HTML and CSS is still the most efficient way for browsers to render a page.
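To make the image handling above concrete, the sketch below shows what a scaled image could render to in HTML. The template filter syntax in the comment is purely hypothetical, and the file name and alt text are placeholders, not actual Cradle syntax:

```html
<!-- hypothetical template input: {{ image | size: '600x' }} with alt text set in admin -->
<img src="/images/product-600x.webp" alt="Blue ceramic mug" width="600">
```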

SEO features

A CMS and eCommerce platform needs to provide specific technical features for SEO, such as sitemaps, robots.txt files and IndexNow support, as well as make it easy to create content for SERPs and understandable URLs.

  • Automatically generated XML sitemaps
    A sitemap is a file that lists all published content on a site and helps crawlers, such as Googlebot, crawl the content more efficiently. An open site always has a sitemap of the published content located at the URL https://mydomain.com/sitemap.xml
  • robots.txt file
    A robots.txt file contains instructions for bots that crawl your site. With Cradle CMS the file is generated from the content given in admin under Site settings, making it easy to control. The robots.txt file can be used to manage the crawl budget and give search engines directions on how the website should be crawled. Go to guide on site settings.
  • IndexNow support
    Search engines need to be notified of new and updated content, which is done with the IndexNow functionality. To enable this, a host key is generated for each site. Read our guide on IndexNow
  • SEO metafields in admin
    All pages, blogs and articles (as well as collections and products for eCommerce) have a section where content used for the search engine results page (SERP), such as meta titles, descriptions and keywords, can be added.
  • Descriptive and modifiable URLs
    The URLs are automatically created from the title but can easily be modified. The slugs used by the system can also be changed.
  • URL redirect management
    301 redirects to use when deleting a page or changing its URL.
  • HTTPS support
    Automatic HTTPS certificates from Let’s Encrypt and other ACME-compliant providers.
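To illustrate the sitemap feature above, a minimal sitemap in the sitemaps.org XML format might look as follows; the URL and date are placeholders, and the exact fields emitted are not specified here:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mydomain.com/blogs/news/first-post</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```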
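As an example of the robots.txt feature, a typical generated file could look like the sketch below; the rules are illustrative, and the actual content comes from what is entered under Site settings:

```
User-agent: *
Disallow: /admin/
Sitemap: https://mydomain.com/sitemap.xml
```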
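The IndexNow protocol itself is public: a single changed URL is reported with a GET request carrying the page URL and the site's host key, and batches are submitted as a JSON POST body. The Python sketch below only builds those requests and does not send them; it illustrates the protocol, not Cradle's internal implementation:

```python
import json
from urllib.parse import urlencode

def indexnow_get_url(page_url: str, key: str) -> str:
    """Endpoint URL for notifying IndexNow of a single changed page."""
    return "https://api.indexnow.org/indexnow?" + urlencode({"url": page_url, "key": key})

def indexnow_post_payload(host: str, key: str, urls: list) -> str:
    """JSON body for submitting a batch of changed URLs in one POST."""
    return json.dumps({"host": host, "key": key, "urlList": urls})
```

Sending the actual notification is then a plain HTTP call with these values.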

Themes with code freedom
- for modifications and customisations

Prebuilt themes, which are optional to use, come with full code access so a company or organization can modify, improve and customise them. Theme templating functions, such as image size filters, give control over the images used on the web.

  • Context with structured data
    Themes are built with structured data snippets included, which can be modified for specific use cases.
  • Metadata for social media
    Themes have Open Graph data automatically created from the content and site information.
  • Canonical links
    A canonical URL indicates to crawlers which URL is the original when the same content is reachable at duplicate URLs. With the tag {{ canonicalUrl }} the system automatically generates the canonical URL. This tag is usually included in the head of all themes.
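As an illustration of such a structured data snippet, a schema.org JSON-LD block of the kind a theme can embed is sketched below; all values are placeholders rather than actual theme output:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example product",
  "image": "https://mydomain.com/images/example.webp",
  "offers": { "@type": "Offer", "price": "19.00", "priceCurrency": "EUR" }
}
</script>
```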
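For the Open Graph metadata, the tags a theme renders into the head typically look like the sketch below; the values are placeholders standing in for the page content and site information:

```html
<meta property="og:title" content="Page title">
<meta property="og:description" content="A short description shown when the page is shared.">
<meta property="og:image" content="https://mydomain.com/images/share.webp">
<meta property="og:url" content="https://mydomain.com/page">
```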
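Using the {{ canonicalUrl }} tag, the line a theme includes in the head is simply:

```html
<link rel="canonical" href="{{ canonicalUrl }}">
```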