What is hidden website content? Any content that searchbots have to interact with in order to display is considered hidden website content. Why this is such an issue is pretty simple: if searchbots can't see the content, the website doesn't rank for that content at all. Hidden website text can occur in a few ways, through both whitehat and blackhat means.

Understanding what to look for and the intentions behind it can help determine an actionable solution. Overall, hiding content can have unintended SEO outcomes, often leading to poor keyword rankings or no rankings at all. Proper on-page SEO audits will help catch issues like hidden content before they harm website rankings.

Blackhat SEO Reasons For Hiding Text On Websites

Hiding text on a website was considered content cloaking when the content was concealed either by matching the text color to the background or by using a CSS rule to hide it. The negative SEO tactic used to work because the content was invisible to users but still readable by searchbots: visitors never saw the huge lists of loosely related keywords, yet searchbots did. In the late 90s and early 2000s, searchbots merely crawled websites for keywords and would rank a page just from finding a keyword match. Since then, searchbots have become far more sophisticated and weigh many more variables before ranking a website.

Manipulative Intentions To Hide Content

  • Keyword Spamming – It was thought that listing keywords in bulk would encourage better keyword rankings.
  • Hiding Backlinks – Backlinks have long been a major ranking factor, so some websites would hide links to unrelated websites in order to take part in larger link networks.
  • Competitor Branding – Highly unethical; websites would list competitor brand terms in an attempt to rank for searches aimed at those competitors.
  • Misspelled Keywords – SEOs even hid incorrectly spelled keywords in order to rank for the misspelled searches users actually type.

Whitehat SEO Reasons For Hidden Content

Many website elements follow design trends and offer varying levels of interactivity. Because users must interact with these elements to reveal their content, and searchbots do not interact on the same level, the content inside interactive elements like accordions, tabs, and similar HTML components can be hidden from searchbots.
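
One quick way to check whether tab or accordion content reaches searchbots is to fetch the raw HTML, without running any JavaScript, and look for a phrase that only appears inside the interactive element. Below is a minimal sketch using only the Python standard library; the URL and phrase are placeholders you would swap for your own page.

```python
# Minimal sketch: check whether text that lives inside a tab/accordion
# is present in the raw HTML that a searchbot would receive.
# The URL and PHRASE below are placeholders -- substitute your own.
import urllib.request

URL = "https://example.com/some-page"           # hypothetical page to test
PHRASE = "text that only appears inside a tab"  # hypothetical accordion text

req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    raw_html = resp.read().decode("utf-8", errors="replace")

if PHRASE.lower() in raw_html.lower():
    print("Phrase found in the raw HTML -- crawlers can see it without interaction.")
else:
    print("Phrase missing from the raw HTML -- it is likely injected by JavaScript "
          "and may be invisible to some searchbots.")
```

If the phrase is already in the raw HTML, the accordion is only hiding content visually, which is generally fine; if it only appears after user interaction or script execution, the content may never be indexed.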

What Are Some Legit Reasons For Hidden Content?

  • Navigation – Sometimes less is more, especially with the number of links on a webpage. This is why we use main menus and drop-downs; otherwise, websites would be walls of text and links.
  • Paid Content – Commonly called a “paywall.” The New York Times is a good example, allowing a peek at content on the first click and hiding it thereafter behind a paid subscription.
  • Responsive Design – When a website adapts to the device viewing it, that is responsive design. Content is often hidden on smaller devices so users are not overloaded; since desktops have more screen room, trimming some of that content on mobile keeps user engagement optimized.
  • Multiple Web Browser Support – Not all web browsers work the same; older browsers need different code than newer ones. As a result, some content has to be coded separately for older browsers and hidden from newer browsers in order to function. The content is the same; only the way it talks to the browser changes.

How To Discover Manipulative Hidden Website Content

There are a few different ways to effectively hide website content. Some of them are pretty crafty, while others are basically sleight of hand. When Google Search Console management and website indexing optimization are done properly, these types of issues will stand out.

  • Same Colored Text As Background – A very simple way to hide content is to make it the same color as the background. But a simple highlight of all text on the page can uncover this method.
  • CSS Rules To Hide Text – CSS rules such as display:none, visibility:hidden, height:0, width:0, or text-indent:-1000px can hide text as well, but they can be uncovered by disabling CSS and JavaScript or by peeking at the source code (see the CSS-scanning sketch after this list).
  • User Agent Detection – Using server-side code that reads the browser’s user agent, savvy site owners can detect searchbots and serve different content depending on which searchbot is detected. This method hides the content effectively, and it only becomes visible by changing your browser’s identity using plugins or extensions, or by requesting the page with a searchbot user agent (see the user agent sketch after this list).
  • IP Address Detection – Much like the user agent method above, when certain IP addresses are detected, content can be blocked or swapped out entirely. Using proxies is a good way to sidestep this check.
  • Reverse And Forward DNS Detection – Since IP addresses can be spoofed with proxies and the like, some sites verify a visitor with a reverse DNS lookup followed by a forward lookup before deciding what to serve. Cloaking can then work much the same way paywalled content lets you view a page once but forces you to pay to view it again. A way around this is to use the Google Translate feature to check for hidden content (see the DNS sketch after this list).
  • User Website Wrangling – Content can also be hidden through misdirection. Some marketers do this to push users into goal funnels: using navigation and styling, a website can steer visitors into sections that expose only a limited menu, usually customized to the subject matter, all in an effort to keep the user inside that section of the website.
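
The CSS rules mentioned in the list above leave fingerprints in the page source. Here is a minimal sketch that flags the common patterns, assuming the styles live inline or in on-page style blocks (external stylesheets would need to be fetched and scanned the same way); the URL is a placeholder.

```python
# Minimal sketch: scan a page's HTML for CSS patterns commonly used to hide text.
# Only inline styles and on-page <style> blocks are checked here; external
# stylesheets would need to be downloaded and scanned the same way.
import re
import urllib.request

URL = "https://example.com/some-page"  # hypothetical page to audit

SUSPICIOUS_CSS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"height\s*:\s*0",
    r"width\s*:\s*0",
    r"text-indent\s*:\s*-\d+",  # large negative indents push text off-screen
]

req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(req).read().decode("utf-8", errors="replace")

for pattern in SUSPICIOUS_CSS:
    hits = re.findall(pattern, html, flags=re.IGNORECASE)
    if hits:
        print(f"{pattern!r}: {len(hits)} occurrence(s) -- review by hand, since")
        print("  legitimate elements (menus, tabs, modals) use these rules too.")
```

A hit is not proof of cloaking on its own; drop-downs, tabs, and modals legitimately use these rules, so each match needs a manual look.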
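For the user agent detection bullet, a simple test is to request the same URL with a normal browser user agent and a Googlebot-style user agent and compare the responses. A minimal sketch follows; the URL is a placeholder, and sites that also verify crawler IPs will not reveal anything to this check alone.

```python
# Minimal sketch: fetch the same URL with a browser user agent and a
# Googlebot-style user agent, then compare the responses for differences.
# Note: sites that also verify crawler IPs won't be fooled by this alone.
import hashlib
import urllib.request

URL = "https://example.com/some-page"  # hypothetical page to test

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req).read()

responses = {name: fetch(URL, ua) for name, ua in USER_AGENTS.items()}

for name, body in responses.items():
    print(f"{name}: {len(body)} bytes, sha256 {hashlib.sha256(body).hexdigest()[:12]}")

if responses["browser"] != responses["crawler"]:
    print("Responses differ -- the server may be serving user-agent-specific content.")
else:
    print("Responses match -- no user-agent cloaking detected by this simple check.")
```

Small differences (timestamps, ad markup) can also make the bodies differ, so treat a mismatch as a prompt to diff the two responses rather than as a verdict.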
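The reverse and forward DNS check mentioned above works by resolving the visiting IP address to a hostname, confirming the hostname belongs to googlebot.com or google.com, then resolving that hostname back to an IP and making sure it matches the original. Here is a minimal sketch of that verification; the sample IP is only an illustration, and you would substitute an address from your own server logs.

```python
# Minimal sketch: the reverse + forward DNS check a server can use to confirm
# that a visitor claiming to be Googlebot really is Googlebot.
import socket

def is_verified_googlebot(ip_address: str) -> bool:
    try:
        # Reverse lookup: IP -> hostname
        hostname, _, _ = socket.gethostbyaddr(ip_address)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: hostname -> IP, which must match the original
        return socket.gethostbyname(hostname) == ip_address
    except (socket.herror, socket.gaierror):
        return False

# Placeholder IP for illustration -- substitute a visitor IP from your logs.
print(is_verified_googlebot("66.249.66.1"))
```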

Unsure If Your Website Is Hiding Content?

Contact SEOByMichael to schedule an SEO audit and let’s develop an SEO strategy plan for website success!