
Website Architecture: Users vs. Googlebot & What the Pros Think

The architecture of a website is the foundation of a good user experience (UX). When you build a website, there are a few things to keep in mind for it to be successful:

  1. User goals — What are your users or audience there to do?
  2. Business goals — What would you like your users or audience to do?
  3. Search engine optimization (SEO) — What page names resonate with search queries so you can connect users to the right information?

Since the beginning of the internet, website developers and webmasters have tried to please search engine bots with various keyword and linking strategies, from keyword stuffing to an overreliance on backlinks. But website content hasn’t been the only victim of these tactics; design and development, website architecture and site navigation have all been shaped by what are now considered “black hat” SEO techniques.

But as search algorithms become more advanced and more focused on delivering high quality content that users are seeking, how does SEO still fit in the picture, if it does at all?

SEO’s Early Roots

SEO was introduced in the early 1990s. As web developers created web pages and websites across the World Wide Web, it was important that those pages be easy to find, and thus the search engines and their crawlers were born.

[Screenshot: the Archie search engine]

The first search engine, Archie (short for “archive”), debuted in 1990 in Montreal. Yahoo! hit the search scene in 1994, AltaVista followed in 1995, and Google made its premiere in 1998. Google’s introduction was groundbreaking in part because of PageRank, which ranked pages based on the links pointing to them.

All of this shaped how internet users searched for information and led to the changes we see today. Algorithms that once matched keywords from the query to keywords on the page rewarded websites that delivered perfect matches with higher rankings. But as black hat SEO techniques — such as keyword stuffing or keyword cloaking — became popular, search engines changed course.

With the rollout of Google’s Penguin algorithm in 2012, the game changed. Search engines pivoted to rewarding well-written, user-friendly websites. If your site was still practicing keyword stuffing, cloaking, or hidden redirects after this shift, you likely saw a dramatic dip in your search rankings.

What Search Crawlers Look For

Today, search engine crawlers look for a few specific pieces of information on any webpage to understand its context and rank it among similar results. And nearly any website on the internet contains the staple elements that make up what we see and understand as web pages. These include:

  1. URL – The uniform resource locator is unique to each page of a website and connects the pages together. When structured in line with the site’s navigation, URLs connect pages hierarchically, for example /services/, /services/roofing/, /services/roofing/residential/.
  2. HTML page title – This 55 to 60 character string gives a unique keyword or phrase, business name, and sometimes a location to help search queries deliver the most valuable results. The HTML page title can, but doesn’t always, match the page name or H1.
  3. Meta description – At no more than 155 to 160 characters, these short, storefront descriptions appear beneath search engine result links to give a snippet of what the page or link contains, hopefully giving a solid description to the search user.
  4. Headings and subheadings – These are the lead headers of a webpage (H1s) and usually correspond with the page title. The subheadings – H2s, H3s, and so on – help organize content beneath the main topic and walk readers (and bots) down the page of content.
  5. Content – Every website has copy somewhere, whether it’s a long page of paragraphs or a short page with infographics. Content is essential to bots because it’s what gets crawled and indexed in search results.
  6. Images – Images, like content, are vital to the user experience, and behind every image should be alternative text, or alt text. Alt text is especially important for blind or vision-impaired users who rely on tools like screen readers, but it also helps Google and other search engines categorize images in the image search index.
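
If you want to see these elements the way a crawler does, a quick script can pull them from any page. Here is a minimal sketch in Python using the requests and BeautifulSoup libraries; the URL is a hypothetical example, and the length thresholds simply echo the rough character guidance above.

```python
"""Minimal on-page audit sketch: checks the elements above for a single URL."""
import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""
    h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
    images_missing_alt = [img.get("src", "") for img in soup.find_all("img") if not img.get("alt")]

    return {
        "url": url,
        "title": title,
        "title_ok": 0 < len(title) <= 60,               # roughly 55 to 60 characters
        "description": description,
        "description_ok": 0 < len(description) <= 160,  # roughly 155 to 160 characters
        "h1_count": len(h1s),                           # ideally exactly one H1
        "word_count": len(soup.get_text(" ", strip=True).split()),
        "images_missing_alt": images_missing_alt,       # alt gaps hurt users and image search
    }

if __name__ == "__main__":
    print(audit_page("https://example.com/services/roofing/"))  # hypothetical page
```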

But checking these boxes doesn’t mean you’ll get the first listing for a search query. How the site is built, how content is written, and how it delivers an experience on devices of all sizes weigh heavily on how any website performs in search rankings.

Why Your Mobile SEO Presence Matters


You’ve got the six essential elements of a webpage in place, but your website is still missing the mark. Why? Because mobile matters.

Before responsive websites became the status quo, brands often built a “mobile version” of their website to be accessible on smaller screens and devices, such as smartphones. These mobile versions — often on an “m.” subdomain or a “.mobi” domain — were scaled-down versions of the desktop website: often a shorter, smaller menu and less text or content.

But in April 2015, Google released what became known by web experts as “Mobilegeddon.” Mobilegeddon put mobile-responsive websites higher on the list of search engine indexing priorities. Mobile-friendly testing tools were released, preparing web developers and webmasters around the globe for the shift.

Mobile-version sites quickly saw a dip in traffic, and non-responsive websites took a real hit in the rankings. Mobilegeddon made sure websites were built to perform on the growing number of devices in the world, from smartphones to tablets to wearable technology.

But a responsive site wasn’t the only thing that Google, and its search engine brethren, cared about when it came to indexing.

The Fine Line Between Keywords & UX

It’s mid-2015, and you’ve waved goodbye to the “mobile version” of your website in favor of a sleek, responsive site that looks great and works seamlessly from desktop to smartphone. So why didn’t your search rankings fly to the top of page one?

Because there’s more to getting on Google’s good side than a responsive website. Thibaud Jobert, community manager at CodinGame, knows this all too well.

“It’s great that algorithms are getting smarter,” Jobert says. “But you cannot publish content without taking bots, or more generally search engines, into account.”

Keeping his community and users up to date with fresh content is important, and he understands how to weigh the interests of his online community against search.

“The article could be the coolest to read, but if no search engine displays it for any search, no one will read it,” Jobert says, adding that if you want your article to be “organically foundable,” you have to know the user intent behind keywords so you can deliver the best results. At the very least, Jobert treats basic SEO research and best practices as a must, even for guest contributor articles, so they can be found by the community.

But Jobert also knows there’s an important balance to strike between evergreen content — content that ages well in today’s quick-changing world — and short-term content, like community announcements.

“At the end of the day, my main goal remains that readers enjoy the reading,” Jobert says.

And that’s the consensus among many in the web community, and precisely what search engines are prioritizing. Web expert Neil Patel says it quite simply: “Strong site structure gives your site an unbreakable SEO foundation that will provide you with vast amounts of organic search.”

Quality Website Navigation

A sensible website navigation is the foundation of a good UX. In any content strategy or web developer toolkit, Steve Krug’s book Don’t Make Me Think: A Common Sense Approach to Web Usability is invariably at the top of the list. “Users shouldn’t have to puzzle over finding the content they’re looking for,” Krug writes. “They’ll quickly move on if they can’t find what they’re after.”

Carrie Hane, content strategist and coauthor of Designing Connected Content: Plan and Model Digital Products for Today and Tomorrow, recommends content modeling to help structure navigation, URLs, and site menus.

[Image: content modeling]

“We use those [content] types to plan our navigation and site menus, and those can be whatever they are. And they can keep changing when you map URLs to content types,” Hane says. “People don’t take URLs and navigation into account as often as they should. When you structure your content properly, and create a bunch of entities, the URL gets structured based on the entity instead of the site map.”
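
To make Hane’s point concrete, here is a minimal sketch of deriving URLs from content types rather than from a hand-built sitemap. The content types, fields, and URL patterns are hypothetical, not anything Hane prescribes.

```python
"""Sketch: URLs derived from content types (entities) instead of a hand-placed sitemap."""
from dataclasses import dataclass

@dataclass
class Service:
    category: str   # e.g. "roofing"
    segment: str    # e.g. "residential"

    @property
    def url(self) -> str:
        return f"/services/{self.category}/{self.segment}/"

@dataclass
class Article:
    slug: str
    topic: str      # used for crosslinking into a topic silo

    @property
    def url(self) -> str:
        return f"/articles/{self.topic}/{self.slug}/"

# Navigation and menus can then be generated from the entities themselves.
pages = [Service("roofing", "residential"), Article("choosing-shingles", "roofing")]
print([p.url for p in pages])
```

Because the URL comes from the entity, reshuffling navigation or menus later doesn’t force the URL structure to change with it.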

What’s often overlooked, too, is linking related content via crosslinking. By creating paths for the user to follow, you’re also creating paths for search engine bots to follow (and index).

“Links from other sites to your site are more important than links within it, but the more things link to a page, the better Google’s going to look at that,” Hane says. “And it helps your users. No page should be a dead end.”

Hane says that this type of connection between content helps build silos of topics, which Google and search engines recognize and group together to help deliver quality results for users, making a website more authoritative in the eyes of search algorithms.
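
Crosslinking and dead ends are easy to check programmatically. Below is a rough sketch, again with requests and BeautifulSoup and a hypothetical starting URL, that lists a page’s internal links and flags pages with no outgoing internal links; a real crawl would need politeness controls and deduplication.

```python
"""Sketch: list a page's internal links and flag potential dead ends."""
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def internal_links(url: str) -> list[str]:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    site = urlparse(url).netloc
    links = set()
    for a in soup.find_all("a", href=True):
        absolute = urljoin(url, a["href"])
        if urlparse(absolute).netloc == site:   # keep links within the same site
            links.add(absolute.split("#")[0])   # ignore fragment anchors
    links.discard(url)
    return sorted(links)

def is_dead_end(url: str) -> bool:
    # A "dead end" here means a page with no outgoing internal links.
    return len(internal_links(url)) == 0

if __name__ == "__main__":
    start = "https://example.com/services/roofing/"  # hypothetical page
    for link in internal_links(start):
        print(link, "<- dead end" if is_dead_end(link) else "")
```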

Quality Website Content

While Hane and other UX professionals agree that deep keyword research can be valuable, she also sees value in a baseline level of keyword knowledge to help guide the development of appropriate page names and content.

“What we start with, at the minimum, is thinking about what people are actually looking for,” Hane says. “They’re going to search with what’s on their mind. Is that the same words you use? Start with the Google search bar and start typing to see what the queries are. Without using any special tools, it gives you a baseline.”

At a deeper, more advanced level, keyword research with tools like Google’s Keyword Planner and Moz’s Keyword Explorer helps you understand popular keywords that may resonate with your users.
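
If you want to script the “start typing in the search bar” step, autocomplete suggestions can be fetched directly. The sketch below uses an unofficial, undocumented Google suggest endpoint that may change or be rate-limited at any time, so treat it only as a stand-in for typing queries by hand; deeper research still belongs in tools like Keyword Planner or Keyword Explorer.

```python
"""Sketch: baseline keyword ideas from autocomplete suggestions (unofficial endpoint)."""
import requests

def suggestions(seed: str) -> list[str]:
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},   # the "firefox" client returns plain JSON
        timeout=10,
    )
    resp.raise_for_status()
    query, completions = resp.json()[:2]           # response shape: [query, [suggestions], ...]
    return completions

if __name__ == "__main__":
    for phrase in suggestions("residential roofing"):
        print(phrase)   # what people actually type, per Hane's advice
```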

And when it comes to on-page copy, the best web practices still apply. Headings and subheadings, bulleted lists, and plain, user-focused language garner the favor of both search engines and users.

Hane recommends the pyramid approach: start with the most important information at the top of the page and filter down, using subheads to introduce other topics. Subheads, Hane says, are part of the semantic language that Google and other search engines understand.
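
Because heading structure is machine-readable, the pyramid is easy to sanity-check. Here is a small sketch that prints a page’s heading outline and flags skipped levels; the URL is hypothetical.

```python
"""Sketch: print a page's heading outline and flag skipped heading levels."""
import requests
from bs4 import BeautifulSoup

def heading_outline(url: str) -> None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    previous_level = 0
    for tag in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
        level = int(tag.name[1])                         # h2 -> 2, h3 -> 3, ...
        skipped = previous_level and level > previous_level + 1
        flag = "   <- skipped a level" if skipped else ""
        print("  " * (level - 1) + tag.get_text(strip=True) + flag)
        previous_level = level

if __name__ == "__main__":
    heading_outline("https://example.com/services/roofing/")  # hypothetical page
```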

Want to know more about how Google ranks quality content? Check out Search Engine Land’s Periodic Table of SEO.

Quality Website Design & Development

Site navigation and page content mean very little if the website isn’t designed and built well. And one of the most important components of a well-built website is responsiveness.

Design for Responsiveness

As previously mentioned, Google’s 2015 “Mobilegeddon” mobile-friendly algorithm rollout prioritized responsive, mobile-friendly websites over others. Kirill Sajaev, an SEO consultant and member of the Slickplan team, points out that “mobile version” websites were an attempt by developers to put UX before bots.

“If you don’t have something on your mobile site, but have it on your desktop, Google may not index it,” Sajaev says, adding that trimming the mobile version down, with less content and fewer menu items, made content easier to find and digest and delivered only what the brand or organization deemed most relevant.
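
A rough way to spot desktop-only content is to fetch the same URL with a desktop and a smartphone user agent and compare what comes back. The sketch below only catches server-side differences; JavaScript-rendered or CSS-hidden gaps need a headless browser, and the user-agent strings and URL are illustrative.

```python
"""Sketch: rough mobile/desktop content parity check for a single URL."""
import requests
from bs4 import BeautifulSoup

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (Linux; Android 10; Pixel 4) AppleWebKit/537.36 Mobile"

def summarize(url: str, user_agent: str) -> dict:
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return {
        "words": len(soup.get_text(" ", strip=True).split()),
        "links": len(soup.find_all("a", href=True)),
    }

if __name__ == "__main__":
    url = "https://example.com/services/roofing/"   # hypothetical page
    desktop, mobile = summarize(url, DESKTOP_UA), summarize(url, MOBILE_UA)
    print("desktop:", desktop)
    print("mobile: ", mobile)
    if mobile["words"] < desktop["words"]:
        print("Content present on desktop may be missing on mobile and may not get indexed.")
```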

Sajaev also recommends that website teams understand the difference between article pages and product pages, noting that “articles [are] for SEO, and product pages are for conversion.” For web teams, it’s important to know how those pages are built and where they fit in an effective user journey. Knowing when and where to place calls to action (CTAs) on the page, and how a piece of content marketing cross-links to a specific product, is essential to the experience.

Develop for Technical Success

Design and content only succeed as well as the foundation — the code, the development — allows. Nearing the top of the ever-growing list of important SEO factors is page speed.

Page speed is part of a website’s optimization and determines how fast a page loads on a screen or device. Google released PageSpeed Insights in 2010, and over the last several years the search giant has made page speed an increasingly important factor in how it ranks and indexes websites and links.
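
You can pull a page’s performance score programmatically through the PageSpeed Insights API. The sketch below targets the v5 endpoint; the response field names reflect that version as commonly documented and may change, and the URL and API key are placeholders.

```python
"""Sketch: query the PageSpeed Insights API (v5) for a performance score."""
import requests

def pagespeed_score(url: str, strategy: str = "mobile", api_key: str | None = None) -> float:
    params = {"url": url, "strategy": strategy}
    if api_key:
        params["key"] = api_key          # a key is generally only needed at volume
    resp = requests.get(
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
        params=params,
        timeout=60,
    )
    resp.raise_for_status()
    data = resp.json()
    # Lighthouse reports performance as a 0-1 score; multiply by 100 for the familiar scale.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

if __name__ == "__main__":
    print(pagespeed_score("https://example.com/", strategy="mobile"))  # hypothetical page
```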

Also important in your site’s code is structured data markup, also known as schema. Per Google, “Structured data is becoming an increasingly important part of the web ecosystem.” Why? Because structured data helps search engines understand the details of a page so they can properly return answers to search queries.

Signifiers in the code — names of people, places, events, and more — help Google and other search engines properly understand the page and return it for relevant queries. Even embedded media, like video or audio, can be identified with this markup, which not only helps users but also increases the likelihood of the page appearing in rich snippets.
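
Structured data is most often added as a JSON-LD block in the page’s head. Here is a minimal sketch that generates one using schema.org’s LocalBusiness type; the business details are invented, and any real markup should be validated (for example with Google’s Rich Results Test) before you rely on it.

```python
"""Sketch: generate a JSON-LD structured data block using schema.org vocabulary."""
import json

def local_business_jsonld(name: str, url: str, city: str, telephone: str) -> str:
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "url": url,
        "telephone": telephone,
        "address": {"@type": "PostalAddress", "addressLocality": city},
    }
    # The resulting <script> tag belongs in the page's <head>.
    return '<script type="application/ld+json">\n' + json.dumps(data, indent=2) + "\n</script>"

if __name__ == "__main__":
    print(local_business_jsonld("Acme Roofing", "https://example.com/", "Montreal", "+1-555-0100"))
```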

Don’t Forget About Inclusivity

And running through all of your work on quality content, design, and code should be the underlying need to make your web experience inclusive for all.

Inclusivity is vital to the user experience. Users who rely on assistive devices and software, such as screen readers, are equally part of the success (or failure) of your website in search rankings.

Why? Because the essential pieces of an inclusive experience are the underpinnings of good SEO: unique, well-developed page titles, meta descriptions, content, and URL structure deliver solid results for users just as they do for search engines.

And so does a great user experience. Contrasting colors, easy-to-read fonts, alt text, audio captions, transcripts, and more are important to delivering a UX that everyone, no matter the device or ability, can use.
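
Some of these inclusivity basics can be checked automatically, though color contrast, caption quality, and keyboard access still need manual or specialized testing. The sketch below runs a few quick checks against a hypothetical URL.

```python
"""Sketch: a few quick, automatable inclusivity checks for a single page."""
import requests
from bs4 import BeautifulSoup

def accessibility_quick_check(url: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    html_tag = soup.find("html")
    return {
        # Screen readers need the page language declared on the <html> element.
        "has_lang_attribute": bool(html_tag and html_tag.get("lang")),
        "images_missing_alt": sum(1 for img in soup.find_all("img") if not img.get("alt")),
        "inputs_missing_labels": sum(
            1 for inp in soup.find_all("input")
            if inp.get("type") not in ("hidden", "submit", "button")
            and not inp.get("aria-label")
            and not (inp.get("id") and soup.find("label", attrs={"for": inp["id"]}))
        ),
        "videos_without_captions": sum(
            1 for vid in soup.find_all("video") if not vid.find("track", kind="captions")
        ),
    }

if __name__ == "__main__":
    print(accessibility_quick_check("https://example.com/"))  # hypothetical page
```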


Building an inclusive website for all users isn’t just a great way to boost your SEO opportunities; it’s also a way to deliver exceptional experiences to all.


There’s No Right Answer for Users vs. Googlebot

It would be easy if all website professionals could say, “We build our websites for ____,” but it’s simply not that clear-cut.

Thanks to shifts in algorithms, search engines are happy to award higher rankings to websites focused on high-quality, user-friendly content, a solid user experience, and inclusive design and development.

At the same time — and in the same breath — it’s still valuable that web developers and content experts understand how and why users search for information online — and create experiences, websites, or web pages that meet those needs.

At the end of the day, understanding and practicing both exceptional UX and the evolving language of search can keep any organization ahead of the digital pack.

Erin Schroeder

