Hotel SEO is one of the most profitable niches on the internet right now, and while the market is large, it’s not too late for newcomers to gain a foothold and achieve success. It may seem daunting to manage your hotel SEO strategy in such a competitive digital space, but it’s not as complex as it seems. We’ll break down everything you need to know to start seeing results.
Site Design and Health
Site Structure and Navigation
As a hotel SEO agency, we can tell you that it’s important to users and search engines alike for a website to have an organized structure and be easily navigable. This means that content should be grouped together in logical categories, and this content should be inter-linked in easily understandable and convenient ways. When it comes to search engine optimization for the hotel industry, this should be a fairly simple process. Depending on your site’s focus and revenue model, it probably makes sense to focus on top-level categories like hotel chains, hotel locations, hotel types (e.g. luxury vs. budget), etc. and then to break down each category into more granular subcategories.
When conducting search engine optimization for hotels, keep in mind that both search engines and users glean information about a site’s structure from its URLs. Utilizing a logical top-down subfolder hierarchy is considered best practice. For instance, http://www.hotelsite.com/locations/texas/austin/best-western-123-elm.html is a much more easily understandable URL than http://www.hotelsite.com/content/7423/new/?=7423. Suboptimal URL structures can lead to ranking deficiencies and a lower likelihood of special presentation in search engine results pages (SERPs), such as ‘sitelinks’ (example pictured below).
Navigational elements should follow a logical pattern similar to that of URL structures. Avoid utilizing extraneous layers of navigation — after 3 or 4 steps from the homepage, the structure becomes cumbersome to follow and internal PageRank transfer is diminished dramatically.
Mobile-Friendliness and AMP
Our SEO agency believes it’s more important than ever for your site to work seamlessly on mobile devices. The majority of website visits now take place on mobile devices and search engines have long taken notice and prepared for this inevitable trend. In fact, Google is now rolling out what they dub their “mobile-first index”, which means that all search results will be indexed according to what their mobile crawler sees. Once the rollout is complete, desktop and mobile search results will be the same for the most part (excluding location-specific personalization), so it is crucial that your site loads fast and is easy to use on a smartphone.
Regardless of ranking factors, people looking for hotel information will find benefit in a good mobile experience since a considerable portion of them are likely to be mobile users. For instance, it’s safe to assume that someone whose flight was just cancelled and is now scrambling to find a hotel for the night is not going to bother with a site that is difficult to use on their phone.
The days of heavily stripped down “m-dot” sites (e.g. http://m.example.com) for mobile users are mostly gone. Responsive web design — in which elements on pages are dynamically re-scaled and re-positioned based on the viewport resolution — is widely considered best practice today. However, Google does see some value in sometimes pointing searchers toward stripped down, quickly loading pages that are reminiscent of m-dot designs. They call this their Accelerated Mobile Pages project, or AMP for short. If you host AMP versions of your pages and they conform to all of the critical AMP requirements, they may be eligible to take indexing precedence over their non-AMP counterparts. Although aesthetically and functionally constrained by AMP requirements, these pages may see higher click-through-rates (CTR) and reduced bounce rates due to their special presentation in the search engine results pages (pictured below) and fast loading times.
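If you do host AMP versions of your pages, the discovery mechanism is a simple pair of link tags connecting the two versions. The sketch below uses hypothetical URLs for illustration:

```html
<!-- On the regular (canonical) page, point to its AMP counterpart: -->
<link rel="amphtml" href="https://www.hotelsite.com/locations/texas/austin/amp/">

<!-- On the AMP page, point back to the canonical version: -->
<link rel="canonical" href="https://www.hotelsite.com/locations/texas/austin/">
```

Without the `rel="canonical"` on the AMP page, search engines may treat it as standalone duplicate content rather than an alternate version.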
Search engines are getting better and better at taking user experience into account, so make sure that your site is easy to use on a mobile device. Avoid making buttons and links too small since these are difficult to interact with on small touchscreens. This is vital for hotel industry SEO since a faulty call to action button could lead to a direct loss in revenue/conversions. Be sure that text wraps properly and that images do not overlap with text. Make sure that all page elements containing content do not become hidden for mobile users (and bots) since Google obviously can’t take hidden content into account, and the mobile-first index means that content only visible to desktop users/bots essentially does not exist as far as Google’s concerned.
You can gauge the mobile usability of your site using Google’s Mobile-Friendly Test, Google Chrome’s built-in DevTools (viewport controls), and Google Search Console (Mobile Usability section — pictured below). But be sure to also test your pages by rendering and interacting with them on a variety of actual mobile devices.
As mentioned previously, site speed is hugely important, especially given the move to Google’s mobile-first index. It is also a confirmed ranking factor. A significant number of users only have access to 3G mobile speeds or worse, and the intense competition in the hotel SEO space means that users with access to high speeds are still likely to turn to other sites when faced with a slower-loading one.
There are many elements that go into cutting down on load times, and the good news is that most of them can be accomplished fairly easily. Below are some of the most important site speed factors and links to implementation instructions.
- Use Gzip to compress files (which are decompressed by the client)
- Cut down on unneeded redirects by linking directly to destination URLs
- Use appropriate image compression and sizes
- Make use of CSS sprites for navigational images
- Utilize browser caching
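As a rough sketch, the compression and browser caching items above can often be handled with a few server directives. This example assumes an Apache server with mod_deflate and mod_expires enabled; adjust accordingly for nginx or other servers:

```apache
# Compress text-based assets with Gzip (decompressed by the browser)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Ask browsers to cache static assets for a reasonable period
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
</IfModule>
```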
There are many great tools available for page speed diagnostics, such as Google’s PageSpeed Insights, Google’s TestMySite (mobile speed), GTmetrix, WebPagetest, and Google’s Lighthouse browser extension.
There is a practically incomprehensible number of discoverable pages in existence on the world wide web, and obviously it takes a finite amount of time for a search engine robot to crawl each page. Not surprisingly, these robots must utilize their time wisely, so they do not crawl entire sites at a time (or at all) and appear to give precedence to reputable, fast-loading sites. You can help ensure that your ‘crawl budget’ is allocated wisely by making sure robots don’t waste time crawling unnecessary, extraneous content. This could be especially important if your site includes a large directory of hotels, such as dedicated pages for every hotel location in your database. See how your site is performing by entering it into our free web crawler tool.
The most direct way of influencing search engine robot behavior is through the use of the robots.txt file at the root of your site. In this file, you can instruct user agents (such as search engine robots) to not crawl certain pages, subfolders, or URL types. If your site contains duplicative pages which might be useful to users (via internal links) but offer little to no SEO value, such as blog “tag” pages, you can easily instruct search engine robots to not crawl them. Internal linking is powerful, and we offer internal linking services to make sure internal links are working for you.
You can also use wildcards for more advanced disallow directives, such as the blocking of certain URL parameters. We recommend using the Parameter Handling feature in Google Search Console for a greater degree of flexibility in setting how each parameter type is handled. Be sure you understand what you’re doing though, as you could easily block your entire site from being crawled if you’re not careful (hint: don’t include “Disallow: /” in your robots.txt file). The Sure Oak Robots.txt Generator makes it easy to quickly generate a robots.txt file that you can upload to your root directory. Creating a sitemap with our XML Sitemap Generator is also an excellent way to make sure search engines crawl and index your important content. After the sitemap is uploaded, be sure to link to it within your robots.txt file and submit it via Google Search Console.
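Tying these pieces together, a minimal robots.txt for a hotel site might look like the following. The paths and parameter name are hypothetical examples:

```
User-agent: *
# Keep bots away from low-value duplicative pages, e.g. blog "tag" archives
Disallow: /tag/
# Wildcard: block any URL containing a session-tracking parameter
Disallow: /*?sessionid=

# Point crawlers to the XML sitemap
Sitemap: https://www.hotelsite.com/sitemap.xml
```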
Another important way of dealing with duplicative content is the use of canonical tags within the HTML of each page. A canonical tag indicates which URL is the definitive version of a page and that it should be given precedence for indexing. In theory, this might mean that PageRank from internal and external links to non-canonical versions of pages will all consolidate and flow to the canonical page. This is highly recommended when dealing with multiple versions of a single page that are created through the use of URL parameters. An excellent use for canonical tags in hotel SEO would be utilizing URL parameters to serve content based on specific hotel locations, with each URL variation including a canonical tag pointing to the top-level hotel URL (e.g. /san-francisco/hilton?a=123elmst and /san-francisco/hilton?a=521fakeblvd both utilizing canonical tags pointing to /san-francisco/hilton). However, it is important to note that search engines are under no obligation to follow these hints, and the ‘honoring’ of a canonical tag is decided on a case-by-case basis. For instance, trying to rank a page by consolidating PageRank from multiple unrelated pages (by pointing their canonical tags to the page you intend to rank) is practically guaranteed not to work. Generally, the content must be nearly identical for a canonical tag to be honored.
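In HTML, the hotel-location example above amounts to a single tag in the head of each parameterized URL variation (the domain here is a hypothetical carried over from earlier examples):

```html
<!-- In the <head> of /san-francisco/hilton?a=123elmst
     and of /san-francisco/hilton?a=521fakeblvd -->
<link rel="canonical" href="https://www.hotelsite.com/san-francisco/hilton">
```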
To write effective content and meta tags, it’s critical that you conduct some keyword research to get an idea of what types of words or phrases are being searched for by users. There are many tools available for this purpose, but some of our favorites are KWFinder, Ahrefs, and SEMrush. Most tools allow you to easily get started with your research by manually typing a keyword (or list of keywords) and seeing various metrics, as well as related keywords and their metrics.
Let’s start by entering “best hotel chains” in the Ahrefs Keywords Explorer. Here are some of the related keywords it suggests. We’ll use these as examples for filtering and categorization below.
Competition and Search Volume
When determining which keywords to target, competition level and search volume are good initial factors to evaluate. The two are generally linked: queries with higher search volume tend to be more competitive, since it stands to reason that more websites pursue keywords with higher traffic potential.
However, self-awareness goes a long way. It may be tempting to throw all your efforts into the keywords with the highest search volume, but if your site isn’t very established yet and/or you don’t have a lot of resources at your disposal, your efforts may be wasted if you prioritize queries with heavy competition. In the list of suggested keywords above, “best hotel chains in usa” and “best budget hotel chains” might be better targets for a relatively new site since they have a significant amount of search volume but are likely much less competitive than “best hotel chains” (which has nearly 10 times the search volume). On the other hand, eliminating candidate keywords with very low search volume is also a smart move, since even great rankings may not bring in a worthwhile amount of traffic.
Intent and Other Categorization
After selecting a list of target keywords that have significant search volume but aren’t unrealistically competitive, it’s time to sort them into semantic categories for easier tracking. Intent and topic are logical group choices, and it often makes sense for those groups to have overlap since they can be thought of as separate modifiers for keywords. For instance, hotel SEO keywords could be split into “hotel research” and “hotel booking” (i.e. purchase phase) intents, and some example topic categorizations might be “best”, “business”, “cheap”, “reviews”, “luxury”, “chains”, “states”, “cities”, and “rewards”.
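As a simple illustration, this kind of topic grouping can be scripted. The helper below is a hypothetical sketch (not a Sure Oak tool) that tags each keyword with every topic modifier it contains; the keyword and topic lists are illustrative examples:

```python
def categorize(keyword, topics):
    """Return the topic labels whose modifier appears in the keyword."""
    return [topic for topic in topics if topic in keyword]

# Illustrative topic modifiers and target keywords
topics = ["best", "cheap", "luxury", "chains", "rewards"]
keywords = [
    "best hotel chains in usa",
    "best budget hotel chains",
    "luxury hotel rewards",
]

# Map each keyword to its semantic categories for easier tracking
groups = {kw: categorize(kw, topics) for kw in keywords}
```

Intent labels (e.g. “hotel research” vs. “hotel booking”) could be layered on the same way, since intent and topic act as independent modifiers.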
Sure Oak can take care of your keyword research, analysis, clustering, and mapping with our advanced keyword services.
Before you even get started with content and link-building efforts, it’s a good idea to take a look at what your competitors are doing so that you can be better informed about what you’re up against.
Take some of the potentially lucrative target keywords you identified during your research and search for them on Google (preferably non-local keywords and using incognito browsing mode to avoid as much personalization influence as possible). Start familiarizing yourself with the usual competitors who occupy page 1 of the SERPs. We offer a handy free tool for this SERP analysis. The more attention you regularly pay to this, the better-equipped you’ll be to spot changes that are reflective of algorithm shifts and/or increased competitive efforts.
You may be surprised to see that the websites with the best rankings can be very different from query to query, even if each query is fairly closely related. Google tries to understand the intent of the query (i.e. research vs. action) and serves different types of content accordingly. This can be a good thing since it sometimes prevents authoritative sites from having a ranking monopoly in certain niches, but it can be extremely frustrating when there seems to be a mismatch between the true intent of a query and the pages that rank for it.
Investigating the backlinks of your competitors is a useful way to learn about what kind of link building techniques are effective in the hotel SEO niche. Ahrefs (pictured below) is a great resource for tracking backlinks, but there are many other great tools for this purpose as well. Take note of the content types that have gained traction and which sites have shown a willingness to link to such content. You may be able to replicate these results by writing about similar topics and reaching out to the same or similar publications. If you find outdated links (e.g. the competitor’s page no longer exists) and you can tell what the topic was by the context, it could be an easy win to write about that topic yourself — then just reach out to the linking webmaster and inform them that the link is broken but that you’ve got content which could serve interested users.
Keep an eye out for groups of backlinks that have the same anchor text (or even bigger portions of identical copy). These may be evidence of content and links that were pushed out with the help of a public relations firm. A reputable PR firm can do wonders to bolster a brand’s online presence (and simultaneously improve their ranking power), but working with an amateur one that does nothing more than submit press releases to aggregator websites is a waste of money. Worse yet, some PR firms (as well as some SEO agencies) build links from small, low-quality sites they build themselves. These qualify as private blog networks (PBNs) and are likely to invoke a Google manual action penalty. If you spy links that appear to be egregiously created, you might consider submitting a Spam Report to Google. Your competitor then may eventually get the penalization they deserve.
Content Types, Authority, and Breadth
To really gauge the competition, you need to analyze your competitors’ actual content too. Take a look at how often they publish content and what types of content they tend to write about. You’ll probably need to take cues from them to guide your initial success (and ideally you’ll get to the point that you can go above and beyond what they produce). Take note of their style and credibility. Do they seem trustworthy? Who’s the target audience based on the writing style? If they rank well, the answers to those questions might be an indicator that they’ve done some experimentation and figured out what kind of content search engines think is appropriate for the respective well-ranking keywords, so it may be a good idea to replicate it. However, if none of your competitors for a particular keyword (or keyword group) seem to write with much trustworthiness, this could be a golden opportunity to create authoritative content and dominate the rankings. This might be a possibility even with an inferior backlink portfolio since Google’s algorithms attempt to take query intent into account, and research-based queries tend to favor pages with high scores for expertise, authority, and trustworthiness.
Competitor analysis is a critical component in building your SEO strategy. We offer SEO Competitor Analysis services to take care of this research for you.
Although search engine algorithms have become dramatically more refined and complex since the early days of keyword stuffing, there is still benefit to clever and strategic use of titles and meta descriptions.
Page titles are a confirmed ranking factor, and are obviously very important for succinctly communicating page topics to users. Keeping your page titles under roughly 55 characters is a pretty safe bet to avoid having them truncated in Google’s SERPs.
Meta descriptions are not Google ranking factors, but are still valuable chances to influence CTR by enticing users to click-through from a SERP. Although Google has tested significantly longer meta descriptions, roughly 160 characters seems to be the standard maximum length that will be displayed in a SERP listing without truncation.
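These limits are character-count heuristics (actual truncation depends on pixel width), so a quick scripted check can flag risky tags before they go live. The helper below is a hypothetical sketch using the rough limits described above:

```python
# Rough truncation heuristics from the guidelines above, not hard limits
TITLE_LIMIT = 55
DESCRIPTION_LIMIT = 160

def check_meta_lengths(title, description):
    """Return a list of warnings for tags likely to be truncated in SERPs."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars (aim for ~{TITLE_LIMIT})")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(
            f"Description is {len(description)} chars (aim for ~{DESCRIPTION_LIMIT})"
        )
    return warnings
```

Run against a crawl export of titles and descriptions, an empty list means a tag pair is within the safe range.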
Google has long stated that the meta keywords tag is not taken into consideration, so save yourself a bit of time and don’t bother with it unless you use it for internal categorization purposes.
Understanding and generating your meta tags is an important component of your SEO. Conveniently, we offer a free meta tag generator tool to prepare all the tags your site needs.
Google’s Automated Replacements
Titles and descriptions are often replaced in SERPs by Google’s suggestions (such as a truncation and/or slight rewording of titles and/or pulling an excerpt from the content on a page as the SERP listing description). However, unaltered titles and descriptions are still common, especially when a page ranks for a query based on overall semantic relevance instead of an exact keyword match.
What Is Structured Data?
Structured data refers to any type of markup that provides additional information about page content to search engines. Simply put, structured data gives webmasters the opportunity to communicate more information and context about content without cluttering up the actual copy with extraneous information that may not be beneficial to users. The most commonly used type of structured data in SEO is Schema — a collaborative effort between Google, Microsoft, Yahoo, Yandex, and many other community members. Schema supports many encoding types, such as RDFa and Microdata, but most search engine optimization professionals tend to agree that JSON-LD is the easiest and most effective to use due to its modular nature.
The benefits of structured data are not always clearly tangible. Sometimes they’re obvious, such as the presence of rating stars in SERPs due to the use of AggregateRating Schema (pictured below). Other times, it may not provide any special SERP presentation (known as a “rich snippet”) but could provide additional semantic information to search engines, which could cause content to be viewed in a more authoritative light and/or increase the scope of ranking keywords.
Useful Schema for Hotel SEO
There are a variety of Schema item types that are potentially useful for hotel SEO, as well as other local SEO focused niches. Below are some examples. Be aware that utilizing Schema markup in an egregious and/or manipulative manner can result in penalty actions by search engines.
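For instance, a hotel page might embed a Hotel item with an AggregateRating in JSON-LD. The values below are purely illustrative, and real markup should only describe ratings and details actually shown on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Hotel",
  "name": "Best Western 123 Elm",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Elm St",
    "addressLocality": "Austin",
    "addressRegion": "TX"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.3",
    "reviewCount": "212"
  }
}
</script>
```

Google’s Rich Results Test can confirm whether markup like this is eligible for special presentation.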
Backlinks and PR Efforts
Although many will proclaim “content is king” (and it’s certainly extremely important), it’s hard to ignore the fact that what made Google special in the early days of search engines was its (then unprecedented) focus on backlinks as the defining ranking factor. Most studies indicate that backlinks are still the single most important ranking factor, and it’s no surprise that Google continues to treat them as relevant, given that they have kept updating the PageRank patent originally filed in 1998.
Unfortunately, there is no “one size fits all” approach to acquiring backlinks, and the term “link building” has gotten a dirty reputation among webmasters and marketers who don’t work in search engine optimization themselves but may believe that SEO professionals are integrity-devoid spammers. The old-school tactic of emailing a point of contact at a website to explain the value of your own piece of content and ask for a link is still fairly common, but response rates have dwindled in recent years.
To be successful today, you either have to provide uniquely good value or you have to get creative — doing both certainly doesn’t hurt either.
Comprehensive original research tends to be a winner in terms of backlinks if you have the means to put together a lot of data related to the hotel industry. You don’t necessarily have to be a statistician, but your methodology and sample size should be sound and hold up to scrutiny — otherwise you may end up with an unintended backlash. When done well, a variety of sites are willing to link to this type of content (often without needing to be prompted) as a resource or a citation. Informative, long-form “guide” type pieces also tend to garner many backlinks. In the hotel SEO niche, there is a lot of potential for comprehensive travel guides to resonate with many potential linking websites.
If your content isn’t especially unique or valuable, your efforts will have to go toward your pitch. Plant a seed by making regular contact with writers and establishing a rapport long before asking for a link. Engage in intelligent discussions about their content and become well-known to them in a positive manner. The more networking you do, the better your chances that you’ll later gain a backlink from one of your contacts — they’re going to be more inclined to link to you than to a complete stranger’s website. Of course, it’s likely not feasible to do that for a large number of contacts at one time, so you need to also get creative with your outreach in order to stand out. Conduct A/B tests with your email copy — maybe a more conversational tone will gain a better response rate. Try respectfully critiquing a writer’s point and hinting at your link as a possible correction citation instead of going for the usual flattery. Try sending someone a video pitch instead of an email. Pick up the phone and try to get in touch with a live marketing or editorial employee at a publication. Almost anything goes – take chances, but make them intelligent, reasonable ones when reaching out to others to build your brand.
Once you’ve compiled a list of potential websites to pursue backlinks from, the Sure Oak Domain Authority Checker is a great way to quickly gauge a rough value of the worth of a link from each of them.
Public Relations Value
Unless you’re fortunate enough to have a large number of high profile connections, there is likely added value in doing business with a reputable public relations firm for the purpose of backlink acquisition and branding. Authoritative publications are generally not as receptive to inquiries and requests from relatively unknown budding websites as they are to contacts from established PR firms. Increased brand awareness via other digital channels also leads to roundabout SEO benefits in the long run. But as mentioned previously, be cautious and avoid working with amateur PR firms (or other types of media agencies) who use egregious tactics for their clients, such as building their own PBNs (groups of inauthentic sites created for the sole purpose of manipulating search rankings via backlinks) in attempts to add perceived value.
Dangers of Spam
The Web Spam Team at Google is probably the most well known among the SEO community, and for good reason — they take spam very seriously and aren’t afraid to blacklist entire sites at the drop of a hat if they uncover seriously egregious violations of the Google Webmaster Guidelines.
One of the key things the Web Spam Team looks at is evidence of unnatural external backlinks — meaning backlinks that appear to have been created in exchange for money or favors. If substantial evidence is found, Google may issue a manual penalty to a page, group of pages, or website. A penalty will cause severe ranking drops and can be a death sentence for traffic. What exactly constitutes an unnatural link is a bit of a gray area, but link exchanges (e.g. “I’ll link to you if you link to me”) and PBNs are unquestionably forbidden by Google and will probably invoke manual penalties if uncovered. Some ‘leads’ about such behavior can come from your competitors via Google’s Spam Report form, so play it safe and assume that someone’s always keeping an eye on your backlinks.
With the use of a disavow file — a list of backlinked URLs or domains that you want to declare no association with — it is possible to preemptively avoid a penalty. A disavow file can also be used when submitting a manual reconsideration request to get a manual penalty removed, but this sometimes takes months or years of attempts before success, and many sites never see their traffic return to pre-penalty levels again. It’s also very likely that your site will be watched extremely closely by Google in the future if you’ve got a past manual penalty on your record.
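A disavow file is a plain text list with one entry per line, mixing individual URLs and whole domains. The entries below are hypothetical examples:

```
# Disavow a single backlinking page
http://spammy-directory.example/hotel-links.html

# Disavow every link from an entire domain
domain:pbn-network.example
```

Lines beginning with “#” are treated as comments, and the file is submitted through Google’s Disavow Links tool rather than hosted on your site.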
Google has confirmed that “freshness” is a ranking signal for some queries, and it’s common sense that users want to find the most up-to-date information available. If you think you can improve an older article you’ve written, then it’s a no brainer to update it — especially if it already ranks well and brings in a significant amount of traffic. The freshness factor may give it a ranking boost that lasts for months. In addition, previous site visitors are more inclined to revisit if they see that content has been updated since their last visit.
Opening the doors to user-generated content can make for a great way to encourage more user engagement and increase keyword density. (You can check your keyword density using our free tool.) If you take the time to respond to users directly, it’s also an effective way to build trust and establish a positive reputation. The most common way of doing this is to utilize comment systems on articles, but you can also take it one step further and host a dedicated forum on your site. The downside is that you’ll almost certainly need to devote more time to cleaning up spam posts and moderating vulgar behavior from some users.
Tracking Your Efforts
All the effort in the world doesn’t mean much if you can’t accurately assess your performance and constantly refine your strategy. Keeping a close eye on your organic performance is an absolute must. Luckily, Google Search Console makes it easy to see which pages and queries drive the most traffic — and they’ve recently overhauled the tool and it now has 16 months of data compared to the previous version’s 90 days of data. However, if you want to group together different categories of keywords and more easily stay alert to ranking shifts, you’ll need to turn to a third-party tool. SEMrush and Ahrefs have excellent rank tracking capabilities for the price. If you need even more data, better automated reports, and want to hone your strategy based on more comprehensive proprietary metrics, an enterprise solution like seoClarity or BrightEdge might be right for you. For a useful free tool, we offer a Google Keyword Ranking tool to assess your website’s ranking for specified keywords.
As mentioned previously, Ahrefs is a great tool for tracking your own backlinks as well as those of competitors. It’s a good idea to keep a watchful eye on backlink trends and the types of backlinks being acquired to compare your trajectory with competitors and look for signs of a negative SEO attack.
Site Health Monitoring
The best content on the web coupled with an impressive portfolio of authoritative backlinks isn’t enough to thrive in SEO. Your site needs to be fast, reliable, and optimized. Again, Google Search Console contains a lot of sections that make it easy to stay aware of any major site issues, but it’s a wise idea to supplement it with other tools. Sure Oak has a free Website Crawler Tool to understand exactly how Google views your site. Site crawlers like Netpeak Spider and Screaming Frog are excellent for conducting audits which include response codes, meta tags, and URL structures. Ahrefs also comes with a great site audit tool which can be configured to automatically crawl your site at certain intervals. You can also stay aware of any potential server slowdowns with tools like Pingdom. Ensuring your site is in tip-top shape is a must if you want to see the fruits of your other labors. We offer technical SEO audits that can help confirm that your site is Google-ready.
You should now be well-equipped to run the day-to-day operations of an SEO campaign for any type of hotel website. SEO is continuously evolving and no professional is ever truly an expert in every aspect, so stay open-minded and continue to research and experiment. Sure Oak SEO is dedicated to the success of our clients and we’re always available for any questions and feedback you have for us. If your hotel work happens to take you to the Big Apple, feel free to drop by as we’re a New York-based SEO company. Check back often for updated guides and new articles.