If you are into SEO and SEM, you have probably noticed that although Google is constantly changing its algorithms, your website is not always affected. Many business owners pay less and less attention to this unless they notice a major change in bounce rate. That is why we at 411 Locals bring this article to you: a walk through the major SEO ranking factors in 2017. Panelists at SMX East recently discussed them, and we will summarize everything that you need to know. The text below includes data from large-scale studies performed by SEMrush, one of the world’s leading competitive research services for online marketing.
Online search marketing has grown significantly in recent years as online search continues to develop from a novelty into a standard, everyday feature of our lives. We at 411 Locals help small and medium businesses get as much online presence as possible, and we can say a word or two about the advantages and disadvantages of small and large companies. Last year, Moz’s founder Rand Fishkin published a similar article that caught our attention. On this page, we will summarize most of it, drawing on our experience with hundreds of clients.
We all know how fast Google changes its algorithms, and deciphering them can feel like an exercise in futility. That is why we at 411 Locals keep track of the algorithm changes and try to summarize them for every SEO enthusiast. Google has made a great many changes over the years, and on this page we will give you a brief history of those changes, starting with the year 2005.
2005: Google Maps and Local Business Center Merged
We all remember when Google released the Local Business Center in March 2005. The next logical step was to merge this free tool, which helped businesses easily add or update the listings that appear in Google Local, with Google Maps. Users could now see relevant search results condensed into a single location that included store hours, contact information, and driving directions.
Today’s topic at 411 Locals is what Google considers low-quality pages and what SEOs and marketers should do to avoid them. Let’s start with what constitutes quality according to the search engine:
- Unique content that is more than just different words and phrases; it should also provide value.
- Lots of external sources editorially linking to a page. If a page is reference-worthy, it must be high-quality.
- The page should be referenced by other high-quality pages, not just by any source or domain linking to it. The links can be internal or external.
- The page should successfully answer the searcher’s query. How does Google know this? When someone searches for something, they perform the search and then click on a link. If the result does not satisfy them, they click back and choose a different one. This is how Google learns that a page does not answer the searcher’s query, especially if it happens a lot.
- Fast loading speed.
- Good accessibility and an intuitive design and user experience on any device – desktop, laptop, mobile, or tablet.
- Content that is grammatically correct and free of spelling errors.
- Non-text content that has text alternatives. For this reason, Google encourages the use of the alt attribute (see the sketch below for one way to audit this).
- Content that is well organized and easy to understand and consume. Trust us, Google has its ways of knowing this.
- Content that points to additional sources where readers can get more information, follow up on tasks, or verify citations. This is what external links from a page do.
The list above is far from exhaustive, but it contains some of the things which tell Google which pages are high quality and which are not.
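On the subject of text alternatives, here is a minimal sketch of how you might audit a page for images with missing alt text. It is an illustration only: it assumes the third-party requests and beautifulsoup4 packages are installed, and the function name and example URL are placeholders of ours, not anything Google prescribes.

```python
# Minimal sketch: list the images on a page that lack a text alternative.
import requests
from bs4 import BeautifulSoup

def images_missing_alt(url):
    """Return the src of every <img> whose alt attribute is missing or empty."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    missing = []
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            missing.append(img.get("src", "(no src)"))
    return missing

# Example usage (placeholder URL):
# print(images_missing_alt("https://www.example.com/"))
```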
So how do SEOs and marketers filter the pages on their sites to identify which ones are high quality and which are low?
Here’s what you should NOT overestimate the importance of:
SEO was mostly about coding in its early days. Technical SEO involved the optimization, and often outright manipulation, of code, metadata, and link profiles in order to achieve better results.
And that hasn’t changed all that much. But today’s article by 411 Locals is dedicated to finding a better balance between the technical and non-technical sides of website optimization.
Since manipulation and black-hat tactics have become risky and less effective, they are also less popular now. This is how the more creative, non-technical SEO tactics came about; they are designed with the value and relevance of content in mind.
Technical and non-technical tactics should work in harmony since both are important for the excellent condition of your site and the success of your campaigns. The technical part is the framework on which to build great content.
Here are 4 tips that will help you find a good balance between creative and technical:
1. Know the part that technical SEO plays in your organization
In most organizations today, technical SEO is completely separated from development. The departments SEOs work with are:
- The IT team managing the storage and reception of important data;
- Web development team;
- Non-technical SEOs (like link builders and content marketers).
Technical SEOs act as mediators among these fundamentally different teams. There is also a second part of their job: implementing structure and optimizations that help the engines retrieve, index, and rank content.
Many people feel intimidated by SEO because they think it is hard to keep up with all the latest changes and updates in the field. This is true to a great extent; however, certain SEO fundamentals haven’t changed much over time. 411 Locals dedicates today’s post to these main factors.
Google’s basic principles can be described as:
- Crawlability – Whether the search engine can find your content (see the sketch after this list).
- Site structure – How the search engine organizes and prioritizes this content.
- Keywords – What tells the search engine the topic of your content.
- Backlinks – How the search engine knows your content delivers reliable information on a certain topic.
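To make crawlability a little more concrete, here is a minimal sketch that checks whether a site’s robots.txt allows a given URL to be fetched. It uses only Python’s standard library; the "Googlebot" user-agent string and the example URL are placeholders for illustration, not a simulation of how Google actually crawls.

```python
# Minimal sketch: check whether robots.txt allows a URL to be crawled.
from urllib.parse import urljoin, urlparse
from urllib.robotparser import RobotFileParser

def is_crawlable(url, user_agent="Googlebot"):
    """Return True if the site's robots.txt allows user_agent to fetch url."""
    root = "{0.scheme}://{0.netloc}".format(urlparse(url))
    parser = RobotFileParser()
    parser.set_url(urljoin(root, "/robots.txt"))
    parser.read()
    return parser.can_fetch(user_agent, url)

# Example usage (placeholder URL):
# print(is_crawlable("https://www.example.com/services/"))
```

A page that robots.txt blocks will never be fetched, so none of the quality signals above can even come into play for it.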