SEO Trends That Will Continue To Dominate In 2015
The process of ranking websites has changed drastically from how it worked in 2010 and earlier, and it will continue to develop. Search engine marketers need to remain analytical and constantly adapt to changes and improvements in search engine algorithms and the factors which dictate where websites rank in the search results. Those who fail to pay attention to what is important now, and what is likely to be important in the future, will find it difficult to survive in 2015’s SEO world.
Below I will detail some of the most prominent trends which are developing with regard to search engine rankings:
A website needs to be optimised for user interaction and for smart-phone users
As smart-phone search continues to increase, and is set to keep growing in 2015 (up to 50% of total searches), it is ever more important for webmasters to build websites that are fully responsive on smart-phones and tablet computers, or to serve smart-phone visitors a separate version of the site that is optimised for their device.
Google wants to deliver quality content that helps web users find what they are looking for, and part of that means delivering content which is designed to be read on the device being used.
Within Webmaster Tools there is a section dedicated to Mobile Usability, designed to help webmasters understand how Google views their mobile site and how they can make improvements if need be.
Though having a website which is not optimised for mobile phones may not directly impact your rankings in desktop search, web searches are increasingly being made from phones, and websites which are not easy to interact with on a phone will likely see a major impact on their levels of traffic going into 2015.
Luckily, Google is transparent in this case about what it sees as the most common problems for websites which are not well optimised for mobile phones, and how to fix those problems.
Problem: Flash-based content may not be displayed on mobile phones
Solution: Don’t use Flash on your website
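Google’s mobile guidance points to HTML5 as the usual replacement for Flash video; a minimal sketch using the HTML5 video element might look like this (the file names are placeholders):

```html
<!-- HTML5 video plays natively on phones, where Flash generally does not -->
<video controls width="100%">
  <source src="intro.webm" type="video/webm">
  <source src="intro.mp4" type="video/mp4">
  Your browser does not support HTML5 video.
</video>
```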
Problem: Viewport not configured, or a fixed-width viewport, preventing the site from adapting to different screen sizes
Solution: Use a responsive website design which sets the viewport according to the screen size of the visitor’s device.
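As a minimal sketch, the standard responsive viewport declaration, placed in the head of every page, looks like this:

```html
<!-- Tell mobile browsers to match the layout width to the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```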
Problem: Content not sized to viewport – meaning that the content is too wide, forcing users to scroll sideways in order to read the entire page.
Solution: Use relative width values in your CSS code to ensure that the content resizes in response to the visitor’s screen size.
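For example, a content column sized in percentages (capped with a max-width) will shrink with the screen instead of forcing sideways scrolling; the class name here is purely illustrative:

```css
/* Relative widths let the layout contract on narrow screens */
.content {
  width: 90%;        /* scales with the viewport rather than using a fixed pixel width */
  max-width: 960px;  /* but never grows wider than the desktop layout */
  margin: 0 auto;
}
img {
  max-width: 100%;   /* images shrink to fit their container instead of overflowing it */
  height: auto;
}
```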
Problem: Small font size – if users have to pinch and zoom in order to read the text on the page, then the text is too small.
Solution: Specify a viewport for each of your pages and set font sizes to scale within it. A good way to do this is to set a base font size of 16 CSS pixels, then make all other font sizes relative to that base.
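A minimal sketch of that approach, with a 16px base and other sizes defined relative to it (the selectors are only examples):

```css
/* 16 CSS pixels as the base size; everything else scales from it */
body {
  font-size: 16px;
}
h1 {
  font-size: 2em;      /* 32px at the base size */
}
small, .caption {
  font-size: 0.875em;  /* 14px at the base size */
}
```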
Problem: Touch elements are too close together, causing users to tap two buttons at the same time.
Solution: Make sure that each button has a minimum tap target of 7mm and that there is adequate space between neighbouring buttons.
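With a properly configured viewport, 7mm is commonly equated to roughly 48 CSS pixels, so a sketch of suitably sized and spaced buttons might look like this (the selectors are illustrative):

```css
/* Tap targets of roughly 7mm (~48 CSS pixels) with breathing room between them */
.nav a,
button {
  display: inline-block;
  min-width: 48px;
  min-height: 48px;
  padding: 12px;
  margin: 8px;  /* spacing so neighbouring targets aren't tapped by mistake */
}
```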
Brand Name Citations and Mentions Will Become a Strong Ranking Factor
Possibly as a response to the on-going misuse of links within private blog networks, Google appears to be placing a higher value on what can be called “Implied Links”, i.e. the mention, citation, or reference of a brand or website without a link back to the website being mentioned. No-follow links from the right sources will become just as powerful as do-follow links. SEOs will need to be aware of the power of having their customers’ brand names mentioned by other websites, something one would naturally expect for a well-known and well-respected brand.
Social Signals from Facebook and Twitter Will Become Increasingly Powerful
So many businesses and internet marketers have, over the last few years, been pouring effort into building out their presence on Google+, including the failed authorship profiles, which were eventually taken out behind the barn and shot. In many cases this effort has been wasted, as Google+ does not provide a lot of SEO benefit and the social platform has not been adopted with as much verve as Twitter and Facebook.
Many of Google+’s users use it because they think they should, to aid their website’s rankings, rather than for pure enjoyment, which ultimately detracts from its overall value as a marketing tool.
Google has not indicated that social signals are a ranking factor, but it seems unlikely that such a natural sign of authority would not be part of Google’s algorithm in some way. Popular websites and brands generally have a large social presence, and it would be short-sighted not to take this into account when gauging how influential and relevant a website is in its subject area.
Going into 2015, we will likely see that a strong social presence on Facebook and Twitter is crucial to achieving high rankings for a website.
SEO Is Becoming More About Building Relationships
Simply focusing on producing a large stream of well-written, technically sound content isn’t always enough anymore. Webmasters need to connect with other websites and build long-standing relationships in order to get their content out to a wider audience. This can be achieved through:
- Content syndication
- Blogger outreach campaigns
- Brand advocates
- Social media
Negative SEO Will Become an Even Greater Threat
Building thousands or even hundreds of thousands of “spammy” or garbage links pointing toward a competitor’s website in order to negatively affect their rankings is a method used by many Black Hat SEOs in the industry.
Google’s Matt Cutts talks about Negative SEO in this video:
To summarise the video, Matt makes three points about how negative SEO is dealt with:
- Manual reviewers
- Spammy links will not always have a negative effect on a site’s rankings, and can even in some cases have a positive effect
- You can use the disavow process to tell Google to ignore certain links
For those affected, however, spammy-looking links may have to be manually removed or disavowed by the webmaster, which can be a lengthy and costly process, sometimes taking three to nine months. Google may not even recrawl all of the links within a webmaster’s disavow file, meaning that any penalty attributed to their website may never be lifted.
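For reference, a disavow file is just a plain text list submitted through the disavow tool in Webmaster Tools: one URL or domain per line, with lines beginning with # treated as comments. The domains below are made up:

```
# Links that could not be removed after contacting the site owner
http://spammy-links-example.com/paid-links.html
# Disavow every link from this domain
domain:link-network-example.com
```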
So what is the solution to this problem? At this stage there doesn’t seem to be a wholly reliable one for small websites in inconsequential niches – although, as Matt says, these are not often the main target for such tactics. For big players in the financial comparison niche, such as Money Super Market, manual reviewers can easily see that the brand is more important than its link profile, and so negative SEO is very unlikely to affect their ability to rank.