5 SEO Mistakes That Will Ruin Your Google Rankings


There are a lot of things to think about when it comes to SEO. You want to make sure that you’re doing everything possible to optimize your website and improve your Google ranking. But even if you know the basics, it’s easy to make mistakes that can ruin all of your hard work. In this blog post, we’ll discuss 5 common SEO mistakes that can ruin your ranking – and how to avoid them!

Optimizing your website for Google is not that difficult if you know the basics of SEO. But it is also easy to completely ruin your Google rankings with just a few clicks, and it doesn't even have to be malicious.

SEO Mistake 1: Your website is invisible in WordPress

What do I mean by invisible? If you don't allow search engines to read (crawl) your pages, they won't be able to index them either. No indexing means no rankings. And then no amount of SEO will help if you exclude all bots.

A popular SEO mistake here is setting the infamous WordPress tick:

“Discourage search engines from indexing this site”: make sure that this box is unchecked!

Under Settings > Reading you will find the option “Search engine visibility”. This box should only be checked before you publish your website; after launch, it must be unchecked.

If you want to exclude your website from Google during the construction phase, simply create a “coming soon” page. You can find out how this works in WordPress in our blog post: Coming Soon Page with WordPress.

Prevent search engines from indexing this website: NO!
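Technically, this WordPress setting works by adding a robots meta tag with “noindex” to every page. Here is a minimal Python sketch for spotting such a tag in a page's HTML; the has_noindex helper is my own illustration, not part of WordPress:

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag with 'noindex'.

    Hypothetical helper for a quick check. A real audit should use a
    proper HTML parser and also inspect the X-Robots-Tag HTTP header.
    """
    # Collect all <meta ...> tags, then look for name="robots" plus "noindex"
    for tag in re.findall(r"<meta[^>]+>", html, flags=re.IGNORECASE):
        if re.search(r"name=['\"]robots['\"]", tag, re.IGNORECASE) and \
           re.search(r"noindex", tag, re.IGNORECASE):
            return True
    return False

# The kind of tag WordPress inserts when the box is ticked:
blocked = "<head><meta name='robots' content='noindex, nofollow' /></head>"
print(has_noindex(blocked))                            # True
print(has_noindex("<head><title>ok</title></head>"))   # False
```

If this function returns True for your live homepage, search engines are being told to stay away.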

SEO Mistake 2: Your robots.txt file is blocking bots

If you are now wondering “what is a robots.txt file again?”, just take a look at your own website. In general, a CMS (content management system such as WordPress) creates this file for you. By default, all bots and search engines are allowed: everyone may visit and read your website.


But sometimes you try one tip or another, and suddenly the robots.txt file says “Disallow”, i.e. it forbids bots from seeing certain pages.

This is what a good robots.txt file looks like

The code below means that you allow all crawlers to read your website. The asterisk serves as a wildcard. There is no instruction after Disallow, which means: crawl everything! The file also links to the sitemap, which helps Google with crawling.

User-agent: *

Disallow:

Sitemap: https://domain.de/sitemap_index.xml

This is what a simple and optimized robots.txt file looks like for a WordPress website.

This is how you (accidentally) exclude crawlers

The following code excludes all crawlers, i.e. Google, Bing, and all programs that want to read your website.

User-agent: *

Disallow: /

The following code excludes only Google's crawler. Of course, this should also be avoided if you want to appear in Google's rankings.

User-agent: Googlebot

Disallow: /

Tip: Check your own robots.txt settings. The robots.txt file, if present, always sits in the root directory, i.e. at the top level: https://domain.de/robots.txt

Test your robots.txt in the tester tool:

Here you will find a good robots.txt tester tool
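You can also test robots.txt rules offline with Python's standard library. A small sketch (domain.de stands in for your own domain; the /private/ rule is added purely for contrast):

```python
from urllib.robotparser import RobotFileParser

# A typical robots.txt, plus one blocked directory for contrast
rules = """\
User-agent: *
Disallow: /private/

Sitemap: https://domain.de/sitemap_index.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Which URLs may Googlebot crawl under these rules?
print(parser.can_fetch("Googlebot", "https://domain.de/blog/"))      # True
print(parser.can_fetch("Googlebot", "https://domain.de/private/x"))  # False
```

If can_fetch returns False for pages you want ranked, your robots.txt is blocking the bots.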

SEO Mistake 3: You buy backlinks for better ranking

How does Google like it when many websites link to your site? In theory, Google loves backlinks. And here comes, you guessed it, the but. Google loves natural links. These are links that webmasters don’t build themselves.

On closer inspection, this is logical: real recommendations come from the heart. It's like dubious products on online marketplaces, where you can spot the purchased reviews from the first sentence. They are of no use to customers.

For example, there are many offers on Fiverr and various other platforms to buy links. There you get, say, 5,000 backlinks for only 10 dollars… never mind… these links will hurt you more than they help. They mostly come from spam websites or adult content websites.

Please never buy backlinks from Fiverr or any other provider. These backlinks will do your site more harm than good! They mostly come from very questionable websites. You have to earn real backlinks!

Buying links used to work

But that’s exactly what Google wants: to make its users happy. Purchased links, i.e. fake recommendations, cannot do that. Buying links for better rankings used to work perfectly, but Google has since become much better at uncovering artificial link building.


At worst, this type of manipulation can result in your website being penalized. You will then find a notice in Google Search Console. As a result, either your entire domain or just a section of it can become virtually impossible to find in Google. Anyone who ends up on page 5 of Google no longer exists for users.

I can only advise against buying links. I am very sure that sooner or later the scheme will be exposed, and there is always the danger of being caught. And you can also ask yourself: what would the purchased links actually achieve? Real recommendations through real links should bring visitors to your website. Can cheap links from the bargain bin achieve that?

SEO Mistake 4: You have a duplicate content problem

Uhhh, a classic! What happens if you accidentally create your content twice or three times in your system, or if your system does it for you? You get duplicate content.

That happens super fast. In online shops, duplicate content arises when a product is available under several URLs, for example because it can be reached from two categories. The same applies to color variants in shops. But even on websites without a shop, duplicate content can quickly occur.

Have you set up a redirect if your site is reachable via both http and https? Do you have large blocks of text duplicated across multiple pages? These are the typical duplicate content errors, but there are far more.
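On an Apache server, the http-to-https redirect can be set up in the .htaccess file. A minimal sketch (replace domain.de with your own domain; on nginx or other servers the syntax differs):

```apache
# Send every http request permanently (301) to the https version
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://domain.de/$1 [L,R=301]
```

The 301 status tells Google the move is permanent, so only the https version gets indexed.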

The problem Google faces is: which page is the most relevant? If there are several identical pages, this can hurt the ranking. In the worst case, your well-ranked page will drop because Google has several URLs to choose from for a search query. Then both URLs suffer and neither appears in the top 5 in Google. Oops!


Content must be unique. If you have duplicate content, Google will evaluate this and then decide which URL is most relevant for a search query. As a result, the other URLs rank worse or not at all in Google.

Duplicate content can also arise if your website is available in different languages, for example an English and a German version of the same pages.

The solution is quite simple: if you have the same content on several pages, choose one URL and redirect all the others to it. If you want to offer your site in different languages, use subdomains or subfolders. Do not use different domains, because each would count as a new website and you would have to start from scratch with the ranking.
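If you run several language versions in subfolders, you can additionally tell Google which URL belongs to which language via hreflang link tags in the head of each version. A sketch, assuming a hypothetical English version under /en/ and a German one under /de/:

```html
<!-- Placed in the <head> of every language version of the page -->
<link rel="alternate" hreflang="en" href="https://domain.de/en/page/" />
<link rel="alternate" hreflang="de" href="https://domain.de/de/page/" />
<!-- Fallback for visitors in all other languages -->
<link rel="alternate" hreflang="x-default" href="https://domain.de/en/page/" />
```

Each language version should list all versions, including itself, so the annotations confirm each other.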

SEO Mistake 5: You’re not doing SEO

As an SEO expert, this is in my opinion the most serious mistake that can mess up your ranking. You may even have already achieved good Google rankings with some of your pages, hooray! Then it's time to stay on the ball, keep an eye on those pages, and keep improving: do SEO!

It’s called search engine optimization, not I-publish-a-text-and-Google-swallows-it! After all, other companies are also playing the ranking game and dutifully updating their content. Why should the search engine show your five-year-old text first?

Start by researching your target topics and keywords with tools like Ahrefs or Ubersuggest to find topics and keywords that are not too difficult but have a high monthly search volume.

If you’re new to SEO, start with the basics. That way you’ll also avoid the worst SEO mistakes! For example, we have a free eBook that you can download to understand the basics of SEO, i.e. search engine optimization!

Tim, Founder at Datacrypt Ltd
Tim is the founder of Datacrypt.io and an expert in web design, SEO, eCommerce, and content management.