Digital marketers also refer to off-page SEO as:
- Off-site SEO
- Off-page activities
- Citations to a website
This article covers:
- The definition of off-page SEO
- Types of links & objectives
- Types of bots
- Types of websites for off-page SEO
- Why off-page SEO has to be done for a website
- Picking the right anchor text
- Whether off-page is needed for credible or content-rich websites
- The alternatives when building backlinks is not possible
- Off-page explained technically
- Picking the best websites for off-page / criteria for finalizing the websites
- Tools used for off-page SEO
- Do's and Don'ts in off-page SEO
- Understanding the benefits of off-page SEO
- The easy way of shortlisting the best domains for your website
- What spam links are
- Finding spam links
- Acting on spam links
What is Off-page SEO?
Off-page SEO comprises the activities performed outside of your website/web pages, on other websites, which in turn can help your website's performance on search engines or help you get more referral and social traffic.
- It can be a link to your website from another website: a hyperlink with or without anchor text.
- It can be just a listing of your brand/domain name.
- It can be an image post linking back to your website (usually called a badge).
Terminologies in Off-page SEO
- Trust Flow: measures the quality of backlinks and the trustworthiness of the websites linking to yours.
- Citation Flow: measures the number of links flowing to your website from other websites.
- Domain Authority: a score summarizing a website's crawl frequency, number of backlinks, and trust.
- Page Authority: the same measure at the level of an individual web page.
- Backlinks: links coming from external websites to your website.
- Anchor Text: the text inside an anchor tag that informs bots and users about the destination page.
- Anchor Tags: the HTML tags that direct you to a planned destination page through the anchor text.
- Hyperlink: a link that takes you to a destination page.
- No-follow & Do-follow: backlinks are categorized by whether bots can crawl your website through them; bots cannot crawl your website through no-follow links, only through do-follow links.
- Referring Domains: a domain that provides a backlink from any of its web pages is called a referring domain.
- Referral Traffic: traffic that comes via other websites, through either do-follow or no-follow links.
- Citations: backlinks or references to your website from other websites.
Types of Bots you should be welcoming to your website:
As a webmaster or SEO professional you need to understand this, because you cannot build links to your website indiscriminately. Any activity done without understanding the types of bots can affect your website negatively, and there is a very real chance of search engines blacklisting your website from further crawling and indexing, so choose the websites you get backlinks from wisely.
Types of Bots:
- Web Bots
- Mobile Bots
- Image Bots
- Video Bots
- Region Specific Bots
- Language Specific Bots
- News Bots
You should not invite a bot from a non-English website to an English website, especially when those languages are segregated at a country level.
India has multiple languages, so it is quite plausible for the English version of an Indian website to receive links from Indian-language websites such as Hindi, Telugu, Kannada, Tamil, or Malayalam sites.
But the English version of an Indian website receiving links from a Spanish, French, or other website where neither language is Indian looks suspicious, especially if your business is specific to one country such as India. If you have a global presence, you should still guide bots to web pages whose language, or whose TLD, matches those regions.
Inviting news bots may not be a problem if you have taken care of the language, as it shouldn't look suspicious to bots. Also, make sure your website has lots of fresh content if you are getting links from news websites consistently.
As for image/video bots, you can always invite them, but make sure you are serving something relevant; the most relevant content for them is a repository of images/videos, or a single image or video.
Most web pages contain images, so these bots find value in crawling your pages; however, an image or video bot visiting a webpage prioritizes the images or videos. That doesn't mean they don't help crawl the text; they do, but not the way a web bot or a mobile bot does.
Types of Off-page based on Objectives:
1. The objective is to drive referral traffic:
These off-page activities are done only to drive traffic to the website from external websites, and in this case:
- The anchor tags include the attribute rel="nofollow", so only actual users can click and visit the website; a bot never follows a link whose anchor tag carries rel="nofollow". The objective here is purely to drive traffic.
- The links are considered no-follow links.
- E.g.: an activity done on social media platforms drives real visitors to your website but does not allow bots to crawl your website through those links.
2. The objective is to improve crawl stats:
Every website that is submitted for crawling, or that has a webmaster account communicating with bots, will have the bots crawling it at intervals that depend on the:
- Frequency of content publication
- Brand value
- Type and quantity of links pointing to them
In this case, if we build backlinks from websites that Google bots already crawl, the Google search bots visit our website as well, provided the backlink's anchor tag does not carry rel="nofollow".
- The anchor tags either omit the rel attribute or do not include rel="nofollow" (the default, followable state), so both bots and real visitors can reach your website.
- The links are considered do-follow links, through which bots crawl your website as well.
- E.g.: an activity done on a blog, where the blog describes your business and links to your website without rel="nofollow" in the anchor tag.
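The distinction above is mechanical enough to check programmatically. A minimal sketch (standard library only; the sample HTML is illustrative) that classifies a page's links as do-follow or no-follow by inspecting the rel attribute:

```python
# Classify links on a page as do-follow or no-follow by checking
# the rel attribute of each anchor tag.
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.dofollow, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        # A link is no-follow only when rel explicitly contains "nofollow";
        # the absence of a rel attribute means the link is followable.
        (self.nofollow if "nofollow" in rel else self.dofollow).append(href)

html = '''
<a href="https://example.com/a">Followable link</a>
<a href="https://example.com/b" rel="nofollow">Traffic-only link</a>
'''
parser = LinkClassifier()
parser.feed(html)
print(parser.dofollow)   # ['https://example.com/a']
print(parser.nofollow)   # ['https://example.com/b']
```

Running this kind of check across a candidate referring page tells you immediately whether a link there would pass crawl value or only referral traffic.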
Types of websites that can be used for Off-page SEO:
The websites hosted on the internet can be categorized into various types based on the objectives behind building those portals, or based on the type and frequency of the content published on them.
I am not discussing the social platforms or paid marketing much, and I don't consider them part of off-page SEO, as they can only bring traffic to the website and can never pass link juice (crawl) through them; so let's keep them out of off-page SEO.
The types of websites for off-page SEO are as follows:
Blogs:
Blogs are content portals on which value-added content, usually related to a specific category (and in some cases to various categories), gets published frequently or at irregular intervals.
The published content is usually very informative and valuable to the user, and it covers low-competition search variations, so Google gives it a lot of importance and prioritizes it for crawling; building a do-follow backlink from such websites is very beneficial.
E.g.: Kandra.pro/blog has lots of value-added content, so building a link from Kandra.pro to abc.com would be very useful for abc.com.
Blog platforms: WordPress, Blogspot, Tumblr, and more.
Article sites: Articlemonkeys, sooperarticles, and more.
A piece of content of roughly 200-2,000 words (or any word count) about a category, topic, business, or problem, a casual talk, or a personal experience, with or without references, can be considered a blog post.
Directories:
There are specific types of websites whose objective is to act as a reference or directory for a specific category (or categories) of businesses, with some basic information; in the past these were, in fact, a reference for search engines as well.
A decade or so back (roughly 2000-2012), most directories provided do-follow links, but since around 2012 they have made all of their outgoing links no-follow. Meanwhile, the content on these portals was usually copied, and search engines were able to find it by themselves, so they started deprioritizing directories; the benefit we reap from directories is therefore minimal if the objective is to invite bots to your website through them.
These directories serve the purpose of driving actual users, and there is little SEO advantage from them.
E.g.: Yellow Pages, Sulekha, and more.
As a service provider in an area, you can always list yourself on such websites and provide links back to your website so that actual users can visit your portal.
Social bookmarking websites:
Social bookmarking websites are the ones that syndicate pieces of your content across other websites on the internet. To keep it simple: content submitted to a social bookmarking website is displayed/syndicated on many other websites by that bookmarking site.
In this case, if the bots value the bookmarking website, there is a chance of them crawling your website as well through the bookmarking site. But even these websites have made all of the links published on them no-follow, so there is no real SEO boost beyond the social/referral traffic.
Digg, Delicious, Reddit, and more are some examples of bookmarking websites.
Question & answer websites and forums:
Some websites are platforms/communities for discussion, where anybody can raise a question and the community can answer it. The content published as a question or an answer can include links pointing to other websites; if those links are do-follow, the linked website benefits.
In the case of no-follow links, again it is only real users visiting the website; bots never crawl through them.
E.g.: Quora, Bharatmoms, and more.
Forum websites are places where people ask about things they want to know. You can answer those questions and list your website, with or without a hyperlink, whichever the forum permits.
Business websites:
A business website's main objective is to make money by selling a product or service; it usually has only listings of products or services, focuses on the product catalog, and its content is mostly brand-specific or missing altogether, so building links from these kinds of websites may not be very helpful.
If such a website does publish a lot of value-added content, you can consider building a backlink from it, but never a reciprocal link; reciprocal links on business websites look suspicious to bots.
News websites:
News websites are portals that bots visit very often, sometimes a few tens of times in a minute, as the bots visiting them are news bots. News bots are like any other bots, but they give special importance to websites that search engines have accepted as news sites.
For a news website to be registered as a news site, it has to fulfill Google's criteria; the list, in brief, is as follows:
- The site should publish at least one informative, news-style article every day
- There shouldn't be any duplicate content on the portal
- There should be a proper segregation of news and non-news content on the portal
- Any other website picking up content from this site should use either a canonical or a syndication tag; the syndication tag is the more appropriate one
- There should be a news XML sitemap, built to satisfy the criteria suggested by Google
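For a rough idea of what such a sitemap contains, a minimal Google News sitemap entry looks like the sketch below (the URL, publication name, and dates are placeholders, not real values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://example.com/news/sample-article</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2018-09-07</news:publication_date>
      <news:title>Sample Article Title</news:title>
    </news:news>
  </url>
</urlset>
```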
One can consider building links from a news website if the following are satisfied:
- The website receiving the backlink from the news portal has regular content updates (fresh content)
- The links are made to look natural
- Proper anchor text is used
- The right language and region were considered when picking the news website
- The duration of the backlink
- The link type
- Static links
- Links across the website
- Article links, and more
In the case of do-follow links, these will be considered the most trusted links, and their citation flow will also be very high.
Image websites:
There are tons of image portals on the internet with bots visiting them very often; building links from these websites to the images or image pages of your website can be helpful if the conditions around content type, language, region, and more are satisfied.
Image sites, e.g.: Pinterest, ImageFave, and more.
You can post an image representing your website's brand and hyperlink it back to your website if the image site permits this.
Websites with very high brand searches:
There can be many other types of websites that do not fall into any of the categories mentioned above, but a link pointing from such websites may not cause any harm.
E.g.: bank portals.
Bank portals are often visited by users, who frequently reach them by searching for the brand name (such as 'HDFC bank login' or 'HDFC bank login page'), which typically happens a few million times a month; hence bots prioritize crawling these websites, because users are querying Google for them.
Google prioritizing crawling simply means bots visit these websites very often; any link placed on them pointing to your website may be really helpful if precautions are taken and the link is made to look natural.
Picking the Anchor text for Off-page:
Anchor text plays a crucial role in off-page SEO, so make sure you take care of the following.
<a href="https://kandra.pro">Digital Marketing Consultants</a>
The text between the opening and closing tags ("Digital Marketing Consultants") is the anchor text.
- Understand the product and its search variations, and pick the anchor text based on them
- Do not use the same anchor text for all the backlinks
- Don't use anchor text related to one single page; use anchor text for different products/services/content and make sure there is an even distribution across your web pages
- Don't use empty space or blanks as anchor text
- Don't use dots or other characters as anchor text; always prefer understandable text
- Don't use numbers unless they are part of pagination or genuinely needed
- Make sure the anchor text you pick is grammatically correct and has no spelling mistakes
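One way to keep anchor text distribution honest is to count how often each anchor appears in your backlink report. A quick sketch (the anchor list and the 30% threshold are illustrative assumptions, not an official rule):

```python
# Flag anchor texts that make up too large a share of a backlink
# profile; heavy repetition of one anchor looks unnatural to bots.
from collections import Counter

def overused_anchors(anchors, max_share=0.3):
    """Return anchor texts whose share of all backlinks exceeds max_share."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items() if n / total > max_share}

# Hypothetical anchor texts pulled from a backlink report.
anchors = ["digital marketing", "digital marketing", "digital marketing",
           "seo services", "content marketing"]
print(overused_anchors(anchors))  # {'digital marketing': 0.6}
```

Anything this flags is a candidate for diversification across other product/service pages.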
What happens with the wrong anchor text:
- The backlink profile will look suspicious
- Users may not understand the destination content and hence may not visit the page the anchor tag points to
Why Off-page SEO has to be done for a website?
Answering (or even understanding) why off-page SEO is necessary requires an understanding of how search engines function; below is a brief, simple explanation of search engine functionality and its objective.
How do search engines work and serve exactly the results one meant to find?
Answering this is a bit technical, but it has been simplified and explained as follows.
You can read more on "How exactly search engines function"
Search engines work much the same way as searching your computer for saved files does, but on internet servers this happens very intelligently, primarily for the following reasons.
- There exist millions of similar files with the same name.
- Privacy matters.
- There is huge competition.
- Relevancy by time and, to a larger extent, by content always matters.
- The cost, i.e. the bandwidth of the search engines' bots/servers, matters.
So, to serve the appropriate file to the user, search engines need to check several factors, including the ones mentioned above.
Search engines serve pages at the top of the searches when their database is frequently updated and they have a deep understanding of that particular webpage.
A better, deeper understanding of the content happens only when they read/crawl the webpage or content a number of times, in order to fulfill the objective of serving the most recent/fresh and relevant content from the internet.
To understand a webpage's content better, Google's spiders visit the page very often, crawl all the content, and update their database accordingly. Frequent visits by spiders happen only when the web pages are often updated with new content; practically, this is not possible for 90% of websites, especially those that don't update content often, but webmasters still want the spiders to come and crawl their websites.
So, to improve the crawl frequency of their websites without any new content, they build backlinks from the various kinds of websites listed above.
Is off-page compulsory?
Whether off-page is really needed for credible or content-rich websites is a question that has bothered many webmasters for a decade or more. I would say off-page is not compulsory; it becomes necessary for your website only when:
- Your website doesn't have frequent content updates
- Your business is highly competitive on the internet
- You intend to see quick results and cannot wait three months
- Your internal linking is poorly built on the website
- You are not a well-known brand
- You get very little traffic from external websites and intend to increase it
What is the alternative to Off-page SEO?
I have come across many questions from my trainees, clients, and peers, such as the following:
- How to avoid building backlinks
- What’s the best alternative of Off-page SEO
- When Can I stop building backlinks/Off-page SEO
- SEO without Off-page/Backlinks
I don't really suggest skipping backlinks; you should have links built from other websites, and every website needs them. But you cannot do it forcefully; you have to acquire them naturally by creating value for the user.
The alternatives for the off-Page SEO can be as follows:
- Branding activities such as TV ads, radio ads, hoarding ads, or any offline ads may be equivalent to building a few hundred credible backlinks, and a branding activity improves the organic rankings of a website considerably.
- Through branding, you will start seeing most of your non-brand keywords coming to the top of the SERP as well.
Read my article on how branding helps improve the SERP of your non-brand keywords.
Publishing lots of value-added content on a regular basis and inviting bots:
- If you are not building backlinks, you can publish content on a regular basis and invite bots through Fetch as Google from your Google webmaster account.
- Fetch as Google on newly written content is equivalent to building a backlink for that article; if you continue publishing articles and fetching as Google every time, that is equivalent to building a link every time you publish content.
Adjusting the crawl frequency for the website/web pages:
- Through sitemaps.
- Through settings in the webmaster account.
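As an illustration of the sitemap route, an entry can hint at how often a page changes (the URL and values below are placeholders, and changefreq/priority are hints to crawlers, not commands):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/latest-post</loc>
    <lastmod>2018-09-07</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```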
Work on Website internal Linking:
It is possible that your website already has good crawl frequency, with bots mostly visiting the home page or another page, and that bot frequency alone can sometimes be good enough. You just have to make sure that as many pages as possible are accessible to bots and that the bot bandwidth/energy is evenly distributed across your web pages, which happens only through good internal linking.
Note: Branding and publishing content may help the crawl frequency of your website, but there is nothing wrong in improving that crawl frequency further by inviting bots from other relevant websites. This actually helps you see quick progress in your keywords' SERP.
So now a question comes: which kinds of websites would be good to build a link from?
Since the concept is clear to us, we need to prefer websites that spiders visit very often, and then get links from them.
The websites Google spiders visit most often are generally the ones where content updates happen very frequently, the ones with good brand searches, or the ones with many backlinks.
Criteria in finalizing the websites for Off-page:
If you are struggling with finding the solutions for any of the following,
- Picking the best websites for off-page
- How to find the best websites for off-page
- Which kind of websites are best for off-page
I have a set of details answering these questions, and you can start practicing the same in your off-page SEO to see great results.
Whatever the category of the website, you still have to consider a set of metrics before getting backlinks built from it; the metrics list is as follows.
Let's assume we are going to build links from source websites of the same region and language as the destination website; the following criteria should be considered in shortlisting the websites.
- The citation flow and Trust flow of the website or Domain authority (DA)
- Citation flow is a number measuring the number of backlinks a website has; the higher the citation flow, the better.
- Domain authority is another way of measuring the crawl frequency of a website; the higher the domain authority, the better.
- Domain authority and citation flow both imply that the website has a higher crawl frequency.
- The organic traffic of the website
- Good organic traffic implies that the website has great crawl frequency and is well optimized for search engines; building a link from such a well-optimized website with good organic traffic is very advantageous.
- Websites hosted on good/dedicated servers that load quickly
- Finding the hosting and server performance may not be easy, so understanding the load times of the websites is very useful. The better the load times, the better the crawl stats, so building links from such websites may also be useful if they qualify from all other perspectives.
- Internal link strategy of the website
- A website with poor internal linking may not benefit you, especially when links are built from content pages that get archived or become orphans without any internal links. Make sure the website you are going to build links from has very good internal linking.
- Topical relevancy & Topical value of the portal
- Topical relevancy (topical value) is what lets bots consider your website for frequent crawling; the lower the relevancy, the lower the crawl frequency, so consider building links from websites that are highly relevant to yours.
- Topical relevancy is measured as trust flow by some of the tools on the internet.
- The first crawled and last crawled of the website
- If the first-crawled and last-crawled dates are very close to each other and far from the current date, the website may not be very useful, as your backlinks there will be crawled rarely or not at all since its latest crawl date.
Note: Details such as organic traffic, trust flow, citation flow, topical relevancy, and many others may not be easily available; you might have to rely on a tool for the data or check for them manually.
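The criteria above can be turned into a simple filter once you have the metrics from a tool. A sketch under stated assumptions: the thresholds, the evaluation date, and the candidate domains below are all illustrative, not official cut-offs.

```python
# Shortlist candidate referring domains by trust flow, citation flow,
# and crawl recency, per the criteria discussed above.
from datetime import date

def shortlist(domains, min_trust_flow=20, min_citation_flow=20,
              max_days_since_crawl=30):
    """Return domain names meeting all thresholds (thresholds are assumptions)."""
    today = date(2018, 9, 7)  # fixed evaluation date for this sketch
    picked = []
    for d in domains:
        fresh = (today - d["last_crawled"]).days <= max_days_since_crawl
        if (d["trust_flow"] >= min_trust_flow
                and d["citation_flow"] >= min_citation_flow and fresh):
            picked.append(d["domain"])
    return picked

# Hypothetical metrics, as exported from a backlink tool.
candidates = [
    {"domain": "goodblog.example", "trust_flow": 35, "citation_flow": 40,
     "last_crawled": date(2018, 9, 1)},
    {"domain": "staleportal.example", "trust_flow": 50, "citation_flow": 60,
     "last_crawled": date(2017, 1, 1)},
]
print(shortlist(candidates))  # ['goodblog.example']
```

The stale portal is dropped despite strong flow metrics because its last crawl is too old, which mirrors the first-crawled/last-crawled criterion above.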
Tools and Tips for finding the metrics:
SimilarWeb:
SimilarWeb helps you understand traffic details; you can find traffic by country, by device, and by source. If you want to understand organic-source traffic, you can rely on SimilarWeb.
SimilarWeb has an Excel plugin, an API, and a browser extension, which makes it very convenient for webmasters who rely on it for data.
Majestic SEO:
Majestic SEO runs one of the biggest crawlers on the internet, which in the past was actually second only to the Google search crawler. The tool has lots of details about websites, segregated so that it is very convenient for webmasters, especially when they need details like a website's citation flow, the trust flow of links/websites, topical relevancy, crawl dates/times, and more.
The free version of the tool provides very limited data, and you might have to sign up for one of their plans based on your business requirements.
Moz.com Or Moz Tools:
If you are looking for domain authority, you can rely on this tool; alternatively, you can refer to citation flow to approximate DA, since higher-DA websites tend to have higher citation flow, so you can consider either of these metrics.
Alexa.com & Spyfu.com
Alexa and SpyFu are great tools to rely on if you are looking for competitor details like traffic, the regional breakdown of traffic, source breakdown, referring-domain data, keyword data, Alexa ranks, and more.
Tools.pingdom.com and Google speed test Tools
These tools are very helpful if you are trying to understand website performance, load times, and page sizes.
Tips for Off-page SEO:
DO’s & Don’ts
- Build links from the relevant domains
- Consider DA/Citation flow of the website.
- Consider the trust flow of the website from which you are going to get the backlink.
- Consider Popularity/Brand value of the website.
- Maintain quality of content in case of articles/blogs.
- Use high searched queries in the content you generate for blog/article.
- In case of directory listing or local listing prefer appropriate category for posting.
- Maintain a network so that people come across your blog/article and spend time on that page.
- Have distributed anchor texts.
- Never build links from low-quality websites.
- Don’t build links from Penalty websites.
- Don't duplicate the content in case of posting articles or blogs.
- Don’t use same anchor text across.
- Distribute links to other pages of your website.
- Don’t build links from network sites. (Hosted on the same Server).
- Don’t ruin the quality of content in case of blog/article.
- Don't focus on using a single high-volume keyword as anchor text. Use a distributed set of keywords and hyperlinks (URLs).
How to measure the backlinks Performance or Off-page outcome:
How to understand if the backlink has started boosting your website?
How to know if the backlinks are helping to improve the crawl stats of your website?
Many webmasters build backlinks to their websites on a very large scale, investing lots of time and often lots of money, so understanding the ROI of that effort is very important; but the question is how.
Yes, many webmasters and SEO marketers struggle a lot to understand the ROI of the off-page spend, and they often end up either not answering this question or just ignoring it.
As a digital marketer you should understand the process and steps involved in finding the outcome of backlinks; if you are keen on measuring the outcome of off-page SEO, here is a checklist you can follow.
Note: If referral traffic is the only advantage you are looking for through off-page, you can always refer to those details in Analytics; any analytics tool can provide you with referral-source details. In this section, the actual outcome of the backlinks or off-page work is measured as crawl stats / a do-follow backlink from that specific website.
Usually, bots take some time to crawl a website; they visit websites once in a while or regularly, depending on the type of website, the type of content it has, its brand value, the backlinks it has, and more.
Here are some of the amazing ways of measuring the backlinks progress,
- Measure by the crawl stats:
- If you see an improvement in crawl stats after a certain number of days, it may well be due to the external links built, provided there is no change to the website in terms of content, technical tweaks, brand value, or other implementations; that is, everything remains the same and the only activity is building backlinks. But this case is very rare, since websites typically have lots of content updates and technical/SEO tweaks, so this is not a viable model for measuring the backlink outcome.
- Measure by the number of backlinks
- Google webmaster tools
The "Links to your site" section of Google webmaster tools is a great way of understanding backlinks and referring domains; you can always refer to this section to see the number of backlinks a website/webpage receives from external portals.
The only disadvantage is that GWT provides sample data for only about the 1,000 most recent domains, and the sample is based on the most recent links and the biggest link providers, so understanding the outcome from all external websites is not possible.
- Majestic SEO tools or any other tool
Majestic SEO runs one of the biggest crawlers, second only to Google in crawling the internet. The tool is explicitly built to measure backlinks and any backlink-related details at a deeper level; you can find the complete list of domains, the number of links provided by them, the first date, the last date, historical data, and the citation flow and trust flow of external websites.
Webmaster tools and the Majestic SEO tool take some time to reflect the crawl/outcome of the external websites; if you want to know immediately whether they have started boosting your website, here is a quick way of doing it.
If you have built a link from a specific website and want to understand whether it is helping you, simply look at that website's cached version by typing cache:website.com in your browser; Google shows you the date and time the page was crawled, along with its content in HTML, text-only, and source-code views.
To understand this, let's take an example.
Let's assume you have a website, mysite.com, and an external website, yourwebsite.com, provides a backlink to mysite.com from its home page. Say the link went live on yourwebsite.com this morning, September 7th, 2018, and you want to start measuring the same day. Check the cached version of that page: if the cache date and time are from the same day, after your link was published, then your link has probably been crawled by the bots through yourwebsite.com, since yourwebsite.com was crawled after your link went live. If the cached version's date and time are from the previous day, or any day prior to publishing the link, it means the bots haven't yet found the mysite.com link through yourwebsite.com.
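If you are checking cache dates for many referring pages, it helps to build the cache lookup URLs programmatically. A small sketch (the domain is a placeholder; this builds the web-cache URL Google used at the time of writing):

```python
# Build the Google cache lookup URL for a referring page so its
# last-crawled timestamp can be checked in a browser.
from urllib.parse import quote

def cache_url(page):
    """Return the Google web-cache lookup URL for a page."""
    return ("https://webcache.googleusercontent.com/search?q=cache:"
            + quote(page, safe=":/"))

print(cache_url("yourwebsite.com"))
# https://webcache.googleusercontent.com/search?q=cache:yourwebsite.com
```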
Most of you would have heard the words spam, Penguin update, and more around the impact of bad links; before we get into this, let's understand what these are.
- What are spam links / spam websites?
Any external link that doesn't provide any boost to your website, and instead affects the performance of your website, its servers, or its crawl stats and leaves misleading information for bots, can be considered a spam link.
- What kinds of backlinks are considered to be spam
- Too many links from irrelevant portals
- Too many links from portals in an irrelevant language
- Too many links from regions where your business doesn't exist, with no valid explanation for why links come from that region
- Links from portals with very thin or plagiarised content
- Links from portals affected by the Penguin updates
- Links from portals that have irrelevant bot traffic
- Reciprocal backlinks
- Why spam links have to be removed or Disavowed
Spam links may affect the crawl stats and server performance of the website, or leave the website prone to a Penguin algorithm penalty, which in turn impacts the keywords' SERPs or may even remove the website completely from Google's index.
- Ways of figuring out the spam effect
- Run for a language check
- If you find backlinks from external websites that are in no way related to your language, take a call on such links, as they might affect your website negatively.
- Top TLD checks
- Consider TLDs pertaining to your country; if your business has a global presence, you can ignore links coming from other countries. But if your business is local to your country, make sure you don't have links from other countries' TLDs; this may look suspicious to search engines.
- Domain Authority check, if it looks suspicious beyond 70 take action
- The DA of most domains will be less than 70; very few portals on the internet have a DA greater than 80 or 90, and those are extremely strong in terms of business, revenue, presence, content, and more. Meanwhile, you might come across websites that are not great from any of these perspectives but still have very high DAs; you may have to manually audit such websites and understand them thoroughly in terms of content, popularity, citations, and more.
- Do a site:yourdomain.com search; if the domain isn't indexed, there is a chance it has been penalized.
- The biggest effects of spam links
- Crawl stats fall drastically
- Google starts dropping the cached web pages from its index
- Google never lets your keywords reach better positions on the SERP
- The website's home page may get removed from Google's index completely
- Action on spam links:
You can remove the links yourself if you built them.
If you did not build them, or there is a huge number of such links, send a file to the Google spam team through the webmaster disavow tool, and they will take care of it.
You have to download all of your external links, either from your Google webmaster tools account or using tools like Majestic SEO or Ahrefs, which can provide a very detailed report of referring domains along with details such as TLD, IPs, subnets, trust flow, citation flow, DA, language, category, and more. Filter out the spam domains as per the instructions above. Once the list is finalized, submit it through Google webmaster tools and ask them not to consider any backlinks from the domains in the list; that is how you safeguard your website from spam links.
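For reference, the disavow file you submit is a plain-text list; a minimal sketch is below (all domains and URLs are placeholders):

```text
# Disavow file submitted through Google's disavow tool.
# Lines starting with "#" are comments and are ignored.
# Disavow every link from an entire domain:
domain:spammydirectory.example
domain:thincontentportal.example
# Disavow a single spammy page by its full URL:
http://irrelevantblog.example/cheap-links.html
```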