
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi guys, hope you had a fantastic bank holiday weekend. Quick question re URL parameters: I understand that links which pass through an affiliate URL parameter aren't taken into consideration when passing link juice from one site to another. However, when a link contains a tracking URL parameter (let's say gclid=), does link juice get passed through? We have a number of external links pointing to our main site; however, they are linking directly to a unique tracking parameter. I'm just curious to know about this. Thanks, Brett

    | Brett-S
    0

  • Hi there, We are in the process of moving 2 sites with higher page authority to another site we own (that is our company brand), so essentially 3 sites into one. We're at risk of losing a lot of SEO from the original 2 sites that have all the product information. We are doing this since we merged companies a couple years back and need one web presence. Anyhow, the site launch date is in 3 months and the recommendation is to start moving content over prior to that for top pages, which is a big undertaking when we are launching all the pages again with new content, a redesign and moving sites in 3 months. If it's the right move, we should do it, but I just wanted to get opinions on how others have handled something similar when moving to a site with lower site authority and trying not to lose rankings.

    | lauramrobinson32
    0

  • I had previously asked this question, where the issue turned out to be that I didn't have all the URLs in Google Search Console. Whoops! So I have added 4 properties that are really all the same property: https:// https://www http:// http://www I have added all of these. This has raised a few more questions: Can I get Google Search Console to treat these (and even group these together) to show as one property? Right now they are all listed separately. I know in Site Settings you can set a Preferred Site. Even so, they show as separate sites with data reported separately. Can I merge these? What about Moz? Should I do something similar to see traffic for each of these in Moz? It looks like we are missing a ton of info. Does Moz get this from GSC automatically? What about sitemaps? Can I fix this in sitemaps? Do I need separate sitemaps for each property?

    | TapGoods
    0

  • I have a client who is wanting to target searches for competitors products. His idea was to purchase domains related to the searches he's targeting (for example, people looking for another company's app) and to build out one page websites addressing the search query and why a customer would choose his app solution over a competitor. I know he'd have to build a handful of links to each site for any chance of success but I wanted to ask the following.. Would doing this be better than just building pages addressing the searches on his main website domain? Is there an SEO risk to doing this? Potential for a penalty? Anything we need to do to structure these in a way that won't violate Google's SEO guidelines? Any other thoughts on pros and cons of each strategy? Thank you! Ricky

    | RickyShockley
    0

  • Recently started working with a large site that, for reasons way beyond organic search, wants to forward internal pages to a variety of external sites. Some of these external sites that would receive the content from the old site are owned, admin'd and/or hosted by the old site; most are not. All of the sites receiving content would be a better topic fit for that content than the original site. The process is not all at once, but gradual over time. No internal links on the old site to the old page or the new site/URL would exist post content move and 301ing. The forwarding is mostly to help Google realize the host site of this content is not hosting duplicate content, but is the one true copy. Also, to pick up external links to the old pages for the new host site. It's a little like a domain name change, but not really, since the old site will continue to exist and the new sites are a variety of new/previously existing sites that may or may not share ownership/admin etc. In most cases, we won't be able to change any external link pointing to the original site and will just be 301ing the old URL to the content's new home on another site. Since this is pretty unusual (like I wouldn't get up in the morning and choose to do this for the heck of it), here are my three questions: Is there any organic search risk to the old site or the sites receiving the old content/301 in this maneuver? Will the new sites pick up the link equity benefit on pages that had third party/followed links continuing to point to the old site but resolving via the 301 to this totally different domain? Any other considerations? Thanks! Best... Mike

    | 94501
    1

  • We have pulled our interlinking counts (internal links only, not outbound links) from Google Webmaster Tools and discovered that the interlinking counts of our most significant pages are lower than those of less significant pages. Our objective is to reverse the existing behavior by increasing the interlinking count of important pages and reducing the count for less important pages, so that maximum link juice is transferred to the right pages, thereby increasing SEO traffic.

    | vivekrathore
    0

  • Howdy, fellow Mozzers. I got approached by my friend - their website is https://www.hauteheadquarters.com She is saying that they dropped from the Google index overnight - and, as you can see if you google their name, website URL or even site:, most of the pages are not indexed. The home page is nowhere to be found - that's for sure. I know that they were indexed before. Google Webmaster Tools doesn't show any manual actions (at least yet). No sudden changes in content or backlink profile. robots.txt has some weird rule - disallow everything for EtaoSpider. I don't know if Google would listen to that - the robots checker in GWT says it's all good. Any ideas why that happened? Any ideas what I should check? P.S. Just noticed in GWT there was a huge drop in indexed pages within the first week of August. Still no idea why though. P.P.S. Just noticed that there is a noindex x-robots-tag in the headers... Anyone know where this can be set?

    | DmitriiK
    0
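On the postscript: an X-Robots-Tag header is usually set at the web-server or application layer rather than in the page itself. A sketch of where it often hides, assuming Apache with mod_headers enabled (nginx uses add_header instead; a sitewide directive like this is sometimes left over from a staging setup):

```apache
# A directive like this sends "X-Robots-Tag: noindex" on every
# response it matches and will deindex the whole site:
Header set X-Robots-Tag "noindex, nofollow"

# Removing the line, or explicitly unsetting the header in the
# relevant vhost/.htaccess, fixes it:
Header unset X-Robots-Tag
```

It can also come from application code (e.g. a CMS security plugin), so grepping the config and codebase for "X-Robots-Tag" is a reasonable first step.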

  • So there's no way I could type out my thoughts / questions.   If you're bored, have a cup of coffee and some oreos, and are interested in a strange situation that I can't figure out the solution to, I'd appreciate your input: https://youtu.be/GomGOAdNens I should note ahead of time:  It really is important to me to do things "by the book" and "to the letter" with SEO.   So while there may be some gray areas, I would like to really do the one thing that is BEST in this situation.  Thanks for any input.

    | HLTalk
    1

  • Print pages on one of our sister sites are returning 404's in our crawl but are visible when clicked on.  Here is one example: https://www.theelementsofliving.com/recipe/citrus-energy-boosting-smoothie/print Any ideas as to why these are returning errors? Thank you!

    | FirstService
    0

  • Howdy guys, I'm wondering if AMP is worth integrating into a responsive e-commerce site? I'm under the impression that the benefits of AMP would be focused around speed; however, it may come at the cost of conversion rate if it was delivered for product pages, etc. I'm presuming that even if AMP was on every page across a responsive e-commerce site, Google would only display AMP pages in the carousel for news articles, such as on the integrated blog? Any advice would be awesome! Thanks guys 🙂

    | JAR897
    0

  • Spam links showing up on non-www version only. Why is this? And is it still important to remove these links even though the preferred domain is www?

    | LinkRightMedia
    0

  • I have a custom website where if you type in companyxyz.com/any-made-up-url it displays the homepage. So then you will see the homepage while the made-up URL path remains visible in the URL bar: "companyxyz.com/any-made-up-url" Is this good or bad or not an issue?

    | Rich_Coffman
    0

  • Hi, I have about 5,000 new URLs to publish. For SEO/Google - Should I publish them gradually, or all at once is fine? *By the way - all these URLs were already indexed in the past, but then redirected. Cheers,

    | viatrading1
    0

  • 1) I've been told that other sites linking to my site with keyword-rich text are bad. 2) But Google Console / Analytics shows that we rank extremely high for random, pointless phrases loosely tied to the topic of our site. Like "dht blocker". (It's a hair loss site.) 3) This week I began analyzing our backlinks. Guess what I found? Literally hundreds of bot-created spammy trackback and pingback text links around the phrase "dht blocker". It seems to me that keyword-rich anchor text on external sites is NOT a bad thing. In fact it's an outstanding way to rank better for your desired keywords. Obviously the "bad" is the spam element. Probably the high quantity. On unrelated websites. But guess what? It worked. We are ranking extremely well for these pointless phrases, thanks to these spam bots. Obviously we will be disavowing all these sites. But I want to start building quality links via legitimate, honest means. So here is my question: If I begin a legitimate, honest link-building campaign with other websites, and request that they put the HREF around our most coveted keyword phrase - is this inherently BAD? Or is it actually possibly GOOD? Thoughts?

    | HLTalk
    1

  • Hi Mates, Currently we are using two tags for language on our site (we are targeting English) ... and these are defined in the head section. My question: is this required by Google in order to rank well, or is it deprecated? Thank you Claudio

    | ClayRey
    0
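If the two head tags in question are hreflang annotations: they aren't required to rank, but they help Google serve the right language version when a site has (or will have) several. A minimal self-referencing pair for an English-targeted site might look like this (URLs are placeholders, not the poster's real site):

```html
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

For a single-language site they are harmless but optional; they only become important once alternate language versions exist.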

  • I have a URL that is redirecting three times. How do I fix that? I am using WP and running the Really Simple SSL plugin. Is it OK to also use a redirect plugin? Here is what the problem is: http://www.thepatrickmullin.com/ redirects to https://thepatrickmullin.com/ through a redirect chain, which hurts your rankings: http://www.thepatrickmullin.com/ -->
    https://www.thepatrickmullin.com/ -->
    https://thepatrickmullin.com/ Any help would be greatly appreciated!

    | pmull
    0
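A chain like this is usually fixed with one rule that sends every variant straight to the final URL in a single hop, rather than stacking plugin redirects. A sketch assuming Apache with mod_rewrite (if the host runs nginx, the equivalent is a pair of server blocks); with Really Simple SSL active, its own redirect should be disabled so the rules don't double up:

```apache
RewriteEngine On
# Send any request that is http:// or has a www host directly to
# the canonical https://thepatrickmullin.com/ in one 301:
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://thepatrickmullin.com/$1 [L,R=301]
```

After deploying, each of the three starting URLs should resolve to the final https non-www URL in exactly one redirect.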

  • Hi there, We've recently taken on a new developer who has no experience in any technical SEO, and we're currently redesigning our site www.mrnutcase.com. Our old developer was up to speed on his SEO, and we never really had to worry about technical issues. I'm using Moz as a tool to go through crawl errors on an ad-hoc basis. I've noticed just now that we're recording a huge amount of duplicate content errors ever since the redesign commenced (amongst other errors)! For example, the following page is duplicated 100s of times: https://www.mrnutcase.com/en-US/designer/?CaseID=1128599&CollageID=21&ProductValue=2293 https://www.mrnutcase.com/en-US/designer/?CaseID=1128735&CollageID=21&ProductValue=3387 https://www.mrnutcase.com/en-GB/designer/?CaseID=1128510&CollageID=21&ProductValue=3364 https://www.mrnutcase.com/en-GB/designer/?CaseID=1128511&CollageID=21&ProductValue=3363 etc etc. Does anyone know how I should be dealing with this problem? And is this something that needs to be fixed urgently? This problem has never happened before, so I'm hoping it's an easy enough fix. Look forward to your responses and greatly appreciate the help. Many thanks, Danny

    | DannyNutcase
    0

  • Hi I'm researching competitor backlinks and they have a lot of directory links which are nofollow - but they rank very well. Is this type of link building even allowed by Google? I know they aren't allowed followed directory links, but will nofollowing them help with rankings?

    | BeckyKey
    0

  • Hi I have seen a big drop in a keyword going from position 3 to out of the top 100. The only thing I can see that went wrong, was an issue with broken images - could this be the reason for the drop? Becky

    | BeckyKey
    0

  • Hey, During the year I have done everything in my power to please Google with my website. Instead of building links towards the page I have focused on content, content and content. In addition I have worked with HTTPS and page speed. Today my site is faster than 98% of all tested sites in Pingdom Tools and scores 94/83 in Google Insights. Of course we have had to build some links as well, perhaps 50 links in 8 months. At the same time we have built 700 pages of text. The total amount of links built is 180 over 20 months. On Thursday last week it looks like the site was penalized by Google. I still believe that we can do something about it and get the site back on track again. Hence we have been looking at technical things on the site, to see if there is anything Google doesn't like. One thing that I have found is structured data. For some reason this has dropped from 875 a month ago to 3 today. I have no clue why. Does anyone know how structured data works and what can have caused this problem? Would it be possible that we, in our attempt to optimize the site, might have done something that affects the structured data? http://imgur.com/a/vurB1 In that case, what effect might this drop in structured data have on SEO? Could that be a reason for the total drop in ranking? (We have basically been wiped on all our keywords.) From what I can see in Google Webmaster Tools, about 975 pages are still indexed in Google, which has been stable for a long time. Does anyone know more about structured data and what I can do about this?
    Thanks in advance! /A

    | Enigma123
    0

  • When looking at my backlinks if I see something like this: www.domainPizza.net
    www.domainPizza.com
    sub.domainPizza.com
    www.domainpizza.org
    domainPizza.net
    https://domainpizza.com
    https://www.domainpizza.net What is the actual list of disavows that I put into the file if I want to disavow this domain?  I am seeing so many variations of the same domain. Thank you.

    | HLTalk
    0
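In a disavow file, the domain: directive covers every scheme and subdomain variant at once, so a list like the one above usually collapses to a handful of entries. A rough Python sketch of the normalization (the "keep the last two host labels" heuristic is an assumption for illustration and mishandles multi-part suffixes like .co.uk):

```python
from urllib.parse import urlparse

def disavow_entries(backlinks):
    """Collapse scheme/subdomain variants of backlink URLs into unique
    registrable domains, formatted as disavow-file 'domain:' lines.
    Naive heuristic: keeps the last two host labels only."""
    domains = set()
    for link in backlinks:
        # urlparse only recognises a host when '//' is present
        host = urlparse(link if "//" in link else "//" + link).hostname or ""
        labels = host.lower().split(".")
        if len(labels) >= 2:
            domains.add(".".join(labels[-2:]))
    return ["domain:" + d for d in sorted(domains)]

entries = disavow_entries([
    "www.domainPizza.net", "www.domainPizza.com", "sub.domainPizza.com",
    "www.domainpizza.org", "domainPizza.net",
    "https://domainpizza.com", "https://www.domainpizza.net",
])
# The seven variants reduce to three lines:
# domain:domainpizza.com / domain:domainpizza.net / domain:domainpizza.org
print("\n".join(entries))
```

The resulting lines go straight into the disavow .txt file; individual url lines are only needed when you want to keep the rest of a domain.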

  • Not sure how to handle this one.   Simply because there are SO MANY .... I want to be careful not to do something stupid ... Just a quick 3 minute video explanation:  https://youtu.be/bVHUWTGH21E I'm interested in several opinions so if someone replies - please still chime in. Thanks.

    | HLTalk
    0

  • We have a 302 redirection on some of our pages which involve login/account pages. So, some pages are 302 (temporarily) redirected to the login pages, which is common, especially in e-commerce sites (see screenshot). For SEO practices, what would be best to address this (if this is an issue)? a. Block the login/account pages using robots.txt? b. Block the login/account pages using meta noindex? c. Leave them as is since it's a non-issue. d. Other recommendations, please specify in the answers. Thanks!

    | jayoliverwright
    0
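A common pattern here is option (a): keep the 302s, but stop crawlers from requesting the account URLs at all. A sketch, assuming the login/account pages live under paths like these (adjust to the real URL structure). One caveat on option (b): a meta noindex only works if the page is NOT blocked in robots.txt, because Google must fetch the page to see the tag, so pick one mechanism rather than combining them:

```text
User-agent: *
Disallow: /login
Disallow: /account
Disallow: /checkout
```

This mainly saves crawl budget; the 302s themselves are generally harmless for pages that were never meant to rank.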

  • Are there tools out there that can check how frequently a website is updated with new content/products? I'm trying to do an SEO analysis between two websites. Thanks in advance Richard

    | seoman10
    0

  • Hey all, When I run crawl diagnostics I get around 500 medium-priority issues. The majority of these (95%) come from issues with blog pages (duplicate titles, missing meta desc, etc.). Many of these pages are posts listing contest winners and/or generic announcements (like, "we'll be out of the office tomorrow"). I have gone through and started to fix these, but as I was doing so I had the thought: what is the point of updating pages that are completely worthless to new members (like a page listing winners in 2011, in which case I just slap a date into the title)? My question is: Should I just bite the bullet and fix all of these, or should I delete the ones that are no longer relevant? Thanks in advance, Roman

    | Dynata_panel_marketing
    1

  • Hello here. Our website, virtualsheetmusic.com, is pretty popular in the sheet music realm, and we used to rank on the first page for the keyword "violin sheet music" until a few weeks ago with our violin dedicated page: http://www.virtualsheetmusic.com/downloads/Indici/Violin.html But a couple of weeks ago we dropped beyond the 5th page on Google (I can't even find us!) and I have no idea why. Most of our top ranking pages are still there though. This never happened before, after 17 years on the web. Do you have any idea why that could have happened?

    | fablau
    0

  • Hi all, hope you're all good and having a wonderful Friday morning. At the moment we have over 20,000+ live products on our ecomms site; however, all of the products are using non-SEO-friendly URLs (/product?p=1738 etc.) and we're looking at deploying SEO-friendly URLs such as (/product/this-is-product-one) etc. As you can imagine, making such a change on a big ecomms site will be a difficult task and we will have to take on A LOT of content changes, hreflang changes, affiliate link tests and a big 301 task. I'm trying to get some analysis together to pitch the Tech guys, but it's difficult. I do understand that this change has its benefits for SEO, usability and CTR - but I need some more info. Keywords in the slugs - what is their actual SEO weight? Has anyone here recently converted from using parameter-based URLs to keyword-based slugs and seen results? Also, what are the best ways of deploying this? Add a canonical and 301? All comments greatly appreciated! Brett

    | Brett-S
    0
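On the deployment question: the usual pattern is a one-hop 301 from each parameter URL to its new slug, plus a self-referencing canonical on the new page (a canonical alone, without the 301, leaves the old URLs live and splitting signals). A sketch assuming nginx, with illustrative IDs and slugs; in practice the lookup table would be generated from the product database:

```nginx
# Map the old ?p= parameter value to the new slug (generated file):
map $arg_p $product_slug {
    default "";
    1738    "this-is-product-one";
    1739    "this-is-product-two";
}

server {
    listen 80;
    location = /product {
        # 301 /product?p=1738 -> /product/this-is-product-one
        if ($product_slug != "") {
            return 301 /product/$product_slug;
        }
    }
}
```

The same idea works in Apache with a RewriteMap; the key point is that every old URL reaches its final slug in a single redirect.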

  • Should I expect my rankings to do any weird things over the next couple of days or weeks?

    | Edward_Sturm
    0

  • I have multiple clients whose Google Analytics accounts are showing me that some of the top performing organic landing pages (in terms of highest conversion rates) look like this: /cart.php /quote /checkout.php /finishorder.php /login.php In some cases, these pages are blocked by Robots.txt. In other cases they are not even indexed at all in Google. These pages are clearly part of the conversion process. A couple of them are links sent out when a cart is abandoned, etc. - is it possible they actually came in organically but then re-entered via one of these links which is what Google is calling the organic landing page? How is it possible that these pages would be the top performing landing pages for organic visitors?

    | FPD_NYC
    0

  • I'm working on driving trials for our product - we have a number of blog posts that rank on page #1 of Google, and we get 2-3 trial sign ups per day from them. I'd like to put trial signup boxes about midway down each post to see if I can increase the number of trial signups that come directly from our blog. Do you think I can be "penalized" for this, since it's mid- blog-post content? Do you think Google will view this negatively?

    | Karibeaulieu
    0

  • I am 100% new to the world of WordPress, more advanced CMS systems, and all the interesting things that have been developed over the last several years on the web. After installing WordPress I noticed options to allow pingbacks and trackbacks.  Upon researching these, I found it interesting that you can very quickly and easily identify spam wordpress blog sites that mention your article in a spammy fashion if you leave this option ENABLED.  Because you literally get a notification of new nonsense happening on the web that may affect you. Is this a logical, rational thing to leave enabled and use as one of many tricks to manage new additions to a disavow file?  It almost seems like a Godsend because you don't have to go looking for them.

    | HLTalk
    0

  • Hey Moz squad. I do some SEO work for a multi-location locksmith and garage door service. On our locksmith website we just use regular WordPress landing pages for all of our different cities. (BTW, if anyone has a good site architecture tips blog they know about, send it my way!) So with our garage door site, we bought the Yoast Local plugin. And it wants to make our locations blog posts instead of pages. NOW what do you guys think. Does it matter? I know I have less control over how a blog post looks. But I'm just looking for different opinions. Thanks loves.

    | Meier
    0

  • Hi all, we've recently migrated a site from http to https and saw the majority of pages drop out of the index. https://www.relate.org.uk/ One of the most extreme deindexation problems I've ever seen, but there doesn't appear to be anything obvious on-page which is causing the issue. (Unless I've missed something - please tell me if I have!) I had initially discounted any off-page issues due to the lack of a manual action in SC; however, after looking into their link profile I spotted 100 spammy porn .xyz sites all linking (see example image). There didn't appear to be any historic disavow files uploaded in the non-https SC accounts. Any on-page suggestions, or just play the waiting game with the new disavow file?

    | CTI_Digital
    0

  • Hi Does anyone have any great examples of an ecommerce site which has great content on category pages or product listing pages? Thanks!

    | BeckyKey
    1

  • I have a custom-made blog with boatloads of undesirable URLs in Google's index like this:
    .com/resources?start=150
    .com/resources?start=160
    .com/resources?start=170 I've identified this as a source of duplicate title tags and had my programmer automatically put a noindex tag on all of these undesirable URLs. However, doing a site: search in Google shows the URLs are still indexed even though I put the tag up a few weeks ago. How do I get Google to remove these URLs from the index? I'm aware that the Search Console has an answer here: https://support.google.com/webmasters/topic/4598466?authuser=1&authuser=1&rd=1 but it says that blocking with meta tags should work. Do I just get Google to crawl the URLs again so it sees the tag and then deindexes them? Or is there another way I'm missing?

    | Rich_Coffman
    0
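Short answer: yes — Google only drops a URL after it recrawls the page and sees the tag, which can take weeks for deep paginated URLs; requesting a recrawl in Search Console (or using the URL removal tool for urgent cases) speeds it up. The one thing worth double-checking is that the tag isn't combined with a robots.txt block on the same paths, because a blocked page can never be recrawled to reveal the tag. A sketch of what each ?start= page needs in its head:

```html
<meta name="robots" content="noindex, follow" />
```

"follow" keeps the pagination links crawlable while the pages themselves drop out of the index.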

  • We are using Google Search Console to monitor Crawl Errors. It seems Google is listing errors that are not actual errors. For instance, it shows this as "Not found": https://tapgoods.com/products/tapgoods__8_ft_plastic_tables_11_available So the page does not exist, but we cannot find any pages linking to it. It has a tab that shows Linked From, but if I look at the source of those pages, the link is not there. In this case, it is showing the front page (listed twice, both for http and https). Also, one of the pages it shows as linking to the non-existent page above is itself a non-existent page. We marked all the errors as fixed last week and then this week they came up again. 2/3 are the same pages we marked as fixed last week. Is this an issue with Google Search Console? Are we getting penalized for a non-existent issue?

    | TapGoods
    0

  • Hi We have an issue with images on our site not being found or indexed by Google. We have an image sitemap, but the images are served on the Sitecore-powered site as div background images, which Google can't read. The developers have suggested serving Googlebot a version with a real image src (class="header-banner__image" src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx"), while non-Googlebot visitors receive a noscript fallback containing a div with role="img", aria-label="Arctic Safari Camp, Arctic Canada", title="Arctic Safari Camp, Arctic Canada", class="header-banner__image" and style="background-image: url('/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx?mw=1024&hash=D65B0DE9B311166B0FB767201DAADA9A4ADA4AC4');", plus a lazy-loading element carrying data-src, data-max-width="1919", data-viewport="0.80", data-aspect="1.78" and data-aspect-target="1.00". Is this something that could be flagged as potential cloaking, though, as we are effectively showing different code just to the Googlebot user agent? The devs have said that, via their contacts, Google has advised them that the original way we set up the site is the most efficient and considered way for the end user. However, they have acknowledged the Googlebot software is not sophisticated enough to recognise this. Is the above solution the most suitable? Many thanks Kate

    | KateWaite
    0

  • After having read this thread, the answer seems to be a tentative "Yes", but I am curious if I am doing this wrong, or causing myself problems, for a specific situation. We have a thread on the forums that has over 50,000 views for that thread alone.  No doubt many people have linked to it across the web, and it ranks very well with Google.   But we are dealing with a major problem in that the main portion of our site (home page and core content) which are the most important, aren't ranking in Google at all. A big part of this is because that part of the site hasn't been updated in years, whereas the forum is updated daily. By users. We've begun putting out quality content in our News Center lately, and hoping to start boosting its presence in Google.  We have an article on the exact same topic that the forum thread covers.  I was thinking of putting a canonical on that thread, pointing to the article, and hopefully pointing some very powerful link juice, popularity, and traffic into our news center articles.   People can comment there as well if they like. Are there any potential downsides to doing this?  My hope is that the forum thread loses rankings and the article takes on its rankings. Thank you.

    | HLTalk
    1

  • Is there a way to add versioning to an XML sitemap? Something like <version>x.x</version> outside of the <urlset>? I've looked at a bunch of sitemaps for various sites and don't see anyone adding versioning information, but it seems like it would be a common issue - I can't believe someone hasn't come up with some way to do it.

    | ATT_SEO
    0
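The sitemap protocol defines no version element, and consumers validate against the schema, so an unknown element outside (or inside) urlset risks rejection. An XML comment, however, passes validation and survives in the file, which is probably why versioning is rarely visible. A sketch of that workaround (the version string and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap version: 2.3, generated 2016-10-01 -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
</urlset>
```

Search engines ignore the comment entirely; it exists only for your own tooling and diffing.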

  • I have followed some of the discussions about Yoast and the news plug-in, but have not found specific information about the use of meta properties. One of our competitors is successfully using about 15 meta properties to gain news ranking. They list the publisher as Facebook. Is this coding part of the Yoast package or hard coding? As an example:

    | jgodwin
    0

  • My website has a login that has HTTPS pages. If visitors don't log in they are given an HTTP page that is similar, but slightly different. Should I use a rel canonical for these similar pages, and how should that be set up? HTTP to HTTPS version or the other way around? Thank you, Joey

    | JoeyGedgaud
    1
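Assuming the HTTPS version is the one that should rank (the usual choice), the common setup is the same canonical tag on both versions, pointing at HTTPS. A sketch with placeholder URLs:

```html
<!-- Placed in the <head> of BOTH the http and https versions: -->
<link rel="canonical" href="https://www.example.com/page" />
```

If the logged-in HTTPS page is genuinely different content rather than a variant, a canonical may not be appropriate at all; canonicals are for near-duplicates.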

  • My questions are in regards to Google being able to index PDFs and images that are viewed from a document reader. The website offers visitors that log in a variety of documents, including PDFs and images. Will Google be able to find and index these documents so they can show up in search results? Will the PDFs and the images show up in search results? How does the document reader affect the SEO of the website? How does the login affect the SEO of the website? Thank you, Joey

    | JoeyGedgaud
    0

  • I've got a client that only shows pricing if a user is logged in - they're B2B and only sell at a wholesale level. The site is massive, has been around for about a decade, and has had an active SEO campaign for years. They've been losing ground on top ranked keywords, primarily in the 1-2 spots; the rest of the first page remains strong and actually improves regularly. My hunch is that Google recognizes the inability for anyone to make a purchase on the site. As a result, they're realizing that the searcher intent doesn't match the actions that can be taken on the site and are bumping them down. Has anyone seen a similar situation, or have any evidence to suggest my hunch is correct?

    | LoganRay
    0

  • Question. If you use Google's "Demote this sitelink" option, will another sitelink appear to replace it? There is a sitelink I feel doesn't belong, and I'm hoping another one that is beneficial appears. I understand Google has control of what appears. https://support.google.com/webmasters/answer/47334 Thanks.

    | Kdruckenbrod
    0

  • Just read this: "The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. A Sitemap file located at http://example.com/catalog/sitemap.xml can include any URLs starting with http://example.com/catalog/ but can not include URLs starting with http://example.com/images/." here: http://www.sitemaps.org/protocol.html#location Yet surely it's better to put the sitemaps at the root so you have:
    (a) http://example.com/sitemap.xml
    http://example.com/sitemap-chocolatecakes.xml
    http://example.com/sitemap-spongecakes.xml
    and so on... OR this kind of approach -
    (b) http://example.com/sitemap.xml
    http://example.com/sitemap/chocolatecakes.xml and
    http://example.com/sitemap/spongecakes.xml I would tend towards (a) rather than (b) - which is the best option? Also, can I keep the structure the same for sitemaps that are subcategories of other sitemaps - for example, for a subcategory of http://example.com/sitemap-chocolatecakes.xml I might create http://example.com/sitemap-chocolatecakes-cherryicing.xml - or should I add a subfolder to turn it into http://example.com/sitemap-chocolatecakes/cherryicing.xml? Look forward to reading your comments - Luke

    | McTaggart
    0
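Since everything in option (a) sits at the root, each child sitemap can reference any URL on the domain and the path-scoping rule quoted above never applies; a sitemap index file at the root then ties the children together. A small Python sketch of generating that index (the base URL and file names are the examples from the question, not real files):

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_index(base_url, names):
    """Build a sitemap index whose child sitemaps live at the site
    root, e.g. http://example.com/sitemap-chocolatecakes.xml (option a,
    hyphenated names rather than subfolders)."""
    root = ET.Element("{%s}sitemapindex" % SITEMAP_NS)
    for name in names:
        entry = ET.SubElement(root, "{%s}sitemap" % SITEMAP_NS)
        loc = ET.SubElement(entry, "{%s}loc" % SITEMAP_NS)
        loc.text = "%s/sitemap-%s.xml" % (base_url.rstrip("/"), name)
    return ET.tostring(root, encoding="unicode")

print(sitemap_index("http://example.com",
                    ["chocolatecakes", "chocolatecakes-cherryicing"]))
```

Hyphenated names keep subcategory sitemaps at the root too, so they inherit the same unrestricted scope; the subfolder variant (b) would restrict each child to URLs under its own folder.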

  • My website (a large professional one) uses an interesting menu system. When a user hovers over text (which is not clickable), a larger sub-menu appears on the screen; when they hover over something else, this sub-menu changes or disappears. This menu is driven by a hash (#), which makes me wonder: is this giving my sub-pages an SEO kick? Or is there another way that we should be doing this in order to get that SEO kick?

    | adamorn
    0

  • I'm using the Google Structured Data Testing Tool to test: https://search.google.com/structured-data/testing-tool NY Times and Women's Health being two good examples. These two reputable publishers don't seem to have the microdata they've implemented recognized. Are they doing something wrong or is there a problem with the tool?

    | Edward_Sturm
    1

  • I am looking at a website and have noticed that there are lots of photos living on different domain - so I imagine they're coming through from another website - e.g. the domain I'm looking at is www.chocolatecakeszoopla.com - the images on that domain name feature the third-party website's url - e.g.: www.chocolatecakestockimages.com/chocolatecakeicing.jpg - is this anything to worry about? I was imagining the pics would feature the same URL as the rest of the website - that would be more logical? Would it be better practice to amend image names to feature the URL of the site they appear on, or doesn't this really matter? Thanks, Luke

    | McTaggart
    0

  • Does it do any good to use hreflang on links without rel="alternate"? We have on each page a possibility to go to another language, but it points to the language's root page and not an alternate version of that specific article.

    | Preen
    0
