Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
We 410'ed URLs to decrease URLs submitted and increase crawl rate, but dynamically generated sub URLs from pagination are showing as 404s. Should we 410 these sub URLs?
Hi everyone! We recently 410'ed some URLs to decrease the URLs submitted and hopefully increase our crawl rate. We had some dynamically generated sub-URLs for pagination that are now showing as 404s in Google. These sub-URLs were canonicalized to the main URLs and not included in our sitemap. Ex: We assumed that if we 410'ed example.com/url, then the dynamically generated example.com/url/page1 would also 410, but instead it 404'ed. Does it make sense to go through and 410 these dynamically generated sub-URLs, or is it not worth it? Thanks in advance for your help! Jeff
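For what it's worth, a 410 has to be configured for each URL pattern the server can receive; the parent URL being gone doesn't cascade to its children. A minimal sketch of how the pagination sub-URLs could be covered, assuming an Apache setup (the /url and /page1 paths are just the placeholders from the question):

```apache
# Return 410 Gone for the removed page and any of its paginated sub-URLs
RedirectMatch gone ^/url(/page[0-9]+)?/?$
```

Whether it's worth doing depends on how often those 404s are being recrawled; both a 404 and a 410 drop out of the index eventually, the 410 just tends to be treated as more definitive.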
| jeffchen0 -
Move Pages From One Domain To Another - The SEO Friendly Way
Hi All, One of our clients is a hair salon that's currently dividing into two separate entities. For over 10 years the hair salon has served both men and women, but that's now changing. The company is splitting into two; the original website contains pages for both men and women, but will soon only contain pages for women's hairdressing. The problem I have here is that there are probably around 20-30 service pages that get really great, targeted traffic on the men's side. There's a brand new domain for the men's hairdressing company and I'd like to know how you'd go about retaining the SEO value instead of just culling the pages. I'm thinking that we should maybe take the content from the original website, re-write it slightly to match the new brand, add it to the new website and then 301 the pages on the original website across to the new website. Has anyone had any experience in doing something like this before? And will the SEO value move across to the new domain? Also, I'm scared that the internal pages of the new domain may hold more authority than the home page and could cause problems. Any ideas on this would be great.
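If the men's pages do move, the usual approach is exactly the page-to-page mapping described above rather than a blanket redirect to the new homepage; a hedged sketch of what that might look like in an Apache .htaccess on the original domain (URLs are placeholders, not the client's real paths):

```apache
# Map each men's service page to its equivalent on the new domain
Redirect 301 /mens/haircuts https://www.newmensdomain.example/haircuts
Redirect 301 /mens/beard-trims https://www.newmensdomain.example/beard-trims
```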
| Townpages0 -
Web Site Ranking
Hi Folks, I made some changes on my website www.gemslearninginstitute.com and published it two days ago. It was ranking on Google's first page for a few keywords, and I did not touch the pages that were ranking on the first page. Since then I am not seeing the website ranking on Google at all. Does it take a few days to rank again? How can I ensure that next time I update the website or publish a blog post, it does not affect the rankings? Secondly, if I would like to rank in three different cities, do I need to create separate pages for each city, or how should I proceed? Thanks
| fslpso0 -
Why is my Bing traffic dropping?
In the middle of September we launched a redesigned version of our site. The URLs all stayed the same. Since the site launch, traffic from Google has steadily increased, but Bing traffic has dropped by about 50%. Any ideas on what I should look at?
| EcommerceSite0 -
Help with Schema & what's considered "Spammy structured markup"
Hello all! I was wondering if someone with a good understanding of schema markup could please answer my question about the correct use, so I can correct a penalty I just received. My website is using the following schema markup for our reviews, and today I received this message in my Search Console. UGH... Manual Actions: This site may not perform as well in Google results because it appears to be in violation of Google's Webmaster Guidelines. Site-wide matches: some manual actions apply to the entire site.
| reversedotmortgage
Reason: Spammy structured markup. "Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google's Rich Snippet Quality guidelines."
I have used the Webmaster rich snippets testing tool, but everything checks out. The only thing I can think of is my schema tag for "Product" rather than using a company-type tag (https://schema.org/Corporation). We are a mortgage company, so we sell a product (it's called a mortgage), so I assumed Product would be appropriate. Could that even be the issue? I checked another site that uses a similar markup and they don't seem to have any problems in the SERPs: http://www.fha.com/fha_reverse shows stars and they call their reviews "store". Or could it be that I added my reviews in my footer so that each of my pages would have a chance at displaying my stars? All our reviews are independently verified and we just want to showcase them. I greatly appreciate the feedback and had no intentions of abusing the markup.
From my site, the footer review snippet contains a nofollow link to https://www.ekomi-us.com/review-reverse.mortgage.html, a rating-bar image (imgs/rating-bar5.png), and the text: All Reverse Mortgage, 4.9 out of 5, 301 Verified Customer Reviews from eKomi.1 -
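For anyone weighing the same question, the commonly cited trigger for this action is review markup repeated sitewide (for example in a footer) or attached to a type that doesn't match the page content. One hedged sketch of the alternative, with the aggregate rating scoped to the business itself and living only on a dedicated reviews page, using the figures quoted above (FinancialService is one plausible type choice, not a confirmed fix):

```html
<!-- Sketch only: aggregate rating scoped to the organisation, placed on a
     single reviews page rather than in a sitewide footer -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FinancialService",
  "name": "All Reverse Mortgage",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.9",
    "bestRating": "5",
    "reviewCount": "301"
  }
}
</script>
```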
Google SERPs displaying Tracking Tags
Hello, I'm hoping someone can help me! Can you tell me why Google would be displaying the tracking URLs in the SERPs (screenshot: http://i.imgur.com/gbskD26.jpg)? I'm thinking it may have to do with the canonical URLs, but I'm not sure. Thanks in advance!
| Mindstream_Media0 -
How do Quick View windows affect SEO?
I have the following website that I am building (http://www.jetabout.ca/cruises/). All the items listed link to quick view pop-ups. I was wondering how this affects SEO and whether Google will be able to pick up on this?
| cholidays0 -
Page Authority
Hi We have a large number of pages, all sitting within various categories. I am struggling to rank a level-3 page, for example, or increase the authority of such a page. Apart from putting it in the main menu or trying to build quality links to it, are there any other methods I can try? We have so many pages that I find it hard to work out the best way to internally link these pages for authority. At the moment they're classified in their relevant categories, but these go from level 1 down to level 4 - is this too many classification levels?
| BeckyKey1 -
Is it possible to predict the future DA of a site?
Is there a method to predict the future DA of a site if I know the DA and PA of x sites that will be linking to it in the future? All inbound links will be pointing to the home page.
| richdan0 -
How to create AMP pages for a product website?
How do you create AMP pages for a product website? I mean, we can create them easily when we have WordPress, through a plugin, but what about when we have millions of pages? It would be too tedious to create an AMP version of every page. So, is there an alternative way to create AMP versions?
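As a rough sketch of what any non-plugin approach has to produce: each regular page and its AMP counterpart are tied together with a pair of link tags, so at this scale the practical route is usually an AMP template generated by the same system that renders the normal pages. The URLs below are placeholders:

```html
<!-- In the <head> of the regular page -->
<link rel="amphtml" href="https://www.example.com/amp/product-123">

<!-- In the <head> of the AMP version of that page -->
<link rel="canonical" href="https://www.example.com/product-123">
```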
| sachin.kaushik0 -
D.A. and Link Juice from certain websites
Hello, I'm a bit confused about some link building. What happens with backlinks from blogs hosted on powerful domains such as abc.yahoo.com, abc.over-blog.com or abc.blogspot.com? Meaning, everyone who creates a blog there will have a PA of 0/100 but a DA of 80 or 90/100.
| prozis
Will Google consider DA in these cases? I'm confused because it can't be that simple that someone creates a website, and after some months they will have maybe 10 PA but still the 90/100 DA, and their links can be a powerful backlink. Can you explain how Google sees that? So, if I have a link coming from a blog on those domains, will it be better than any other link with the same PA but lower DA?0 -
Google Flux in Rankings Or Something More Serious
Hi all, Two weeks ago I noticed that one of our pages, which normally ranks in the top 5 of search results, dropped out of the top 50 results. I checked to make sure there were no Google penalties and checked to make sure the page was crawlable. Everything seemed fine, and after a few hours our page went back into the number one position. I assumed it was Google flux. This number one ranking lasted about a week; today I see my page has dropped out of the top 50 yet again and hasn't come back up. Again, there are no penalties and there don't seem to be issues with the page. I'm hoping it comes back up to the top by tomorrow. What could be causing such a big dip multiple times?
| znotes0 -
Community Discussion - What old-school SEO tactics no longer work? Which ones still do?
Hi there, friends! This week's discussion comes from today's Whiteboard Friday: Rand's outlined SEO practices that are outdated and no longer effective. Did anything catch you off-guard, making you want to pivot your strategy? Anything that you disagree with, or that you feel still works well regardless? What other tactics, in your experience, no longer work?
| MattRoney4 -
SEO Menu Question
I have a question regarding the SEO benefits of different types of menus. Recently, I have noticed an increasing number of websites with the sort of menu used at www.sportsdirect.com, where there is only one main dropdown and then everything is a sub-menu of the sub-menus, if that makes sense. Is this approach more, less, or equally beneficial compared to what you see at http://www.wiggle.co.uk/, where there are multiple initial dropdown menus? Appreciate the feedback.
| simonukss0 -
Disallow duplicate URLs?
Hi community, thanks for answering my question. I have a problem with a website. My website is: http://example.examples.com/brand/brand1 (good URL), but I have 2 filters to show something and these generate 2 more URLs: http://example.examples.com/brand/brand1?show=true (if we apply one filter) http://example.examples.com/brand/brand1?show=false (if we apply the other filter) My question is, should I disallow these filters in robots.txt like this: Disallow: /*?show=*
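A minimal sketch of that robots.txt rule is below (assuming the filter only ever appears as a show query parameter). Note that a disallow only stops crawling and won't consolidate signals, so many sites instead leave the filtered URLs crawlable and give them a rel=canonical pointing at the unfiltered URL.

```
User-agent: *
# Block crawling of any URL containing the show= filter parameter
Disallow: /*?show=
```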
| thekiller990 -
Best Format to Index a Large Data Set
Hello Moz, I've been working on a piece of content that has 2 large data sets I have organized into a table, which I would like indexed, and I want to know the best way to code the data for search engines while still providing a good visual experience for users. I actually created the piece 3 times and am deciding which format to go with, and I would love your professional opinions. 1. HTML5 - all the data is coded using standard tags and is contained on the page. This is the most straightforward method and I know this will get indexed; however, it is also the ugliest-looking table and the least functional. 2. Java - I used Google Charts and loaded all the data into a
| jwalker880 -
How and When Should I Use Canonical URL Tags?
Pretty new to the SEO universe, but I have not used any canonical tags, just because there isn't a definitive source explaining exactly when and why you should use them. Am I the only one who feels this way?
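The short version is that a canonical tag tells search engines which URL should stand in for a set of duplicates or near-duplicates (parameter variations, print versions, session IDs and the like). A minimal sketch with placeholder URLs:

```html
<!-- In the <head> of https://www.example.com/shoes?sort=price,
     pointing at the preferred version of the page -->
<link rel="canonical" href="https://www.example.com/shoes">
```

A self-referencing canonical on the preferred URL itself is also common practice.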
| greenrushdaily0 -
Tens of thousands of links to less than 10 pages from 1 domain
I was looking at 'Links to your site' in Google Search Console and noticed our website has around 60 thousand followed links to 8 pages from a single domain. This is happening because we run an ad on the referring domain (a blog) and the ad is in the sidebar along with other ecommerce stores in the same niche. As a result, our ad, and most of our competitors' ads, have links showing up across this blog's entire site. Are this many links, and this type of link, a problem for SEO? I'm wondering if it would be wise to discontinue this advertising. While we get a very modest amount of traffic as a result of the ad, it doesn't convert very well, and I'm wondering if there might be any SEO benefit to not having all of these inbound links from a single domain coming in. Thanks! J
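One detail worth checking before pulling the ad: paid placements are generally expected to be marked so they pass no link equity, in which case the sheer volume of links matters much less. A sketch of what the sidebar ad markup would ideally look like (hypothetical URL):

```html
<!-- Paid sidebar placement marked nofollow so it passes no equity -->
<a href="https://www.example-store.com/" rel="nofollow">Example Store</a>
```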
| vcj0 -
Need a layman's definition/analogy of the difference between schema and structured data
I'm currently writing a blog post about schema. However, I want to set the record straight that schema is not exactly the same as structured data, although the two are often used interchangeably. I understand that schema.org is a vocabulary of global identifiers for properties and things. Structured data is what Google officially describes as "a standard way to annotate your content so machines can understand it..." Does anybody know of a good analogy to compare the two? Thanks!
| RosemaryB0 -
Deleting Outdated News Pages??
Hi everyone, I'm currently doing a full content audit for my company, in preparation for a website redesign. I've discovered thousands of pages (dating all the way back to 2009) with thin, outdated, and irrelevant content, i.e. real estate news and predictions that are now super old news. According to analytics, these older pages aren't receiving any traffic, so I think the best course of action is to delete these pages and let them return 404s. In my opinion, this should be a big priority, because these pages are likely already hurting our domain authority to some extent and it's just a matter of time before we're really penalized by Google. Some members of my team have a different opinion -- they worry that deleting 1000 pages could hurt our rankings, and they want to wait and discuss the issue further in 3Q or 4Q (once the site redesign is completed and we have time to focus on it). Am I wrong to think that waiting is a very bad idea? Google will notice that we've done a major site redesign -- we've written all new copy, optimized the UX & content organization to make info easier to find, created new lead magnets, optimized images, etc. -- but we didn't bother to update 1000 pages of outdated content that no one is looking at... won't that look bad? Do you agree that we should delete/merge all outdated content now, rather than waiting until after the site redesign? Or am I overreacting? Thanks so much for your help!
| JCon7110 -
Magento: Should we disable old URLs or delete the pages altogether?
Our developer tells us that we have a lot of 404 pages being included in our sitemap, and the reason for this is that we have put 301 redirects on the old pages to new pages. We're using Magento, and our current process is to simply disable the old page, which then makes it a 404. We then redirect this page using a 301 redirect to a new, relevant page. The reason for redirecting these pages is that the old pages are still being indexed in Google. I understand 404 pages will eventually drop out of Google's index, but I was wondering if we were somehow preventing them from dropping out of the index by redirecting the URLs, causing the 404 pages to be added to the sitemap. My questions are: 1. Could we simply delete the entire unwanted page, so that it returns a 404 and drops out of Google's index altogether? 2. Because the 404 pages are in the sitemap, does this mean they will continue to be indexed by Google?
| andyheath0 -
Best Practices for Converting PDFs to HTML
We're working with a client who gets about 80% of their organic, inbound search traffic from links to PDF files on their site. Obviously, this isn't ideal, because someone who just downloads a PDF file directly from a Google query is unlikely to interact with the site in any other way. I'm looking to develop a plan to convert those PDF files to HTML content, and try to get at least some of those visitors to convert into subscribers. What's the best way to go about this? My plan so far is: Develop HTML landing pages for each of the popular PDFs, with the content from the PDF, as well as the option to download the PDF with an email signup. Gradually implement 301 redirects for the existing PDFs, and see what that does to our inbound SEO traffic. I don't want to create a dip in traffic, although our current "direct to inbound" traffic is largely useless. Are there things I should watch out for? Will I get penalized by Google for redirecting a PDF to HTML content? Other things I should be aware of?
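Redirecting a PDF to an HTML page isn't penalised as long as the destination covers the same content. A hedged sketch of the per-file rule on an Apache server (paths are placeholders):

```apache
# Send the indexed PDF to its new HTML landing page
Redirect 301 /guides/annual-report.pdf https://www.example.com/guides/annual-report/
```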
| atourgates0 -
Duplicate page content errors for Web App Login
Hi There, I have 6 duplicate content errors, but they are for the web app login from our website. I have put a noindex on the sitemap to stop Google from indexing them, to see if that would work, but it didn't. These links, as far as I can see, are not even on the website www.skemaz.net, but are links beyond the website and on the web app itself, e.g.:
| Skemazer
| http://login.skemaz.net |
| http://login.skemaz.net/LogIn?ReturnUrl=%2Fchangepassword |
| http://login.skemaz.net/Login |
| http://login.skemaz.net/LogIn?ReturnUrl=%2FHome | Any suggestions would be greatly appreciated. Kind regards Sarah0 -
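One approach that fits this situation (a sketch, assuming the login subdomain is served separately, runs Apache with mod_headers, and you can set response headers there) is a sitewide noindex header on login.skemaz.net, since you usually can't add a robots meta tag to app screens page by page:

```apache
# Served only by the login.skemaz.net virtual host, not the main website
Header set X-Robots-Tag "noindex, nofollow"
```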
Will disallowing URLs in the robots.txt file stop those URLs from being indexed by Google?
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs that these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed in Google. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove the pages from the index? We don't want these pages to be found.
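For reference, the proposed disallow on its own looks like the sketch below, but it only blocks crawling; URLs that are already indexed can linger in the index (just without a snippet). Removing them generally means leaving them crawlable for a while with a noindex robots meta tag or X-Robots-Tag header, and only adding the disallow once they have dropped out.

```
User-agent: *
# Blocks crawling of the gallery module URLs, but does not by itself
# remove pages that Google has already indexed
Disallow: /catalog/product/gallery/
```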
| andyheath0 -
Will we be penalised for duplicate content on a sub-domain?
Hi there, I run a WordPress blog and I use [community platform] Discourse for commenting. When we publish a post to Wordpress, a duplicate of that post is pushed to a topic on Discourse, which is on a sub-domain. Eg: The original post and the duplicated post. Will we be penalised for duplicating our own content on a subdomain? If so, other than using an excerpt, what are our options? Thanks!
| ILOVETHEHAWK0 -
Reviews not pulling through to Google My Business page
OK, a local SEO question! We are working with a plumbing company. A search for the company name (Google UK) shows the knowledge panel with 20+ reviews. This is good! However, if you search for "plumbers norwich" and look at the map, the company is on the third page and has no reviews. I've logged into Google My Business, and it says the profile is not up to date and only 70% complete, with no reviews. This is odd, as there was a fully complete profile recently. Any ideas on how best to reconcile the two? Thanks!
| Ad-Rank1 -
Desktop vs. Mobile Results
When searching on www.google.ca for "wedding invitations" in my own geo-location market of Toronto, my site, www.stephita.com, shows up differently in the SERPs on desktop (Chrome & IE) vs. mobile (iPad, iPhone, Android, etc.). In the desktop SERPs, I show up in position 6/7 (which is a relatively new position from the past 3 weeks - I was previously on page 2). (After a bunch of SEO fixes, I've managed to propel my site back to page 1!) In the mobile SERPs, I only show up in position 1/2 on PAGE 2 😞 As I mentioned above, I did a bunch of SEO fixes that I think were related to the Panda/Penguin algos. So I'm wondering why my MOBILE SERP position has NOT improved along the way? What should I be looking at to fix this 5-6 position differential? Thanks all!
| TysonWong0 -
Cleaning up user generated nofollow broken links in content.
We have a question/answer section on our website, so it's user generated content. We've programmed all user generated links to be nofollow. Over time... we now have many broken links and some are even structurally invalid. Ex. 'http:///.'. I'm wanting to go in and clean up the links to improve user experience, but how do I justify it from an SEO standpoint and is it worth it?
| mysitesrock0 -
Can new domain extensions rank?
Hi Does anybody know if it's possible to get domains with extensions like .party or .world to rank? Even for highly competitive keywords? Can they rank above .com?
| MikeWU0 -
How long should it take for indexed pages to update
Google has crawled and indexed my new site, but my old URLs appear in the search results. Is there a typical amount of time that it takes for Google to update the URLs displayed in search results?
| brianvest0 -
Duplicate content on URL trailing slash
Hello, Some time ago, we accidentally made changes to our site which modified the way URLs in links are generated. All at once, trailing slashes were added to many URLs (only in links). Links that used to point to
| yacpro13
example.com/webpage.html were now linking to
example.com/webpage.html/ URLs in the XML sitemap remained unchanged (no trailing slash). We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function so that now, all links on the site point to a URL without a trailing slash. However, Google had time to index these pages. Is implementing 301 redirects required in this case?1 -
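Since Google has already indexed some of the slashed variants, a 301 is generally the cleanest way to consolidate them on top of the corrected link function. A sketch for an Apache .htaccess (assuming the original, non-slashed .html URLs are the canonical ones):

```apache
RewriteEngine On
# 301 any .html URL that gained a trailing slash back to the version without it
RewriteRule ^(.*\.html)/$ /$1 [R=301,L]
```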
What is best practice for "sorting" URLs to prevent indexing and preserve link juice?
We are now introducing 5 links in all our category pages for different sorting options of category listings.
| lcourse
The site has about 100,000 pages, and with this change the number of URLs may go up to over 350,000 pages.
Until now Google has been indexing our site well, but I would like to prevent the "sorting URLs" from leading to less complete crawling of our core pages, especially since we are planning further huge expansion of pages soon. Apart from blocking the parameter in Search Console (which did not really work well for me in the past to prevent indexing), what do you suggest to minimize indexing of these URLs, also taking link juice optimization into consideration? On a technical level the sorting is implemented in a way that the whole page is reloaded, for which there may be better options as well.0 -
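One widely used pattern for sort parameters (a sketch with placeholder URLs) is to let the sorted variants render but point their canonical at the unsorted category page, so crawling and equity consolidate there. Some sites use a noindex,follow robots meta tag on the variants instead, though mixing both signals on the same URL is generally discouraged.

```html
<!-- In the <head> of /category/shoes?sort=price_asc (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/category/shoes">
```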
Soft 404 error for a big, longstanding 301-redirected page
Hi everyone, Years ago, we acquired a website that had essentially 2 prominent homepages - one was like example.com and the other like example.com/htm... They served the same purpose basically, and were both very powerful, like PR7, and often had double listings for important search phrases in Google. Both pages had amassed considerable powerful links to them. About 4 years ago, we decided to 301 redirect the example.com/htm page to our homepage to clean up the user experience on our site and also, we hoped, to make one even stronger page in the SERPs, rather than two less strong pages. Suddenly, in the past couple of weeks, this example.com/htm 301-ed page started appearing in our Google Search Console as a soft 404 error. We've never had a soft 404 error before now. I tried marking this as resolved, to see if the error would return or if it was just some kind of temporary blip. The error did return. So my questions are:
| Eric_R
1. Why would this be happening after all this time?
2. Is this soft 404 error a signal from Google that we are no longer getting any benefit from link juice funneled to our existing homepage through the example.com/htm 301 redirect? The example.com/htm page still has considerable (albeit old) links pointing to it across the web. We're trying to make sense of this soft 404 observation and any insight would be greatly appreciated. Thanks!
Eric0 -
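A quick first check (the path below is the anonymised one from the question) is to confirm what that URL actually returns today, since a soft 404 usually means Googlebot is getting a 200 that looks empty or error-like to it, or a redirect it treats that way, rather than a clean single-hop 301:

```
curl -I https://example.com/htm
# Expect one 301 response with a Location header pointing at the homepage;
# a 200, a redirect chain, or a meta/JavaScript redirect would be consistent
# with the soft 404 classification.
```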
Sitemap with homepage URL repeated several times - is it a problem?
Hello Mozzers, I am looking at a website with the homepage repeated several times (4 times) in the sitemap (the sitemap is autogenerated via a plugin) - do you think this is an SEO problem - might it damage SEO performance, or can I ignore this issue? I am thinking I can ignore it, yet it's an odd "issue", so your advice would be welcome! Thanks, Luke
| McTaggart0 -
AJAX requests and implications for SEO
Hi, I've got a question about webpages being served via AJAX requests, as I couldn't find a definitive answer to an issue we currently face: When visitors on our site select a facet on a listing page, the site doesn't fully reload. As a consequence, only certain tags in the content (H1, description, etc.) are updated, while other tags like the canonical URL, the meta noindex/nofollow tag, or the title tag are not updated as long as you don't refresh the page. We have no information about how this will be crawled and indexed yet, but I was wondering if anyone of you knows how this will impact SEO?
| FashionLux0 -
Duplicate page content on numerical blog pages?
Hello everyone, I'm still relatively new at SEO and am still trying my best to learn. However, I have this persistent issue. My site is on WordPress and all of my blog pagination pages, e.g. page one, page two, etc., are coming up as duplicate content. Here are some URL examples of what I mean: http://3mil.co.uk/insights-web-design-blog/page/3/ http://3mil.co.uk/insights-web-design-blog/page/4/ Does anyone have any ideas? I have already noindexed categories and tags, so it is not them. Any help would be appreciated. Thanks.
| 3mil0 -
Google Search Console
abc.com www.com http://abc.com http://www.abc.com https://abc.com https://www.abc.com
| brianvest0 -
Handling of product variations and colours in ecommerce
Hi, our site prams.net has 72,000 crawled and only 2,500 indexed URLs according to DeepCrawl, mainly due to colour variations (each colour has its own URL at the moment). We have now created one page per product, e.g. http://www.prams.net/easywalker-mini, and noindexed all the other ones, which had a positive effect on our SEO. http://www.prams.net/catalogsearch/result/?q=002.030.059.0 It might still hurt our crawl budget a lot that we have so many noindexed pages. The idea is now to 301 redirect all the colour pages to the main product page and make them invisible, so Google does not have to crawl them anymore; we included the variations on the product pages, so they should still be findable for Google and the user. Does this make sense, or is there a better solution out there? Does anyone have an idea of whether this will likely have a big or a small impact? Thanks in advance. Dieter
| Storesco0 -
Local SEO - two businesses at same address - best course of action?
Hi Mozzers - I'm working with 2 businesses at the moment, at the same address - the only difference between the two is the phone number. I could ask to split the business addresses apart, so that the NAP (name, address, phone number) is different for each business (only the postcode will be the same). Or I could simply carry on as things are, with the Ns and Ps different, yet with the As the same - the same address for both businesses. I've never experienced this issue before, so I'd value your input. Many thanks, Luke
| McTaggart0 -
Pointing hreflang to a different domain
Hi all, Let's say I have two websites: www.mywebsite.com and www.mywebsite.de - they share a lot of content but the main categories and URLs are almost always different. Am I right in saying I can't just set the hreflang tag on every page of www.mywebsite.com to read: <link rel='alternate' hreflang='de' href='http://mywebsite.de' /> That just won't do anything, right? Am I also right in saying that the only way to use hreflang properly across two domains is to have a custom hreflang tag on every page that has identical content translated into German? So for this page: www.mywebsite.com/page.html my hreflang tag for the German users would be: <link rel='alternate' hreflang='de' href='http://mywebsite.de/page.html' /> Thanks for your time.
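For what it's worth, a minimal sketch of the page-level, bidirectional annotation hreflang expects; each page references itself and its counterpart, using the example URLs from the question (and assuming the .com content is English):

```html
<!-- On http://www.mywebsite.com/page.html -->
<link rel="alternate" hreflang="en" href="http://www.mywebsite.com/page.html" />
<link rel="alternate" hreflang="de" href="http://mywebsite.de/page.html" />

<!-- On http://mywebsite.de/page.html -->
<link rel="alternate" hreflang="de" href="http://mywebsite.de/page.html" />
<link rel="alternate" hreflang="en" href="http://www.mywebsite.com/page.html" />
```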
| Bee1590 -
Redirecting to Modal URLs
Hi everyone! Long time no chat - hope you're all well! I have a question that for some reason is causing me some trouble. I have a client that is creating a new website; the process was a mess and I am doing a last-minute redirect file for them (long story, for another time). They have different teams for different business categories, so there are multiple staff pages with a list of staffers, and a link to their individual pages. Currently they have a structure like this for their staff bios... www.example.com/category-staff/bob-johnson/ But now, to access a staffer's bio, a modal pops up. For instance... www.example.com/category-staff/#bob-johnson Should I redirect the current staffer URLs to the staff category, or to the modal URL? Unfortunately, we are late in the game and this is the way the bio pages are set up. Would love thoughts, thanks so much guys!!
| PatrickDelehanty0 -
Optimize homepage for brand-name or local keyword query for franchisees?
We are launching quasi "microsites" for our franchisees within our main company website. My question is whether to optimize the homepage for the franchisee's business name OR a local keyword query. I know standard practice is to optimize the homepage for your company's brand name, but in this case, brand awareness isn't strong for these new local businesses. Also, apart from the homepage and service page, there is not much room to create optimized content; other pages are simply reviews, projects and a contact-us page. Thanks!!
| kimberleymeloserpa0 -
Sitelinks Not Appearing
We have been ranking number 1 for all brand-related terms, but sitelinks still won't appear. We have submitted an XML sitemap. We have a front-facing sitemap in the footer. We have a clear hierarchy of pages on the site. We have a strong link profile. We have a few hundred visits a month from organic search. We are running Google Shopping ads. Is there anything else that I should be doing?
| the-gate-films0 -
Preserving link equity from old pages
Hi Moz Community, We have a lot of old pages built with Dreamweaver a long time ago (2003-2010) which sit outside our current content management system. As you'd expect, they are causing a lot of trouble with SEO (non-responsive, duplicate titles and various other issues). However, some of these older pages have very good backlinks. We were wondering what the best way is to get rid of the old pages without losing link equity. In an ideal world we would want to bring all these old pages over to our CMS, but this isn't possible due to the number of pages (~20,000 pages) and the cost involved. One option is obviously to bulk 301 redirect all these old pages to our homepage, but from what we understand that may not lead to the link equity being passed on optimally by Google (or may lead to none being passed at all). Another option we can think of would be to bring over the old articles with the highest-value links onto the current CMS and 301 redirect the rest to the homepage. Any advice/thoughts will be greatly appreciated. Thumbs up! Thanks,
| 3gcouk0 -
Duplicate content issue
Hello! We have a lot of duplicate content issues on our website. Most of the pages with these issues are dictionary pages (about 1,200 of them). They're not exactly duplicates, but each contains a different word with a translation, picture and audio pronunciation (example: http://anglu24.lt/zodynas/a-suitcase-lagaminas). What's the best way of solving this? We probably shouldn't disallow dictionary pages in robots.txt, right? Thanks!
| jpuzakov0 -
Our parent company has included their sitemap links in our robots.txt file - will that have an impact on the way our site is crawled?
Our parent company has included their sitemap links in our robots.txt file. All of their sitemap links are on a different domain and I'm wondering if this will have any impact on our searchability or potential rankings.
| tsmith1310 -
Implications of extending browser caching for Google?
I have been asked to leverage browser caching on a few scripts in our code. http://www.googletagmanager.com/gtm.js?id=GTM-KBQ7B5 (16 minutes 22 seconds) http://www.google.com/jsapi (1 hour) https://www.google-analytics.com/plugins/ua/linkid.js (1 hour) https://www.google-analytics.com/analytics.js (2 hours) https://www.youtube.com/iframe_api (expiration not specified) https://ssl.google-analytics.com/ga.js (2 hours) The number beside each link is the cache expiration applied by the owners. I'm being asked to extend the time to 24 hours. Part of this task is to make sure doing this is a good idea; it would not be in our best interest to do something that would disrupt the collection of data. Some of what I'm seeing recommends keeping a local copy, which would mean missing updates from GA/GTM, or calls for a cron job to download any updates on a daily basis. Another concern is: would caching these cause a delay or disruption in collecting data? That's an unknown to me; it may not be to you. There is also the concern that Google recommends not caching these outside of their settings. Any help on this is much appreciated. Do you see any issues/risks/benefits/etc. to doing this from your perspective?
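Since the cache lifetimes on Google's own hosts can't be changed from your end, extending them to 24 hours in practice means self-hosting a copy and refreshing it on a schedule, along the lines of the cron approach mentioned above. A rough sketch (paths are placeholders), with the caveat that Google generally advises against self-hosting these scripts:

```
# crontab entry: refresh the self-hosted copy of analytics.js daily at 03:15
15 3 * * * curl -sS -o /var/www/static/js/analytics.js https://www.google-analytics.com/analytics.js
```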
| chrisvogel0 -
Hide and display iframes on different devices
I have an iframe on my website. I'd like to hide it when a user is browsing with a mobile device and display a different one for that user (which will be hidden on desktop). Is it possible that Google would view this as cloaking? Does it qualify as hidden content?
| OrendaLtd0 -
Multiple Instances of the Same Article
Hi, I'm having a problem I cannot solve about duplicate article postings. As you will see from the attached images, I have a page with multiple variants of the same URL in Google's index, as well as duplicate title tags reported in Search Console (Webmaster Tools). It's been several months that I have been using canonical meta tags to resolve the issue, i.e. declaring all variants to point to a single URL, however the problem remains. It's not just old articles that stay like that; even new articles show the same behaviour right when they are published, even though they are presented correctly with canonical links and the sitemap, as you will see from the example below. Example URLs are in the attached image. All URLs belonging to the same article ID have the same canonical link inside the HTML head. Also, because I have a separate mobile site, I include in every desktop URL an "alternate" link to the mobile site. On the mobile version of the site, I have another canonical link pointing back to the original desktop URL, so the mobile article version also carries one. Now, when it comes to the XML sitemap, I pass only the canonical URL and none of the other possible variants (to avoid multiple indexing), and I also point to the mobile version of the article.
| ioannisa
<url>
  <loc>http://www.neakriti.gr/?page=newsdetail&DocID=1300357</loc>
  <xhtml:link rel="alternate" media="only screen and (max-width: 640px)" href="http://mobile.neakriti.gr/fullarticle.php?docid=1300357" />
  <lastmod>2016-02-20T21:44:05Z</lastmod>
  <priority>0.6</priority>
  <changefreq>monthly</changefreq>
  <image:image>
    <image:loc>http://www.neakriti.gr/NewsASSET/neakriti-news-image.aspx?Doc=1300297</image:loc>
    <image:title>ΟΦΗ</image:title>
  </image:image>
</url>
The above sitemap snippet comes from: http://www.neakriti.gr/WebServices/sitemap.aspx?&year=2016&month=2
The main sitemap of the website: http://www.neakriti.gr/WebServices/sitemap-index.aspx Despite my efforts, you can see that Webmaster Tools reports three variants for the desktop URL, and Google Search reports 4 URLs (3 different desktop variant URLs and the mobile URL). I get this when I type the article code to see what is indexed in Google Search: site:neakriti.gr 1300297 So far I believe I have done all I could to resolve the issue by addressing canonical links and alternate links, as well as a correct sitemap.xml entry. I don't know what else to do... This was done several months ago and there is absolutely no improvement. Here is a more recent example of an article added 5 days ago (10 April 2016): just type
site:neakriti.gr 1300357
in Google Search and you will see the variants of the same article in Google's cache. Open the Google cached page, and you will see the cached pages contain the canonical link, but Google doesn't obey the direction given there. Please help! (Attached: duplicate-articles.jpg, duplicate-articles-in-index.jpg)0
Explore more categories
-
Moz Tools
Chat with the community about the Moz tools.
-
SEO Tactics
Discuss the SEO process with fellow marketers
-
Community
Discuss industry events, jobs, and news!
-
Digital Marketing
Chat about tactics outside of SEO
-
Research & Trends
Dive into research and trends in the search industry.
-
Support
Connect on product support and feature requests.