Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Set Robots.txt file to crawl my website at specific times
Our website provider has stated that they can only 'lift' their block on our website so that it can be crawled at specific times. Is there any way to amend a robots.txt file to ensure that our website is crawled at a specific time of day/night, to coincide with the block being lifted? Many thanks, Charlene
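For reference, a minimal robots.txt sketch (example paths are placeholders). The standard only defines allow/disallow rules; there is no time-of-day directive, and the closest scheduling-related control is the non-standard Crawl-delay, which some crawlers treat as a request-frequency throttle (Google ignores it) rather than a crawl window:
User-agent: *
Allow: /
# Non-standard; sets seconds between requests for crawlers that honour it - it does not schedule when crawling happens
Crawl-delay: 10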
| CharleneKennedy120 -
Do non-critical AMP errors prevent you from being featured in the Top Stories carousel?
Consider site A, a news publishing site that has valid AMP pages with non-critical AMP errors (as notified within Search Console). Site A also republishes news articles from site B (its partner site); these articles have AMP pages too, but most of them are not valid and carry critical AMP errors. For brand terms like Economic Times, Google does show a Top Stories carousel for articles published by Economic Times; however, it doesn't look the same for site A, in spite of it having valid AMP pages. Image link: http://tinypic.com/r/219bh9j/9 Now that there are valid AMP pages from site A and invalid AMP pages from site B on site A, there have been instances where a news article from site A features in the Top Stories carousel on desktop for a certain query but doesn't feature in the mobile SERPs, in spite of the page being a valid AMP page. For example, as mentioned in the screenshot below: Business Today ranks in the Top Stories carousel for a term like "jio news" on desktop, but on mobile, although the page is a valid AMP page, it doesn't show as an AMP page within the Top Stories carousel. Image link: http://tinypic.com/r/11sc8j6/9 There have also been cases where, although a page is featured in the top carousel on desktop, the same article doesn't show up in the Top Stories carousel on mobile for the same query. What could be the reason behind this? Also, would it be necessary to solve both critical and non-critical errors on site A (including those published from site B on site A)?
| Starcom_Search1 -
Site ranks well for all keywords except the important ones
Hey guys, I am the marketing manager for https://www.tadibrothers.com/. I started a year ago, when the website ranked very poorly for most keywords. I have got most of our niche keywords ranking well on the first page, except the most important ones:
backup camera
backup camera system
wireless backup camera system
rear view camera
Our backlink campaign is doing well - 15 backlinks on average every month - and on-site optimization is also good. Our Domain Authority is better than all our competitors'. Can anyone please help me understand why these keywords do not rank well? Hope to hear from someone soon.
| TadiBrothers1 -
Googlebot being redirected but not users?
Hi, We seem to have a slightly odd issue. We noticed that a number of our location category pages were slipping off page 1 and onto page 2 in our niche. On inspection, we noticed that our Arizona page had started ranking in place of a number of other location pages - Cali, Idaho, NJ etc. Weirdly, the pages they had replaced were no longer indexed, and would remain so, despite being fetched, tweeted etc. One test was to see when the dropped-out pages had last been crawled, or at least cached. When running 'cache:domain.com/category/location' searches on these pages, we were getting 301 redirected to, you guessed it, the Arizona page. Very odd. However, the dropped-out pages were serving 200 OK when run through header checker tools, Screaming Frog etc. On the face of it, it would seem Googlebot is getting redirected when it hits a number of our key location pages, but users are not. Has anyone experienced anything like this? The theming of the pages is quite different in terms of content, meta etc. Thanks.
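One quick spot-check for a scenario like this (a hedged sketch, using the question's placeholder URL): fetch the same page's headers with a Googlebot user-agent string and with a normal browser string, and compare the responses. This won't show what Google's own IPs are served, but it can surface user-agent-based redirects:
# -s silent, -I headers only, -L follow redirects, -A sets the user agent
curl -sIL -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://domain.com/category/location
curl -sIL -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" http://domain.com/category/location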
| Sayers0 -
How do Yelp and Justia get all the extra Meta Description Real estate?
I was doing some KW research for a client and noticed something interesting with regard to Yelp and Justia. For a search on DWI Attorneys, they each had 300+ character meta descriptions showing in the SERP without truncating. Everyone else was either truncated or within the limit of roughly 160 characters. Obviously, if there is a way to get something other than a list to show that way, you can own some real estate. Would love to hear from some of you Mozzers on this. Here are two images that should assist. Best. Edit: I found one that was not a directory site, and it appears it is Google doing it. The site has no meta description for the home page, and this is what is being pulled by Google. There are 327 characters here! The truncation marks show it being pulled from different parts of the page. The image is Killeen DWI Attorney. NOTE: None of these are clients, etc. I also changed the cities so this is a general search.
| RobertFisher1 -
Will 301 Redirects Slow Page Speed?
We have a lot of subdomains that we are switching to subfolders, and we need to 301 redirect all the pages from those subdomains to the new URLs - over 1,000 redirects need to be implemented. So, will the 301 redirects slow page speed regardless of which URL the user comes through? Or, as the old URLs are dropped from Google's index and the new URLs take over in the SERPs, will those redirects then have no effect on page speed? Trying to find a clear answer to this and have yet to find a good one.
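For context, a hedged Apache (.htaccess) sketch of the kind of rule involved - hostnames and paths are placeholders, and a server-level rule or redirect map is usually preferable to a thousand individual entries:
RewriteEngine On
# 301 everything on the retired subdomain to the matching path in the new subfolder
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/blog/$1 [R=301,L]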
| MJTrevens0 -
What's wrong with the algorithm?
Is it possible that Google is penalising a specific page while at the same time showing an unrelated page in the search results? "rent luxury car florence" shows https://lurento.com/city/munich/ on the 2nd page (that's Munich, Germany) and at the same time completely ignores the related page https://lurento.com/city/florence/. How can I figure out whether the specific page has been trashed, and why? Thanks, Mike
| lurento.com0 -
What is the best way to add semantic linked data to WordPress?
As a recent Moz subscriber, I'm trying to up my game in terms of inbound marketing. One of the most pressing tasks is to add JSON-LD across all of my WordPress sites. What is the best way to do this? Should I use the technique set out here: https://moz.rankious.com/_moz/blog/using-google-tag-manager-to-dynamically-generate-schema-org-json-ld-tags Or should I use one of these plugins? https://en-gb.wordpress.org/plugins/schema/ https://en-gb.wordpress.org/plugins/wp-structuring-markup/ I want to get this right, so any guidance would be gratefully received.
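Whichever route is chosen (Tag Manager or a plugin), the rendered output is the same kind of block - a minimal, hedged JSON-LD example with placeholder values:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>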
| treb0r0 -
Migrating 1 page to https
Hi there, Although we do plan to migrate our entire site to HTTPS, we currently have only one page on our site which we want to migrate, as it has passwords and whatnot on it. Let's say it is httpS://www.example.com/login/. I am not really sure how to go about migrating just one page though. I am guessing that we need to:
- make sure that httpS://www.example.com/login/ is the only page that exists on httpS://
- replace any link to the login page on the http:// version of the site with the httpS://www.example.com/login/ version
- remove httpS://www.example.com/login/ from http://www.example.com/sitemap.xml
- create a httpS://www.example.com/sitemap.xml on httpS:// which only references the one page (httpS://www.example.com/login/)
- 301 http://www.example.com/login/ to https://www.example.com/login/
- submit both sitemaps to Google so they know what's up
- fetch http://www.example.com/login/ so that Google finds the redirect
Anything else? Not too sure about this one. Many thanks for your help.
| unirmk0 -
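For the 301 step listed in the question above, a hedged Apache (.htaccess) sketch that forces HTTPS for the /login/ page only (assuming mod_rewrite is available; other servers would need an equivalent rule):
RewriteEngine On
# Redirect only /login/ to HTTPS; leave the rest of the site on HTTP for now
RewriteCond %{HTTPS} off
RewriteRule ^login/$ https://www.example.com/login/ [R=301,L]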
404s clinging on in Search Console
What is a reasonable length of time to expect 404s to be resolved in Search Console? There was a mass of 404s built up from directory changes and filtering URLs. These have all been fixed, but of course there are some that slipped the net. How long is it reasonable to expect the old 404s that don't have any links pointing at them to take to drop out of Search Console? New 404s are still being reported over 4 months later, and 'First detected' always shows a date later than the date the 404s were fixed. Is this reasonable? I've never seen 404s be this resilient and fail to clean up like this. We manually fix these 404s and, like popcorn, more turn up. Just to add: the bulk of the 404s came into existence around a year ago and were left for around 8 months.
| MickEdwards0 -
SEO suggestions for a directory
Hi all, I am new to SEO. I work for a ratings and review website, like TripAdvisor and LinkedIn. How would one go about setting up an SEO strategy for national directories that have local suggested pages? What would be good practice? For example, TripAdvisor has many different restaurants across the UK. What would they do to improve their SEO? How would they target the correct links? How would they go about building their Moz score? Would really appreciate your thoughts and suggestions. Thanks!
| Eric_S
Eric0 -
What tool is best for knowing how many searches are done on a string in a specific country?
I need to know which tool gives me more accurate data, for example in Keyword Explorer, how accurate is the number of searches performed by search engine?
| Jorgesep0 -
How many links to the same page can there be on each page?
I need to know whether I can add more than two identical links (links pointing to the same page) on a single page - for example, one link in the header, another in the body, and one in the footer.
| Jorgesep0 -
Should I switch my website builder/host? Please help.
My website, www.joeborders.com, is hosted with a service called Jigsy: www.jigsy.com. I'm losing my mind trying to figure out if I should stay or not, lol. I am positive I have done waaayyy more work on my SEO than many people ranking above me. I used to be on the first page, but over the last year I've slowly dropped in rankings. I've checked everything! I need to do some work on my blog, but I'm really thinking now that it might have something to do with my host. Some concerns I've identified: 1) I can't give pages individual h1 tags - the same one is blanketed across the site. 2) I'm told there is a lot of .css and JavaScript. 3) I can't redirect blog posts, so Moz is tagging me with 250 critical issues because my posts are on both the www and http versions of my site. But that's all I know. I've talked with Squarespace and WordPress and they have no way of transferring my site. It would probably take me a good 30 hours to set everything up. Should I move? Please help 😞
| joebordersmft0 -
Pagination & SEO
Hi, We have automatically created brand pages based on which brand products have in their attributes. At the moment, developers have restricted the ability to properly optimise these for SEO, but I also wanted to look at how we should handle pagination. Example:
http://www.key.co.uk/en/key/brand/manutan?page=1
http://www.key.co.uk/en/key/brand/manutan?page=2
http://www.key.co.uk/en/key/brand/manutan?page=3
Should we do any of the following, which I've found in an article:
- Put nofollow on all links located on pagination pages
- Noindex these pages, as they are wasting crawl budget
- Don't show links to page 2, 3, 4, 5… 10, 11, 12… at the end of your content, but only a link to the next and previous pages, so that you won't dilute your page authority
Or does anyone else have any tips on how to handle these pages? Thank you!
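For reference, these are the kinds of tags the options above refer to, shown for the ?page=2 URL - a hedged sketch reflecting common advice at the time rather than a recommendation:
<!-- option: keep paginated pages crawlable but out of the index -->
<meta name="robots" content="noindex, follow">
<!-- option: declare the paginated series -->
<link rel="prev" href="http://www.key.co.uk/en/key/brand/manutan?page=1">
<link rel="next" href="http://www.key.co.uk/en/key/brand/manutan?page=3">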
| BeckyKey0 -
How to build Domain Authority?
My site, https://www.fishingspots.com.au/, has started to drop in Domain Authority over the past few weeks, while lower-quality sites like http://silverstories.com.au/ are rising, and I am not sure why. Is there some way I can understand why my site would suddenly start dropping authority?
| thinkLukeSEO0 -
Positions dropped and do not recover
Hi all, Site: https://www.fascinators.com.au k/w: fascinators (this is the main one I am looking at). About 2 years ago, I redesigned the site. It used to be in the first positions in Google Australia. After the redesign, positions dropped and never recovered. No matter what I do, I cannot get them back. I even engaged a professional SEO agency for the last 6 months to help build links, but nothing works. (The old site used a really old, non-mobile-friendly engine.) As far as I know, the site is technically sound with no issues. In the last few months, the position for my main keyword 'fascinators' has jumped really wildly - from position 9 to not being in Google's first 100, and the jump can happen overnight. As of today it has disappeared from the first 100 again. I am really tearing my hair out over what could be wrong. Link building is underway - this is one area I am aware of. Suspecting keyword stuffing, I removed lots of instances of 'fascinators' from URLs, but that did not seem to help much. Any ideas will be greatly appreciated! Thanks, Rudolf
| rudolfl1 -
Help! Rankings dropping after optimising
I have been using Moz a lot. In the past I have always been able to optimise a website enough to rank well in my local areas, but lately every time I optimise a website based on Moz recommendations the rankings just drop and drop... I haven't focused any efforts on backlinks, but a few sites have gone from the first page of the SERPs to the 2nd and continue to drop... I have 2 example sites: https://www.documentmanagementsoftware.com.au/ - optimised for "Document Management Software" - was initially ranked no. 7 in Google AU, now it is no. 16 after my efforts. http://www.tmphysio.com.au/ - optimised for "canberra physiotherapy" - was no. 6, now it is dropping to no. 9 after 'optimisation'. Any help or insights would be extremely helpful as I feel hopeless!
| thinkLukeSEO0 -
Http & https domain names
We currently have a site where SEMrush reported duplicate pages. Upon further inspection, we realised this was because both http:// and https:// versions of the site exist. Is it a problem for Google that the site appears on both http:// and https://, and that there are therefore duplicate versions of the site?
| Gavo0 -
How to rank an ecommerce site for search terms starting with how where why
Hi guys, I just got a new SEO job for an e-commerce store. The client is asking me to rank the site for keywords like "where to buy used phones" and "where to sell my used phone for the best rates", and so on. The question is how I can achieve that - can anyone help me with some concrete suggestions? Thanks in advance,
| mkhurramali0 -
Domain name change
Here's the scenario... The client has two domain names: domain.com - targeting one country (Australia); and otherdomain.com - targeting all other countries. Both have identical products but different currencies (AU$ and US$). The problem (as most of you will know) is that without using subdomains or country-code top-level domains, Google has no idea which domain should be served for which country. Furthermore, because the root domains are different, Google doesn't see any connection between the two - other than the fact they have identical products! My recommendation to the client is to change domain.com to domain.com.au, and otherdomain.com to domain.com. Arguably, we could leave the second one alone, but I think it's better for the brand to use the same root domain for each. Obviously this means both will need to be redirected. Since NONE of the pages within the sites will change, do we need to redirect every page, or just the root domain? Any other risks or concerns we should know about?
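If the domains are restructured as proposed, a hedged hreflang sketch for one equivalent page pair might look like this (the /example-page/ path is a placeholder; the annotations go on both versions of every page):
<link rel="alternate" hreflang="en-au" href="https://domain.com.au/example-page/" />
<link rel="alternate" hreflang="en" href="https://domain.com/example-page/" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/example-page/" />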
| muzzmoz0 -
How to stagger <h> tags?
This might seem like a silly question, but it's one that I would like to get some responses on from the SEO community. Do <h> tags need to be staggered according to their numbers? For example, a few of our clients have their h1 tag on a mid-way header halfway down the page, and there are both h2s and h3s listed before the h1 in the source code. Does this matter? Let me know! Thanks!
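For illustration, a minimal sketch of a conventionally "staggered" heading outline:
<h1>Page topic (usually the first and only h1)</h1>
<h2>Major section</h2>
<h3>Sub-point within that section</h3>
<h2>Another major section</h2>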
| TaylorRHawkins2 -
HTTP HTTPS Migration Gone Wrong - Please Help!
We have a large (25,000 products) ecommerce website, and we did an HTTP => HTTPS migration on 3/14/17. Our rankings went in the tank, but they are slowly coming back. We initially lost 80% of our organic traffic; we are currently down about 50%. Here are some of the issues - in retrospect, we may have been too aggressive in the move:
- We didn't post our old sitemaps on the new site until about 5 days into the move.
- We created a new HTTPS property in Search Console.
- Our redirects were 302, not 301, and we also had some other redirect issues.
- We changed our URL taxonomy from http://www.oursite.com/category-name.html to https://www.oursite.com/category-name (removed the .html).
- We changed our filters plugin. Proper canonicals were used, but the filters can generate N! canonical pages. Yesterday I added some parameters (and posted them to Search Console) and noindex for pages with multiple filter choices, to cut down on our crawl budget.
Here are some observations: Google is crawling like crazy - 120,000+ pages per day since the move. These are clearly the filtered pages, but they do have canonicals. Our old sitemaps got "Roboted Out" error messages, yet when we test the URLs in Google's robots.txt tester, they test fine. Very odd. At this point, in Search Console:
| GWMSEO
a. HTTPS Property has 23,000 pages indexed
b. HTTP Property has 7800 pages indexed
c. The crawl of our old category sitemap (852 categories) is still pending, and it was posted and submitted on Friday 3/17. Our average daily organic traffic in Search Console before the move was +/- 5,800 clicks; the most recent Search Console data shows HTTP: 645 clicks, HTTPS: 2,000 clicks. Our rank tracker shows a massive drop over 2 days, bottoming out, and then some recovery over the next 3 days. The HTTP site is showing 500,000 backlinks; HTTPS is showing 23,000 backlinks. I am planning on resubmitting the old sitemaps today in an attempt to remap our redirects to 301s. Is this typical? Any ideas?0 -
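For the redirect issue mentioned in the migration above, a hedged Apache sketch of a single site-wide 301 (not 302) from HTTP to HTTPS that preserves paths:
RewriteEngine On
# Permanent redirect of every HTTP URL to its HTTPS equivalent
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.oursite.com/$1 [R=301,L]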
Crawl Depth improvements
Hi, I'm checking the crawl depth report in SEMrush and looking at pages which are 4+ clicks away. I have a lot of product pages which fall into this category. Does anyone know the impact of this? Will they never be found by Google? If there is anything in there I want to rank, I'm guessing the course of action is to move the page so it takes fewer clicks to get there? How important are crawl budget and crawl depth for SEO? I'm just starting to look into this subject. Thank you
| BeckyKey0 -
Long Title Tags
Hi guys, We have e-commerce product title tags which are over 60 characters - around 80 plus. The reason we added them is to incorporate more information for Google. The format of these title tags is: Name + Colour + Rug Type + Origin.
Name = for people searching for the name of the rug
Colour = for people searching for a specific colour
Type = the type of rug (e.g. normal or designer)
Origin = where the rug is from
So this title will cover people searching for designer rugs, for the specific colour, and also for where it comes from. This then results in the title tag going way over 60 characters - around 80-90 characters. Would it be wise to try and shrink it down to under 60 characters, and what would be a good approach to do this? Cheers.
| seowork2140 -
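To make the title-tag trade-off in the question above concrete, a hedged before/after sketch with placeholder product and brand names:
<!-- current pattern (80+ characters) -->
<title>Persia Medallion | Red | Designer Rug | Hand-Made in Iran | Example Rugs</title>
<!-- trimmed alternative (under 60 characters) -->
<title>Persia Medallion Red Designer Rug | Example Rugs</title>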
Hidden category content really bad?
Hi guys, I'm working with a site which has hidden category content - see: http://i.imgur.com/Sgko2we.jpg It seems Google is still indexing these pages, but I heard Google might ignore or reduce the benefit of hidden content like this. I just want to confirm whether that is the case, and whether this is a really bad thing for SEO? Cheers.
| seowork2140 -
Google cache of my website shows another website
Hello, Some time ago I asked a question here because my homepage had disappeared from Google for our main keyword. One of the problems that showed up was the Google cache: if you look at the cache of the website www.conseilfleursdebach.fr, you see that it shows the content of www.lesfleursdebach.be. They are both our websites, but one is focused on France and the other on Belgium. http://webcache.googleusercontent.com/search?q=cache%3Awww.conseilfleursdebach.fr&oq=cach&aqs=chrome.0.69i59j69i57j0j69i60j0l2.1374j0j4&sourceid=chrome&ie=UTF-8 Before, there were flags on the page to go to the other country, but in the meantime I have removed all links from the .fr to the .be and vice versa. This has been ongoing since January. Who has an idea of what could cause this and, most of all, what to do? Kind regards, Tine
| TineDL1 -
Google's 'related:' operator
I have a quick question about Google's 'related:' operator when viewing search results. Is there a reason why a website wouldn't produce related/similar sites? For example, if I use the related: operator for my site, no results appear: https://www.google.com/#q=related:www.handiramp.com The site has been around since 1998 and also has two good, relevant DMOZ inbound links. Any suggestions on why this is, and any way to fix it? Thank you.
| ecomteam_handiramp.com0 -
Link conundrum - losing nav/footer links in mobile view
Hi Moz folks! I'm currently moving a site from being hosted separately on www. and m. to a responsive single URL. The problem is, the desktop version currently has links to important landing pages in the footer (about 60), and that's not something we want to replicate on mobile (mainly because it will look pretty awful). There is no navigation menu, because the key to the homepage is converting users to a subscription, so any distraction reduces conversion rate. The footer links will continue to exist in the desktop view but, with Google's mobile-first indexing, presumably we lose these important homepage links to our most important pages. So, my questions: Do you think there is any SEO value in the desktop footer links? Do you have any suggestions about how best to include these 60-odd links in a way that works for mobile? Thanks!
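One possible pattern (a hedged sketch with placeholder URLs): keep the links in the HTML for every device, but collapse them behind a native disclosure element on small screens so they don't dominate the mobile layout. Whether links inside a collapsed element carry exactly the same weight is debatable, so treat this as an option to test rather than a definitive answer:
<footer>
  <details>
    <summary>All landing pages</summary>
    <ul>
      <li><a href="/landing-page-1/">Landing page 1</a></li>
      <li><a href="/landing-page-2/">Landing page 2</a></li>
      <!-- ...remaining links... -->
    </ul>
  </details>
</footer>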
| d_foley0 -
Best tools for identifying internal duplicate content
Hello again Mozzers! Other than the Moz tool, are there any other tools out there for identifying internal duplicate content? Thanks, Luke
| McTaggart0 -
Ranking after redirecting two URLs to a new domain
I run two websites which operate in similar business sectors. Each has a calculator tool that offers the same functionality. The pages rank 2nd and 5th for the key search term. I'd like to improve the functionality of this and have thought about setting up a new domain for this calculator to move it away from the main sites. If I did this and 301 redirected both pages to the new domain do you think I'd maintain a strong ranking position for this search term on the new domain? Thanks for any advice.
| craigramsay0 -
Href Lang & Canonical Tags
Hi, I have 2 issues appearing in my site audit for a number of pages. I don't think I actually have an issue, but I just want to make sure. Using this page as an example - http://www.key.co.uk/en/key/0-5-l-capacity-round-safety-can-149p210 - the errors I get are: 1. Conflicting hreflang and rel=canonical - the canonical points to a different-language URL; when using hreflang and canonicals together, it states I need a self-referential canonical. The page above is a SKU page, so we include a canonical back to the original model page to avoid lots of duplicate content issues. Our canonical points to http://www.key.co.uk/en/key/justrite-round-safety-cans 2. No self-referencing hreflang. Are these big issues? I'd think the bigger issue would be if I added self-referencing canonicals and ended up with lots of duplicate content. Any advice would be much appreciated 🙂
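For clarity, this is the pattern the audit expects on the SKU page (self-referential canonical plus matching hreflang), contrasted with the current model-page canonical - a hedged sketch using the URLs from the question, with an assumed en-gb hreflang value:
<!-- what the audit tool expects on the SKU page -->
<link rel="canonical" href="http://www.key.co.uk/en/key/0-5-l-capacity-round-safety-can-149p210" />
<link rel="alternate" hreflang="en-gb" href="http://www.key.co.uk/en/key/0-5-l-capacity-round-safety-can-149p210" />
<!-- the current setup: the canonical points at the model page instead -->
<link rel="canonical" href="http://www.key.co.uk/en/key/justrite-round-safety-cans" />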
| BeckyKey0 -
Site wide decrease in featured snippets
Hi, As of April we have seen a huge drop in the number of featured snippets and I'm unsure why. No major changes have been made to the site. We did submit to Google News at a similar time (the submission failed, unfortunately). I have plotted our featured snippets and our competitors' featured snippets from SEMrush data, and most of our competitors have also seen a fall in the number of featured snippets; there has also been a large fall for some categories specifically. But overall we have seen the largest drop of everyone (see attached graph). Does anyone have any clues as to why Google would remove multiple featured snippets from one website in particular? Thanks.
| Sally940 -
Is there a difference between 'Mø' and 'Mo'?
The brand name is Mø, but users are searching online for Mo. Should I change all instances of Mø to Mo on my client's website?
| ben_mozbot010 -
What to do about endless size pages
I'm working on a site that sells products that come in many different sizes; one product may come in 30 sizes. The products themselves are identical except for the size. There are collection pages that list several kinds of product in a particular size, and then there are individual product pages for one product in a specific size. Collection pages for widgets in size 30 have the same content as widgets in size 29, and a single product page for gold-widget-size-30 has the same content as the single product page for gold-widget-size-29. To make matters worse, they all have the same tags and very little written content. The site is in Shopify. Last month almost 400 of these pages produced organic visits, mostly in the 1 to 4 per month range, but all together about 1,000 visits. There are several hundred more that produced no organic traffic but are duplicates (except for size) and part of this giant ball of tangled string. What do you think I should do? Thanks... Mike
| 945010 -
Drop in Ranking for a Reputable Dictionary / April 14, 2017
One of my customers' sites (a very reputable, high-quality dictionary) saw a 46% drop in rankings starting on April 14, 2017. Has anyone else seen something similar, and does anyone have tips? Many thanks, Tatiana
| lascribacchina0 -
Re: Inbound Links. Whether it's HTTP or HTTPS, does it still go towards the same inbound link count?
Re: Inbound Links. If another website links to my website, does it make a difference to my inbound link count whether they use http or https? Basically, my site http://mysite.com redirects to https://mysite.com, so if another website uses the link http://mysite.com, will https://mysite.com still benefit from the inbound link count? I'm unsure if I should reach out to all my inbound links to tell them to use my https URL instead... which would be rather time-consuming, so I'm just checking whether http and https links count the same. Thanks.
| premieresales0 -
Can cross domain canonicals help with international SEO when using ccTLDs?
Hello. My question is:** Can cross domain canonicals help with international SEO when using ccTLDs and a gTLD - and the gTLD is much more authoritative to begin with? ** I appreciate this is a very nuanced subject so below is a detailed explanation of my current approach, problem, and proposed solutions I am considering testing. Thanks for the taking the time to read this far! The Current setup Multiple ccTLD such as mysite.com (US), mysite.fr (FR), mysite.de (DE). Each TLD can have multiple languages - indeed each site has content in English as well as the native language. So mysite.fr (defaults to french) and mysite.fr/en-fr is the same page but in English. Mysite.com is an older and more established domain with existing organic traffic. Each language variant of each domain has a sitemap that is individually submitted to Google Search Console and is linked from the of each page. So: mysite.fr/a-propos (about us) links to mysite.com/sitemap.xml that contains URL blocks for every page of the ccTLD that exists in French. Each of these URL blocks contains hreflang info for that content on every ccTLD in every language (en-us, en-fr, de-de, en-de etc) mysite.fr/en-fr/about-us links to mysite.com/en-fr/sitemap.xml that contains URL blocks for every page of the ccTLD that exists in English. Each of these URL blocks contains hreflang info for that content on every ccTLD in every language (en-us, en-fr, de-de, en-de etc). There is more English content on the site as a whole so the English version of the sitemap is always bigger at the moment. Every page on every site has two lists of links in the footer. The first list is of links to every other ccTLD available so a user can easily switch between the French site and the German site if they should want to. Where possible this links directly to the corresponding piece of content on the alternative ccTLD, where it isn’t possible it just links to the homepage. The second list of links is essentially just links to the same piece of content in the other languages available on that domain. Mysite.com has its international targeting in Google Search console set to the US. The problems The biggest problem is that we didn’t consider properly how we would need to start from scratch with each new ccTLD so although each domain has a reasonable amount of content they only receive a tiny proportion of the traffic that mysite.com achieves. Presumably this is because of a standing start with regards to domain authority. The second problem is that, despite hreflang, mysite.com still outranks the other ccTLDs for brand name keywords. I guess this is understandable given the mismatch of DA. This is based on looking at search results via the Google AdWords Ad Preview tool and changing language, location, and domain. Solutions So the first solution is probably the most obvious and that is to move all the ccTLDs into a subfolder structure on the mysite.com site structure and 301 all the old ccTLD links. This isn’t really an ideal solution for a number of reasons, so I’m trying to explore some alternative possible routes to explore that might help the situation. The first thing that came to mind was to use cross-domain canonicals: Essentially this would be creating locale specific subfolders on mysite.com and duplicating the ccTLD sites in there, but using a cross-domain canonical to tell Google to index the ccTLD url instead of the locale-subfolder url. For example: mysite.com/fr-fr has a canonical of mysite.fr
| danatello
mysite.com/fr-fr/a-propos has a canonical of mysite.fr/a-propos Then I would change the links in the mysite.com footer so that they wouldn’t point at the ccTLD URL but at the sub-folder URL so that Google would crawl the content on the stronger domain before indexing the ccTLD domain version of the URL. Is this worth exploring with a test, or am I mad for even considering it? The alternative that came to my mind was to do essentially the same thing but use a 301 to redirect from mysite.com/fr-fr to mysite.fr. My question is around whether either of these suggestions might be worth testing, or am I completely barking up the wrong tree and liable to do more harm than good?0 -
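For reference, option 1 in the question above boils down to a single tag on each locale-subfolder page - a hedged sketch using the question's own example URLs:
<!-- on https://mysite.com/fr-fr/a-propos -->
<link rel="canonical" href="https://mysite.fr/a-propos" />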
My "search visibility" went from 3% to 0% and I don't know why.
My search visibility on here went from 3.5% to 3.7%, then to 0%, then 0.03%, and now 0.05%, in a matter of 1 month, and I do not know why. I make changes every week to see if I can get higher in Google results. I do well with one website, which is for a medical office that has been open for years. With this new one, where the office has only been open a few months, I am having trouble. We aren't getting the calls I was hoping we would; in fact, the only one we did receive was, I believe, because we were closest to the caller in proximity on Google Maps. I am also having some trouble with the "links" aspect of SEO - everywhere I look to get linked, it seems you have to pay. We are a medical office; we aren't selling products, so not many blogs would want to talk about us. Any help getting a higher rank in Google results would be greatly appreciated, as would any help with getting the search visibility up.
| benjaminleemd1 -
Thought FRED penalty - Now see new spammy image backlinks what to do?
Hi, Starting about March 9 I began seeing huge losses in rankings for a client. The rankings have continued to drop every week since, and we changed nothing on the site. At first I thought it must be the Fred update, so we have started rewriting and adding product descriptions to our pages (which is a good thing regardless). I also checked our backlink profile using OSE on Moz and still saw the few linking root domains we had. Another odd thing is that Webmaster Tools showed many more domains. So today I bought a subscription to Ahrefs and instantly saw that over the same timeline (starting March 1, 2017) until now, we have literally doubled our inbound links from very spammy-type sites. But the incoming links are not to content - people seem to be ripping off our images. So my question is: do spammy inbound image links count against us the same as if someone linked to actual written content or non-image URLs? Is Fred something I should still be looking into? Should I disavow a list of inbound image links? Thanks in advance!
| plahpoy0 -
Help to identify that this SEO agency is doing a TERRIBLE job
Hi folks, I am part of a group and do the SEO etc. for one part of it. Another part of the group hired an SEO agency to carry out their SEO for them (before I joined). In short, the agency is doing a terrible job, building links in very dodgy directories (ones which get taken offline) and via machine-generated 'articles' on horrendously bad 'blogs'. Please take a look at these 'articles' and leave your thoughts below so I can back up the point that these guys are not the kind of SEOs we should be working with. [List of links to articles removed by moderator] Many thanks in advance, Gill.
| Cannetastic0 -
Combining multiple HTTPS sites
Hi there! I am currently combining several sites (corporate brochure site and ecommerce site) for a client into one central website. All of the content and structure on the new site is set up and relevant pages have 301 redirects ready. My main concern is that the old .co.uk website has an SSL certificate and will be pointing to the new pages on the new .com website (with new SSL in place). Will this cause connection privacy issues? And if so, what's the best way to resolve them? Many thanks!
| Daniel_GlueMedia0 -
Absolute vs. Relative Canonical Links
Hi Moz Community, I have a client using relative URLs for their canonicals (vs. absolute). Google appears to be following these just fine, but Bing etc. are still sending organic traffic to the non-canonical URLs. It's a Drupal setup. Anyone have advice? Should I recommend that all canonical links be absolute? They are strapped for resources, so this would be a PITA if it won't make a difference. Thanks
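For illustration, the two forms side by side (the /products/widget/ path is a placeholder) - the absolute form removes any ambiguity about protocol and host, which is why it is generally the safer recommendation:
<!-- relative (what the client uses today) -->
<link rel="canonical" href="/products/widget/" />
<!-- absolute -->
<link rel="canonical" href="https://www.example.com/products/widget/" />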
| SimpleSearch1 -
Should I revive the old domain or just redirect all the juicy links to my new site?
I'm about to acquire a domain with a lot of great, highly authoritative backlinks. The links pointing to the domain are quite powerful, and the domain is an exact-match domain. I have two options (that I know of): 1. I could redirect all the links to their new home(s) on my new site, which offers the same resources the old site used to offer; or 2. I could rebuild the tools/content on the old site. Ideally, I'd transfer everything to my new site, as all those powerful links could help all my rankings. However, I'm worried that some of the powerful linking sites will de-link once they see the site redirects elsewhere, even though it's offering the same content. Also, my new site isn't an exact-match domain - which, I know, shouldn't make a difference nowadays, but regardless of what people say, it still seems to help some sites in less competitive niches. One more thing to note: the domain I'm purchasing is about 25 years old. I'm leaning toward option one. I want to make sure I put my best foot forward on this investment and thought it wise to consult the SEO gods.
| ninel_P0 -
E-Commerce Site Collection Pages Not Being Indexed
Hello everyone, This is not really my strong suit, but I'm going to do my best to explain the full scope of the issue and really hope someone has some insight. We have an e-commerce client (can't really share the domain) that uses Shopify; they have a large number of products categorised by collections. The issue is that when we do a site: search for our collection pages (site:Domain.com/Collections/) they don't seem to be indexed. Also, not sure if it's relevant, but we recently did an overhaul of our design. Because we haven't been able to identify the issue, here's everything we know/have done so far:
- A Moz crawl check found the collection pages.
- Organic landing page analytics (source/medium: Google) show the pages are getting traffic.
- We submitted the pages to Google Search Console.
- The URLs are listed in the sitemap.xml, but when we tried to submit the collections sitemap.xml to Google Search Console, 99 URLs were submitted and none came back as being indexed (unlike our other pages and products).
- We tested the URLs in GSC's robots.txt tester and they came up as "allowed", but just in case, below is the language used in our robots.txt:
| Ben-R
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /9545580/checkouts
Disallow: /carts
Disallow: /account
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
Disallow: /design_theme_id
Disallow: /preview_theme_id
Disallow: /preview_script_id
Disallow: /apple-app-site-association
Sitemap: https://domain.com/sitemap.xml A Google Cache:Search currently shows a collections/all page we have up that lists all of our products. Please let us know if there’s any other details we could provide that might help. Any insight or suggestions would be very much appreciated. Looking forward to hearing all of your thoughts! Thank you in advance. Best,0 -
Need advice on overcoming a Google penalty
Here is the situation. Our website for our primary product (www.thetablift.com) has received a penalty by Google. Not long ago we had excellent rankings; (1st page) for some of our primary keywords, like "tablet stand". Now we are not in the index at all. Here is what happened (or at least what seems to have happened in my non-SEO opinion). Around October 2016, we had the "bright" idea to try and emulate a campaign that Eat 24 did, utilizing inexpensive traffic from advertisements on porn websites. The idea was a play on a joke we often hear about our product being perfect for certain activities where one needs to free one's hands while watching a screen. Of course this is not how we market our product (it is a best selling mainstream product), but we wanted to see if we could emulate the success of another mainstream brand that utilized this kind of non-mainstream advertising. The immediate result was a whole lot of traffic, but obviously the wrong kind, as it did not convert. So we pulled the plug after about 3 days. Flash forward several months later and we not only lost our great SEO rankings, but we were removed from Google's index entirely. I assume the reason for this is that somehow the website got dinged for being somehow related to porn. But of course it has nothing to do with that. So the question is: how do we go about getting un-penalized by Google? We had build up some solid SEO over the previous couple of years, and I'd like to get back to where we were, if possible. Oh, and this may or may not be relevant, but we also switched from www.tablift.com to www.thetablift.com a few months before we did this campaign. However, we used permanent redirects and did a textbook changeover, so I don't think that had any bearing. But I can't be sure. What are the steps to reverse this damage, if any? Thanks!
| csblev0 -
Exact match Title and H1 tags, and over optimization
Hi Mozzers - was just wondering whether matching H1 and Title tags are still OK, or whether there's an over optimization risk if they exact match?
| McTaggart0 -
Robots.txt wildcards - the devs had a disagreement - which is correct?
Hi – the lead website developer was assuming that this wildcard: Disallow: /shirts/?* would block URLs including a ? within this directory and all of its subdirectories. The second developer suggested that this wildcard would only block URLs featuring a ? that comes immediately after /shirts/ - for example: /shirts?minprice=10&maxprice=20 - but argued that this robots.txt directive would not block URLs featuring a ? in subdirectories - e.g. /shirts/blue?mprice=100&maxp=20. So which of the developers is correct? Beyond that, I assumed that the ? should feature a * on each side of it – for example - /? - to work as intended above? Am I correct in assuming that?
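As a hedged illustration of the two readings being debated (robots.txt rules are prefix matches and * matches any run of characters, per Google's robots.txt documentation):
User-agent: *
# Matches URLs whose path starts with /shirts/? - e.g. /shirts/?minprice=10&maxprice=20 - but NOT /shirts/blue?mprice=100&maxp=20
Disallow: /shirts/?*
# Matches a ? anywhere under /shirts/, including sub-paths such as /shirts/blue?mprice=100&maxp=20 (no trailing * needed)
Disallow: /shirts/*?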
| McTaggart0 -
Multi-Store SEO
I am currently developing a website which will have a multi-store function, i.e. one store for US & ROW customers and one for UK & EU customers. The domain names will be along the lines of:
Original domain: www.website.com
UK & EU domain: eu.website.com
US & ROW domain: us.website.com
When a customer visits the website they will be redirected to one or the other depending on their location. Can anyone see any problems this may cause with respect to SEO? I know there may also be a duplicate content issue here - how should I best deal with that?
| moon-boots0 -
BetterWidget or Better Widget? Brand name effect on SEO
I have a company that produces widgets. People generally search for "widget". I need to decide how to brand my company. Stylistically, I like BetterWidget, but I worry about the effects on SEO. If people are commonly searching for "widget", and my page title contains "BetterWidget", but no other use of the term, that may have a negative effect. 1. Is Google able to parse through these joined words? (this is similar but distinct from the discussion on common compound words like ice cream) 2. Does capitalization of the second "word" in the joined word help signal to Google that these are two terms joined together? 3. On the flip side, is there a negative effect from not having a unique "brand". In other words, articles that are about my company Better Widget may seem to google just to be about better widgets in general. Thanks
| galenweber1