
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi everyone, I am doing an audit of a site that currently has a lot of 500 errors due to the Russian language. Basically, all the URLs look like this for every page in Russian: http://www.exemple.com/ru-kg/pешения-для/food-packaging-machines/
    http://www.exemple.com/ru-kg/pешения-для/wood-flour-solutions/
    http://www.exemple.com/ru-kg/pешения-для/cellulose-solutions/ I am wondering if this error is really caused by the server or if Google has difficulty reading Russian in URLs. Is it better to have the URLs only in English?

    | alexrbrg
    0
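
    One quick way to separate the two possibilities: Cyrillic path segments are valid in URLs once they are percent-encoded as UTF-8, which is the form crawlers actually request. A minimal Python sketch (assuming the segment is meant to be the Cyrillic word "решения"); if the server returns a 500 for the encoded form, the problem is server configuration rather than Google struggling with Russian.

        from urllib.parse import quote, unquote

        # Hypothetical path modelled on the question; crawlers request the
        # percent-encoded UTF-8 form that quote() produces.
        path = "/ru-kg/решения-для/food-packaging-machines/"
        encoded = quote(path)
        print(encoded)                   # /ru-kg/%D1%80%D0%B5%D1%88... (ASCII-safe)
        print(unquote(encoded) == path)  # True - the round trip is lossless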

  • I have a site that for some reason Google decided to rank one of our articles #1 for a fairly competitive term. The article is kind of a BS blog post and I want to 301 it to our page about the topic as that's designed for conversion. If I do this, will we risk losing the ranking? If so, what are other options? Can I change the content of the ranked page to something closer to our landing page? Any advice is welcome!

    | dk8
    0

  • A client's site was moved to https recently. It's a small site with only 6 pages. One of the pages advertises an emergency service. The HTTPS move worked fine. Submitted https to Webmaster Tools, submitted the sitemap. 301 redirects. Rankings preserved. However, a few weeks later, doing a site:example.com search there are two pages for the emergency service. One is https, the other is http. But the http one shows the correct SEO title and the https one shows an old SEO title. This wasn't expected. When you click the HTTP URL it 301 redirects to the HTTPS URL and the correct SEO title is displayed in the browser tab. When you click the HTTPS URL it returns a 200 and the correct SEO title is shown as expected in the browser tab. Anyone have any idea what is going on? And how to fix it? I need to get rid of the HTTP URL, but in the site search it contains the correct title. Plus, why is it there anyway?

    | AL123al
    0

  • Good afternoon. We had an issue a while ago with the incorrect pages ranking in Google for some of our key terms. For example, the page ranking for the term "Hotels in Spain" was an individual information page for one particular hotel in Spain, rather than the top-level page which is optimised for "Hotels in Spain". The individual property page was ranking around 36-40, so we tightened up all the internal linking structure to ensure the term "Hotels in Spain" was pointing to the correct page and de-optimised the individual property page for the term. After a few weeks, everything seemed to be working and we were ranking at the top of the second page with the correct page; however, today's ranking report has reversed our good fortune and the incorrect page is ranking in a low position. Any further suggestions or advice would be very much appreciated. Ideally, I don't want to remove the page that is ranking as it's still relevant for a search for that particular hotel.

    | Ham1979
    0

  • Hi there, Thank you so much for taking time out of your day to help. You people are stellar. When launching a new site with concern for preserving the site's organic placement, which attributes or data are the most important to keep consistent from the old site to the new one? For example: site structure, URLs, metadata, image file names, and so on. Thanks again!

    | leslieevarts
    0

  • We are planning a series of large site migrations over the next 12-18 months, moving from one platform to another. It's likely the first will be completed by around August this year, with the process running until the back end of 2018. The sites are currently on http, and the plan is to first migrate all sites to https in the next couple of months. The concern is that, due to the http>https 301 redirects that will be in place, we may be putting ourselves at unnecessary risk by effectively carrying out two migrations in the space of a year (in terms of loss of potential authority caused by redirects). Would we be better to wait, and implement https at the point of platform migration instead? Thoughts appreciated.

    | Sayers
    0

  • We are trying to deindex a large quantity of pages on our site and want to know what the best practice for doing that is. For reference, the reason we are looking for methods that could help us speed it up is that we have about 500,000 URLs that we want deindexed because of mis-formatted HTML code, and Google indexed them much faster than it is taking to deindex them, unfortunately. We don't want to risk clogging up our limited crawl log/budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although theoretically that should work, we are looking for white hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years with Google's current crawl rate of our site.

    | teddef
    0
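
    Whichever removal route is chosen, it is worth spot-checking that the noindex signal is actually being served before waiting on recrawling. A minimal sketch (the URL is a placeholder and the meta regex is deliberately simple) that reports both places a noindex directive can live:

        import re
        import requests

        def noindex_signals(url):
            # Fetch the page and report the X-Robots-Tag header and robots meta tag.
            r = requests.get(url, timeout=10)
            meta = re.search(
                r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
                r.text, re.IGNORECASE)
            return {
                "status": r.status_code,
                "x_robots_tag": r.headers.get("X-Robots-Tag"),
                "meta_robots": meta.group(1) if meta else None,
            }

        print(noindex_signals("https://www.example.com/misformatted-page/"))  # placeholder URL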

  • Hi Guys, I'm looking to build links to a commercial page similar to this: https://apolloblinds.com.au/venetian-blinds/ How would you even create quality links (not against Google TOS) to a commercial page like that? Any ideas would be very much appreciated. Cheers.

    | spyaccounts14
    0

  • Hi Moz Community. Has anyone noticed a pattern in the websites that Google pulls in to populate Knowledge Panels? For example, for a lot of queries Google keeps pulling data from a specific source over and over again, and the data shown in the Knowledge Panel isn't on the target page. Is it possible that Google simply favors some sites over others and, no matter what you do, you'll never make it into the Knowledge box? Thanks.

    | yaelslater
    0

  • We launched a store on top of a popular blog. The blog had nothing to do with the store. The blog has a lot of backlinks and traffic, but our store is now our primary business. I am concerned that the off-topic blog content may be affecting our ability to rank better for the core store business. Should we delete or redirect the old blog content to another website to improve the SEO for our store?

    | seo-mojo
    1

  • Hi, is there a way to get the history of the average cost per click for a keyword?

    | Cocoonfxmedia
    0

  • I have seen increasing instances where the same company, one of our main competitors, is ranking in positions 1 and 2 for the same search phrase. It appears that both the homepage and the dedicated service page relevant to the search term are ranking, but surely having them at positions 1 & 2 is not something search engines like Google want to encourage? I have also seen other instances of the same company ranking twice on page 1, but not necessarily in #1 or #2. Is this an anomaly or just something I have to live with?

    | Kevbyrne
    2

  • Hi Guys, I'm doing a link audit and have come across lots of low-quality web directories pointing to the website. Most of the anchor text of these directories is the website's URL and not commercial/keyword-focused anchor text. So if that's the case, should we even bother doing a link removal request via Google Webmaster Tools for these links, as the anchor text is non-commercial? Cheers.

    | spyaccounts14
    0

  • Hi All, I am dealing with a Google News sitemap. My technical guy doesn't know how to create a sitemap for Google News. Do you know which service or company can help me with this? Thanks a lot!

    | binhlai
    0
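
    For reference, a Google News sitemap is an ordinary XML sitemap with the news extension namespace added. A minimal Python sketch that builds one (publication name, URL and date are placeholders, and it assumes the site already meets Google News' inclusion requirements):

        articles = [  # placeholder data
            {"loc": "https://www.example.com/news/some-article",
             "title": "Some Article Title",
             "date": "2018-02-01"},
        ]

        def news_sitemap(items, publication="Example Site", language="en"):
            urls = "".join(
                "  <url>\n"
                f"    <loc>{a['loc']}</loc>\n"
                "    <news:news>\n"
                "      <news:publication>\n"
                f"        <news:name>{publication}</news:name>\n"
                f"        <news:language>{language}</news:language>\n"
                "      </news:publication>\n"
                f"      <news:publication_date>{a['date']}</news:publication_date>\n"
                f"      <news:title>{a['title']}</news:title>\n"
                "    </news:news>\n"
                "  </url>\n" for a in items)
            return ('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
                    '        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">\n'
                    + urls + '</urlset>')

        print(news_sitemap(articles))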

  • What type of content would anyone recommend writing for power sports dealers (ATV dealers, motorcycle dealers, jet ski dealers, etc.) or outdoor power equipment dealers (tractors, lawn mowers, etc.) when their website consists mainly of inventory pages? These dealers are trying to improve brand awareness, but creating content that answers searchers' questions/intent is tough and I want to make sure I am on the right track. I'm trying to create unique content. I am optimizing existing pages, and so far I've been writing brand pages, describing the brands, advertising that they carry this brand, creating links and calls-to-action to the inventory pages, etc. I want to first create authority and crawlable content for this brand. From there, I have been trying to create product category pages, describing the top products under that brand and working towards creating product comparison content instead of simply describing it. "Why buy" type of stuff, but that gets tricky to make unique. Any suggestions on unique content or better strategies versus just brand descriptions, product descriptions/comparisons, etc.? I also want to make sure that creating multiple pages focused on one brand and an overall category isn't cannibalization of a topic. Obviously each page is slightly different and gradually goes into more detail, but I want to make sure. Any recommendations on types of content or different strategies would be helpful! Also - I should mention that I am limited by the platform. I cannot create/utilize a blog page or anything like that. Thanks!

    | Crichardson1990
    0

  • Hi, We have some state-specific pages that display dynamic content based on the state that is selected here. For example, this page displays New York-based content. But for some reason Google is no longer ranking these pages. Instead it's defaulting to the page where you select the state here. But last year the individual state dynamic pages were ranking. The only change we made was moving these pages from http to https. But now Google isn't seeing these individual dynamically generated state-based pages. When I do a site: URL search it doesn't find any of these state pages. Any thoughts on why this is happening and how to fix it? Thanks in advance for any insight. Eddy By the way, when I check these pages in Google Search Console's fetch as Google, Google is able to see these pages fine and they're not being blocked by any robots.txt.

    | eddys_kap
    0

  • If a site has about 100 pages offering specific discounts for employees at various companies, for example... mysite.com/discounts/target mysite.com/discounts/kohls mysite.com/discounts/jcpenney and all these pages are nearly 100% duplicates, how would you handle them? My recommendation to my client was to use noindex, follow. These pages tend to receive backlinks from the actual companies receiving the discounts, so obviously they are valuable from a linking standpoint. But say the content is nearly identical between each page; should they be indexed? Is there any value for someone at Kohl's, for example, to be able to find this landing page in the search results? Here is a live example of what I am talking about: https://www.google.com/search?num=100&safe=active&rlz=1C1WPZB_enUS735US735&q=site%3Ahttps%3A%2F%2Fpoi8.petinsurance.com%2Fbenefits%2F&oq=site%3Ahttps%3A%2F%2Fpoi8.petinsurance.com%2Fbenefits%2F&gs_l=serp.3...7812.8453.0.8643.6.6.0.0.0.0.174.646.3j3.6.0....0...1c.1.64.serp..0.5.586...0j35i39k1j0i131k1j0i67k1j0i131i67k1j0i131i46k1j46i131k1j0i20k1j0i10i3k1.RyIhsU0Yz4E

    | FPD_NYC
    0

  • Scenario: An automotive dealer lists cars for sale on their website. The descriptions are very good and in-depth at 1,200 words per car. However, chunks of the copy are copied from car review websites and woven into their original copy. Q1: This is flagged in Copyscape - how much of an issue is this for Google? Q2: The same stock with the same copy is fed into a popular car listing website - the dealer's website and the classifieds website often rank in the top two positions (sometimes the dealer on top, other times the classifieds site). Is this a good or a bad thing? Are you risking being seen as duplicating/scraping content? Thank you.

    | Bee159
    0

  • Can I have the root domain pointing to one server and other URLs on the domain pointing to another server without redirecting, domain masking or HTML masking? Dealing with an old site that is a mess. I want to avoid migrating the old website to the new environment. I want to work on a page by page and section by section basis, and whatever gets ready to go live I will release on the new server while keeping all other pages untouched and live on the old server. What are your recommendations?

    | Joseph-Green-SEO
    0

  • I have a client who sells 50 brands of shoes. At the moment the developer has a noindex/nofollow tag on all sale pages, which is wrong as around 10% of site activity revolves around those pages. The structure looks like this: 1. For cats/subcats: site/sale
    site/womens/sale
    site/womens/shoe/sale
    site/womens/shoes/ballerinas/sale For every cat/subcat - there are 10 cats and an average of 5 subcats per cat, so 50 sale pages. 2. For brands: site/brand
    site/brand/womens
    site/sale/brand
    site/sale/womens/brand
    site/sale/womens/cat/brand
    site/sale/womens/cat/subcat/brand So each brand can have four sale pages on top of its own brand page. 50 brands x 54 = around 2700. Now no one is going to start writing 2700 pieces of additional on-page content (although the meta is OK!) and we risk further diluting the brand pages we need to rank highly for, so we need to do something. Should we: Category pages: 1. Allow all sale cat and subcat pages to proliferate through Google? or
    2. Canonicalise all sale subcategory pages back to the category
    3. Canonicalise all category and subcategory pages back to sale/womens Brand pages: 1. Allow all sale brand pages to proliferate through Google?
    2. Canonicalise subcat brand pages back to sale/category/brand
    3. Canonicalise subcat and cat back to sale/brand Note the lower pages never do well in search. If you search a brand + sale in Google it is always the site/brand page that comes up, never the sale version (this is from research on other similar sites and my own analysis). Same with subcats - e.g. brand + subcat - it's always site/brand that comes up first and has the highest PA. Also we can't analyse any of these sale pages in Moz or anywhere else, as they are not in search at all having been noindexed. That's my conundrum for today. Any thoughts would be appreciated!

    | Nigel_Carr
    0
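
    If the canonical route (category option 2, brand option 3) wins out, the mechanics are just a rel=canonical on each deeper sale variant pointing at the page chosen to consolidate signals. A sketch of that mapping, with paths mirroring the question and a placeholder domain, not a definitive recommendation:

        # Deeper sale variants canonicalise up to the chosen target page.
        canonical_map = {
            "/womens/shoes/ballerinas/sale": "/womens/shoes/sale",  # subcat sale -> cat sale
            "/sale/womens/cat/subcat/brand": "/sale/brand",         # deep brand sale -> brand sale
            "/sale/womens/cat/brand":        "/sale/brand",
            "/sale/womens/brand":            "/sale/brand",
        }

        def canonical_tag(path, domain="https://www.example-shoes.com"):  # placeholder domain
            target = canonical_map.get(path, path)  # unmapped pages stay self-canonical
            return f'<link rel="canonical" href="{domain}{target}" />'

        print(canonical_tag("/womens/shoes/ballerinas/sale"))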

  • Hello guys,
    I hope you're all doing well. I just signed up to this great forum to get some answers on my current SEO campaign, and I hope I can find some help in here. I have a very important question that I hope some SEO experts can help me with please, it will mean the world to me. So I'm trying to rank this clean, originally Spanish expired domain that I bought at auction for big money. I added a lot of original content, and some keywords are already ranking without any backlinks as the website has huge authority and a lot of quality backlinks, so now I want to start building backlinks to get my keywords on the first page. My question:
    I'm a bit worried about installing Google Analytics on my site as I will be using a Gmail account that's based in Australia, since I had to use an Australian phone number to verify the account. What do you think? Is it safe to do that (having a Google account based in Australia linked to a website that I'm trying to rank in Spain)? I would really appreciate an accurate answer; if you are not sure or if you have never done it before, please don't give me any expectations, as I'm investing a lot of money in this project. Any help would be extremely appreciated, my friends. The other question is:
    I have bought a couple of "new domains" and added some content, to add those domains to my PBN, just to mix up the link profile on my money site. How long do you wait on a new domain name before you build any backlinks? I've already added content and social profiles to the new sites, I just need to know when I can start building backlinks to them to power them up as they are new. Thank you so much in advance, I'm looking forward to hearing some opinions from experts. Thank you

    | wajdisawaqed
    1

  • Hi, I have a page that is about 5,000 lines of code total. I was having difficulty figuring out why the addition of a lot of targeted, quality content to the bottom of the page was not helping with rankings. Then, when fetching as Google, I noticed that only about 3,300 lines were getting indexed for some reason. So naturally, that content wasn't going to have any effect if Google is not seeing it. Has anyone seen this before? Thoughts on what may be happening? I'm not seeing any errors being thrown by the page, and I'm not aware of a limit on lines of code Google will crawl. Pages load under 5 seconds, so loading speed shouldn't be the issue. Thanks, Kevin

    | yandl
    1

  • Hi, Quick question on the arrangement of keywords in titles. I know the order isn't so important anymore, but would there be a real issue if I want to rank for 'Henry Xtra' but my title reads 'Numatic Henry Xtra Vacuum Cleaner' rather than 'Henry Xtra Vacuum Cleaner'? Will it really make much difference? Thank you!

    | BeckyKey
    0

  • Hi - I'm looking at a site using JavaScript dropdown navigation - Google can crawl the whole site, but my thinking is this: if I ensure the dropdown navigation is functioning fully when JS is switched off, I may help the search engine bots? At the moment I can't get any dropdown effect if I turn JS off on the site, but if I look at a cached page (text version) the dropdown links are visible and working. I am wondering whether there is any crawl benefit if you take this a step further and ensure the dropdowns are actually visible and working when JS is switched off? I would welcome your thoughts on this. Thanks in advance, Luke - 07966 729775

    | McTaggart
    0

  • Howdy lovely Moz people. A webmaster redirected https protocol links to http a number of years ago in order to try and capture as many links as possible on a site we now manage. We have recently tried to implement https and realised that because of this existing redirect rule, they are now causing infinite loops when trying to test an http redirect. http redirecting to https redirecting back to http, etc. The https version works by itself weirdly enough. We believe that this is due to the permanent browser caching. So unless users clear their cache, they will get this infinite loop. Does anyone have any advice on how we can get round this? a) index both sites and specify in GSC that the https is the canonical version of the site and hope that Google sees that and removes the http version for the https version b) stick with http as infinite loops will kill the site c) ??????????? Thanks all.

    | HenryFrance
    0
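
    It can help to trace the redirect chain outside the browser, so no cached 301s are involved and the exact hop that sends you back is visible. A minimal Python sketch with a placeholder URL:

        import requests

        def trace_redirects(url, max_hops=10):
            # Follow redirects one hop at a time and stop if a URL repeats.
            chain = []
            while len(chain) < max_hops:
                if url in chain:
                    return chain + [url + "   <-- loop starts again here"]
                chain.append(url)
                r = requests.get(url, allow_redirects=False, timeout=10)
                if r.status_code not in (301, 302, 303, 307, 308):
                    return chain + [f"(final response: {r.status_code})"]
                url = requests.compat.urljoin(url, r.headers.get("Location", ""))
            return chain

        for hop in trace_redirects("http://www.example.com/"):  # placeholder URL
            print(hop)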

  • Our default eCommerce property (https://www.pure-elegance.com) used to show several dozen External Links and several thousand Internal Links in Google Search Console. As of this Friday, both of those reports are showing "No Data Available". I checked other related properties (https://pure-elegance.com, http://pure-elegance.com and http://www.pure-elegance.com) and all of them are showing the same. Our other statistics (like Search Analytics etc.) remain unchanged. Any idea what might have caused this and how to resolve it?

    | SudipG
    0

  • For the past few days I've had some concerns about our website structure regarding SEO. Since I can't find similar cases, I'm curious whether the Moz community has a few thoughts on the issue I'm facing. The situation is as follows: for every new client our company (hosting) receives through www.example.com, a new subdomain is created. This subdomain is a backup of the original website of the client and is very much irrelevant to our business. Google can also crawl these subdomains and index them. Product variant 1: clientxxx1.productX.example.com
    Product variant 2: clientxxx1.productY.example.com
    Product variant 3: cleintxx10.productZ.example.com So I think the above situation is far from ideal and can cause problems. The problems I'm thinking we could be facing are: no control over content (spam, low quality, badly optimised pages); duplicate sites (the backup on our subdomain and the original one of the client); it being impossible to make/manage a property for each subdomain in Search Console; and a huge number of subdomains which could influence crawling/indexation by Google. Maybe there are some more issues we could face that I didn't think of? The most common fix would be to use another domain for the backups, like client1.host-example.com, and prevent Google from crawling it. This way www.example.com wouldn't be affected. So my questions basically are: 1. How much will this influence rankings for www.example.com?
    2. Are there any similar cases, and what effect did it have on rankings/crawl/indexation when it got fixed / didn't get fixed?

    | Steven87
    0

  • If I search Google for my cache I get the following: cache:http://www.saucydates.com -> returns the cache of netball.org (an HTTPS page with a Plesk default page); cache:https://www.saucydates.com -> displays the correct page. Prior to this my http cache was the Central Bank of Afghanistan. For most searches at present my index page is not returned, and when it is, it's the netball Plesk page. This is, of course, hurting my search traffic considerably. I have tried many things; here is the current list: 1. If I fetch as Google in Webmaster Tools, the HTTPS fetch and render is correct.
    2. If I fetch the HTTP version I get a redirect (which is correct, as I have a 301 HTTP to HTTPS redirect).
    3. If I turn off HTTPS on my server and remove the redirect, the fetch and render for the HTTP version is correct.
    4. The 301 redirect is controlled with the 301 Safe redirect option in Plesk 12.x.
    5. The SSL cert is valid and is with COMODO.
    6. I have ensured the IP address (which is shared with a few other domains that form my site's network / functions) has a default site.
    7. I have placed a site on my PTR record and ensured the HTTPS version goes back to HTTP, as it doesn't need SSL.
    8. I have checked my site in the Wayback Machine for 1 year and there are no hacked redirects.
    9. I have checked the netball site in the Wayback Machine for 1 year; mid last year there is an odd firewall alert page. If you check the cache for the https version of the netball site you get another site's default Plesk page. This happened at the same time I implemented SSL. Points 6 and 7 have been done to stop the server showing a Plesk default page, as I think this could be the issue (duplicate content). Ideas: Is this a 302 redirect hijack? Is this a Google bug? Is this an issue with duplicate content, as both servers can have a default Plesk page (like millions of others)? A network of 3 sites mixed up that have Plesk could be a clue? Over to the experts at Moz - can you help? Thanks, David

    | dmcubed
    0

  • We created a more keyword-friendly URL with dashes instead of underscores in December. That new URL is in Google's index and has a few links to it naturally. The previous version of the URL (with underscores) continues to rear its ugly head in the SERPs, though when you click on it you are 301'd to the new URL. The 301 is implemented correctly and checked out on sites such as http://www.redirect-checker.org/index.php. Has anyone else experienced such a thing? I understand that Google can use its discretion on pages, title tags, canonicals, etc., but I've never witnessed them continue to show an old URL that has been 301'd to a new one for months after discovery, seemingly at random.

    | seoaustin
    0

  • Hi, I have an SEO agency we work with who are building quality guest post links for us; however, they are also building forum, profile, blog comment
    and directory-based links. 60% of the links they are building are high-quality, relevant guest posts, while the other 40% are the other link types. The 40% seem to be relevant directories, forums, blog comments, etc. They said they build other link types because it diversifies the link building and the link profile, rather than just building high-quality guest posts, as building only one link type can leave a footprint. What are your thoughts on this? Cheers.

    | spyaccounts14
    0

  • Hi, in the last couple of weeks I've been getting more and more search results with a product and the prices of retailers below it (see sample attached). Are there schema parameters one could use to have a bigger chance of appearing there? Thanks in advance, Dieter Lang

    | Storesco
    1
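
    Those price listings are generally driven by Product/Offer structured data (and, for some ribbon-style units, merchant feeds); markup doesn't guarantee inclusion, but it is the usual prerequisite. A minimal sketch of the JSON-LD, rendered here from Python with placeholder values:

        import json

        # Placeholder product data; real values would come from the product database.
        product = {
            "@context": "https://schema.org",
            "@type": "Product",
            "name": "Example Product",
            "image": "https://www.example.com/img/product.jpg",
            "offers": {
                "@type": "Offer",
                "price": "49.99",
                "priceCurrency": "EUR",
                "availability": "https://schema.org/InStock",
                "url": "https://www.example.com/product/",
            },
        }

        # Emit the JSON-LD block that would sit in the product page's HTML.
        print('<script type="application/ld+json">')
        print(json.dumps(product, indent=2))
        print('</script>')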

  • Hello! So, we migrated our website in 2015 to the "new version" and we now have 20,000 old URLs that we'd like to officially retire. The only traffic coming to these pages is obviously from backlinks pointing at our site. How do I gauge the hit that our website will take once we retire these URLs? Is there a tool that will allow me to look at referral traffic numbers per URL so that I know how much traffic we'll be losing? Any advice would be helpful! Thanks! Yael

    | yaelslater
    0

  • My Google Search Console states some errors as below: 1. Article fragmented Some of the URLs in this error are category URLs. How do I make Googlebot understand it is a category, not an article? 2. Article too short In fact the article is quite long. I do not know why this is happening... 3. No sentence found In fact, there are a lot of sentences. Please help!

    | binhlai
    0

  • One of our top organic landing pages was set to "NOINDEX,NOFOLLOW" by "mistake". It took me about a week to realize this, after I saw a drop in traffic on that page. I looked on Google to see if it was indexed and my fears were confirmed! After finding out that it was switched to "NOINDEX,NOFOLLOW" I switched it back to "INDEX,FOLLOW" and did an index request in our Google Search Console. Has anyone else run into a similar issue? Did you ever get the page indexed again?

    | FrankViolette
    2

  • Hello, folks! I'm wondering how I optimize a site if it is built on a platform that works based on dynamic content. For example, the page pulls in certain information based on the information it has about the user.  Not every user will see the same page. Thanks!
    Lindsey

    | Geonetric
    0

  • Is there any way of getting the stars to show up for a home page? The issue is that 70% of our traffic goes to our home page; while we have stars for the product pages, this only covers 30% of our traffic, and because the home page is not a product page it does not have them. Would selling the product directly on this page allow us to have them? I appreciate any help.

    | BobAnderson
    0

  • Not sure if this is a new "unit" for Google organic results. Please see the attached image. When searching for "invoice software", the top quarter of the page is a ribbon of products/brands with badly formatted logos. The fact that it's so ugly, and there's nothing marking it as a paid result, leads me to think it's organic. Does anyone know what this SERP unit is called, and better still, how do you get included? We rank super high in the normal organic results, but don't appear at all in this product ribbon.

    | RobM416
    1

  • I am working with a website that sells new and multiple grades of refurbished power tools: New; Refurbished Grade A (top-quality refurbished); Refurbished Grade C (has a few more scuffs but is in perfect working order); Refurbished Grade D (no warranty / as-is condition, typically for parts). How would you create the products and URL structure? Since they are all technically different products, they have their own SKU in Magento. Would you combine them into one URL with different product options, or would you give each product version its own URL (New, Grade A, Grade C, Grade D)? Thanks! -- Steven

    | intown
    0

  • We're taking on a redesign of our corporate site on our main domain. We also have a number of well-established, product-based subdomains. There are a number of content pages that currently live on the corporate site that rank well and bring in a great deal of traffic, and we are considering putting 301 redirects in place to point that traffic to the appropriate pages on the subdomains. If redirected correctly, can we expect the SEO value of the content pages currently living on the corporate site to transfer to the subdomains, or will we be negatively impacting our SEO by transferring this content from one domain to multiple subdomains?

    | Chris8198
    0

  • Does Google bias the type of content which ranks? Hi Guys, Say I wanted to create a nice blog post around a topic like black dresses or yoga pants. If you view google.com or google.com.au results, all the top-ranking URLs are e-commerce pages which list the products. There is very rarely blog content, e.g. "top black dresses to wear..." or "7 of the hottest yoga pants on the market". The search intent is about the same, i.e. someone looking for black dresses would be interested in that blog post. So my conclusion is that Google has some form of bias towards delivering ecommerce sites above blog/skyscraper-style content. Thoughts? Cheers.

    | spyaccounts14
    0

  • Hi there: We maintain a calendar of digital events and conferences on our website here: https://splatworld.tv/events/ . We're trying to add as many events as we can, and I'm wondering about the descriptions of each. We're pulling them from the conference websites, mostly, but I'm worried about the scraped content creating duplicate content issues. I've also noticed that most calendars of this type which rank well are not including actual event descriptions, but rather just names, locations and a link out to the conference website. See https://www.semrush.com/blog/the-ultimate-calendar-of-digital-marketing-events-2017/ and http://www.marketingterms.com/conferences/ . Anyone have any thoughts on this? Thanks in advance.

    | Daaveey
    0

  • Hi, Our dev team have updated our website with a new menu structure, and they have given us 2 options to choose from. Option 1, I think, is better for SEO - this will show the top 8 categories and then subcategories once you hover over a category. Not much change from our current structure, just a slightly different layout (I have added an image example of what option 1 will look like). Option 2 - preferred by management - shows all 24 categories and no subcategories. My question is, will removing the current subcategories from the main menu make them lose rankings and make them harder to rank in future? I'm guessing everything will move down a level in the structure and lose page authority... Does anyone have any articles/case studies to prove this point? Any help is much appreciated 🙂 Becky

    | BeckyKey
    1

  • Will Google value a link with a UTM tag the same as a clean link without a UTM tag? I should say that a UTM-tagged link is not a natural link, so the link value is zero. Does anyone have any idea how to look at this?

    | TT_Vakantiehuizen
    0

  • I've set up a dummy domain (not SEO'd, I know) in order to get some input on whether I'm doing this correctly. Here's my setup, and https://technicalseo.com/seo-tools/hreflang/ is saying it's all good. I'm self-referencing, there's a canonical, and there are return tags. https://topskiphire.com - US & international English-speaking version; https://topskiphire.com/au/ - English language in Australia. The Australian version is on a subdirectory. We want it this way so we get full value of our domain and so we can expand into other countries eventually, e.g. the UK. Q1. Should I be self-referencing, or should I have only a canonical for the US site? Q2. Should I be using x-default if we're only in the English language? Q3. We previously failed when we had errors come back saying 'return tags not found' on a separate site, even though the tags were on both sites. Was this because our previous site was new and Google didn't rank it as often as our main domain?

    | cian_murphy
    0
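
    For reference, a self-referencing hreflang set for this structure means the same block of alternate links appears on both the root and /au/ versions, so each page references itself and its alternates (the "return tags"). A minimal Python sketch; the en / en-au / x-default mapping is an assumption based on the question:

        # Hypothetical language-to-URL mapping modelled on the question.
        alternates = {
            "en":    "https://topskiphire.com/",     # US / international English
            "en-au": "https://topskiphire.com/au/",  # Australian English
        }
        x_default = "https://topskiphire.com/"       # fallback for everyone else

        def hreflang_block():
            tags = [f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
                    for lang, url in alternates.items()]
            tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
            return "\n".join(tags)

        print(hreflang_block())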

  • 90% of our sales are made with products in one of our product categories.
    A search for the main category keyword returns our root domain index page in Google, not the category page.
    I was wondering whether integrating the complete main category directly into the index page of the root domain, and in this way including much more relevant content for this main category keyword, may have a positive impact on our Google ranking for the main category keyword. Any thoughts?

    | lcourse
    1

  • I have two existing e-commerce sites. The older one is built on the Yahoo platform and has limitations as far as user experience goes. The new site is built on the Magento 2 platform. We are going to be using SLI search for our search and navigation on the new Magento platform. SLI wants us to 301 all of our categories to the hosted category pages they will create, which will have a URL structure akin to site.com/shop/category-name.html. The issue is: if I want to merge the two sites, I will have to do a 301 to the category pages of the new site, which will have 301s going to the category pages hosted by SLI. I hope this makes sense! The way I see it, I have two options: do a 301 from the old domain to the categories of the new domain, and have the new domain's categories 301 to the SLI categories; or do my 301s directly to the SLI-hosted category pages. The downside of #1 is that I will be doing two 301s, and I know I will lose more link juice as a result. The upside of #1 is that if we decide not to use SLI in the future, it is one less thing to worry about. The downside of #2 is that I will be directing all the category pages from the old site to a site I do not ultimately control. I appreciate any feedback.

    | KH2017
    1

  • Hi, I'm integrating with a service that adds 3rd-party images/videos (owned by them, hosted on their server) to my site. For instance, the service might have tons of pictures/videos of cars; and then when I integrate, I can show my users these pictures/videos about cars I might be selling. But I'm wondering how to build out the sitemap - I would like to include references to these images/videos, so Google knows I'm using lots of multimedia. What's the most white-hat way to do that? Can I add external links to my sitemap pointing to these images/videos hosted on a different server, or is that frowned upon? Thanks in advance.

    | SEOdub
    0
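
    Image sitemaps have generally allowed the image files themselves to sit on another host: the <loc> is your page, while <image:loc> can point at the third-party server (check current Google guidance for any cross-domain caveats). A minimal Python sketch with placeholder hostnames:

        # Placeholder page and image URLs; the images live on the provider's host.
        entries = [
            ("https://www.mysite.com/cars/some-listing/",
             ["https://media.provider-example.com/cars/photo-1.jpg",
              "https://media.provider-example.com/cars/photo-2.jpg"]),
        ]

        def image_sitemap(pages):
            parts = []
            for page_url, image_urls in pages:
                imgs = "".join(
                    f"\n    <image:image>\n      <image:loc>{img}</image:loc>\n    </image:image>"
                    for img in image_urls)
                parts.append(f"\n  <url>\n    <loc>{page_url}</loc>{imgs}\n  </url>")
            return ('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
                    '        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">'
                    + "".join(parts) + "\n</urlset>")

        print(image_sitemap(entries))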

  • Hi there, We have product category descriptions in our webshop, with one small difference: on desktop, the first paragraph is at the top above the products, with a scroll down to the other paragraphs below the products; on mobile, the whole text is below the products. I am wondering whether this is all right with regard to the mobile-first index, or should we have exactly the same paragraph split on the desktop and mobile versions? E.g. dadum.pl/zabawki Thanks in advance for your opinions. Isabelle

    | isabelledylag
    0

  • I have a page which (I believe) is well optimised for a specific keyword (URL, title tag, meta description, H1, etc.), yet Google chooses to display the home page instead of the page more suited to the search query. Why is Google doing this and what can I do to stop it?

    | muzzmoz
    0

  • Hi Guys, In this video by Brian Dean he talks about how to optimise for multiple keywords. He basically said the main factors for optimising a page for multiple keywords are the following: identify other keywords with the same search intent as your primary keyword; add them to the title tag strategically, don't stuff them in there; and add as many of those keywords as H2 tags in the content, again when it makes sense. Are there any other, more advanced ways to optimise a page for multiple keywords with the same search intent? Any suggestions would be very much appreciated! Cheers.

    | spyaccounts11
    0
