
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • We have a blog currently powered by the free blogger.com service. We have set it up as blog.example.com, but we wish to set it up as example.com/blog. How can we do this using the .htaccess file? We understand how to update .htaccess, but we don't know what code to enter to achieve this. Our website is hosted on Apache servers with the Plesk control panel.

    | Direct_Ram
    0
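A minimal sketch of one way to do this, assuming Apache's mod_rewrite and mod_proxy are available on the host (the domain names are placeholders; the [P] flag proxies the request so the Blogger content appears under /blog):

```apache
RewriteEngine On
# Serve blog.example.com content under example.com/blog via reverse proxy
RewriteRule ^blog/(.*)$ http://blog.example.com/$1 [P,L]
```

If mod_proxy is not enabled (common on shared Plesk hosting), the fallback is a plain 301 from example.com/blog/... to blog.example.com/..., which at least keeps one canonical location for the blog.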

  • I currently own an exact match domain for my keyword. I have it set up with multiple pages and also a blog. The home page essentially serves as a hub and contains links to all the pages and the blog. My targeted keyword is on its own page and I made the title tag the same as my keyword. As an example, the URL for my targeted post looks like this: benefitsofrunningshoes.com/benefits-of-running-shoes. I have solid, non-spammy content and clean, white-hat earned backlinks directing to that specific page. My concern right now is that the URL looks kinda spammy. The website has been live for about a week and the home page ranks well enough, but my targeted page is nowhere to be found (it does show up if I manually search via the search command "site:benefitsofrunningshoes.com"). I'm wondering if it is acceptable to use the exact keyword in the title tag / page URL if it is also in the domain as an EMD? Should I change the title tag and leave the URL in? Or should I completely change the title tag and URL and 301 redirect to the new page? I appreciate any help!

    | Kusanagi17
    0

  • Now that we can search by image on Google and see every site that is using the same photo, I assume that Google is going to use this as a signal for ranking as well. Is that already happening? I ask because I have sold many photos over the years with first-use only rights, where I retain the copyright. So I have photos on my site that I own the copyright for that are on other sites (and were there first). I am not sure if I should make an effort to remove these photos from my site or if I can wait another couple years.

    | Lina500
    0

  • I just ran the link opportunity option within Site Explorer and it shows that 31 pages currently return a 302 status. Should I try to convert the 302s to 301s? And what is the easiest way to do this? I see several WordPress plugins that claim to do 301 redirects but I don't know which to choose. Any help would be greatly appreciated!

    | vmsolu
    0
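For reference, a sketch of what an explicit 301 looks like in .htaccess (the paths are placeholders). Apache's bare `Redirect` directive defaults to 302, which is one common source of unintended temporary redirects, so stating the status explicitly is the usual fix:

```apache
# Permanent (301) redirect; without the status code, Apache's Redirect
# directive defaults to 302 (temporary)
Redirect 301 /old-page/ https://www.example.com/new-page/
```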

  • Hey guys! Need your many awesome brains. 🙂 This may be a very basic question but am hoping you can help me out with some insights beyond "because Google says it's better". 🙂 I only recently started working with SEO, and I work for a SaaS website builder company that has millions of open/active user sites. For all our user sites' URLs, instead of www.mydomainname.com/gallery or myusername.simplesite.com/about, we use numbers, so www.mysite.com/453112 or myusername.simplesite.com/426521. The Sales manager has asked me to figure out if it will pay off for us in terms of traffic (other benefits?) to change from the number system to the "proper" and right way of setting up these URLs. He's looking for rather concrete answers, as he usually sits with paid search and is therefore used to the mindset of "if we do x it will yield us y in z months". I'm finding it quite difficult to find case studies or other concrete examples beyond the generic, vague implication that it will simply be "better" (for example when looking at SEO checklists and search engine guidelines). Will it make a difference? How so? I have to convince our developers of the importance and priority of this adjustment, or it will just drown in the many projects they already have. So truly, any insights would be very welcome. Thank you!

    | michelledemaree
    2

  • Hi all, When I looked in Google Webmaster Tools today I found, under the menu Google Index > Content Keywords, that the list is full of spammy keywords (e.g. Viagra at no. 1, and stuff like that). Around April we built a whole new website, uploaded a new XML sitemap, and did all the other things Google Webmaster Tools suggests when one is creating a Google Webmaster account. Under the menu "Security Issues" nothing is mentioned. All together I find it hard to believe that the site is hacked - so why is Google finding these content keywords on our site? Should I fear that this will harm my SEO efforts? Best regards, Christian

    | Henrik_Kruse
    0

  • My employer is shifting to a new domain and I am in the midst of doing URL mapping. I realize that many of the meta descriptions and H1 tags are different on the new pages - is this a problem? Thank you.

    | ptapley
    0

  • I was running a report with Screaming Frog SEO Spider and I saw: (Tab) Directives > Noindex: https://compleetverkleed.nl/sitemap_index.xml/ is set via X-Robots-Tag to noindex,follow. Does this mean my sitemap isn't indexed? If anyone has some more tips for our website, feel free to give some suggestions 🙂 (Website is far from complete)

    | Happy-SEO
    2

  • Hi all, I got a big problem with my website. I have a lot of pages - duplicate pages made from various combinations of selects - and for all this duplicate content we were hit by a Panda update 2 years ago. I don't want to bring new content to all of these pages, about 3,000,000, because most of them are unnecessary. Google indexed all of them (3,000,000), and I want to redirect the pages that I don't need anymore to the most important ones. My question: is there any problem in how Google will see this change, given that after it only 5,000-6,000 relevant pages will remain?

    | Silviu
    0

  • Hi Guys, This is a follow up on this thread: http://moz.com/community/q/dynamic-url-parameters-woocommerce-create-404-errors# I would like to know how I can set a canonical link in Wordpress/Woocommerce which points to "View All" on category pages on our webshop.
    The categories on my website can be viewed as 24/48 or All products, but because the quantity constantly changes, viewing 24 or 48 products isn't always possible. To point Google in the right direction I want to let them know that "View All" is the best way to go.
    I've read that Google's crawler tries to do this automatically, but I'm not sure if this is the case on my website. Here is some more info on the issue: https://support.google.com/webmasters/answer/1663744?hl=en
    Thanks for the help! Joost

    | jeeyer
    0

  • I get strange :443 errors in my 404 monitor on Wordpress https://www.compleetverkleed.nl:443/hoed-al-capone-panter-8713647758068-2/
    https://www.compleetverkleed.nl:443/cart/www.compleetverkleed.nl/feestkleding
    https://www.compleetverkleed.nl:443/maskers/ I have no idea where these come from :S

    | Happy-SEO
    2

  • There have been a couple other threads concerning this topic so I apologize, but I have an iteration on the main question that has not been answered. Crawl Diagnostics is giving me a bunch of 302 temporary redirect notices. For example, here is a page title URL:
    http://store.in-situ.com/Rugged-Conductivity-Meter-p/0073380.htm and here is the redirect:
    http://store.in-situ.com/Rugged-Conductivity-Meter-p/tape-clt-meter.htm?1=1&CartID=0 The first link is actually a child product of:
    http://store.in-situ.com//Rugged-Conductivity-Meter-p/tape-clt-meter.htm Volusion tech support told me they believe most of them are meta redirects but could not find any documentation on them. All the other threads concerning this have said to either change the 302s to 301s, which I don't think is possible, or to add a nofollow tag. My question is do I need to do anything if both those pages are canonical to the parent product? Should I be passing on the linkjuice if neither of those pages are of high value?

    | anneoaks
    0

  • One website I manage wants to redirect users to state-specific pages based on their location. What is the best way to accomplish this? For example, a user enters through site.com but they are in Colorado, so we want to direct them to site.com/colorado.

    | Firestarter-SEO
    0
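A minimal sketch of the server-side routing step, assuming the visitor's US state has already been resolved (for example with a GeoIP library such as geoip2); the state names and site paths here are hypothetical:

```python
# Map a resolved US state to its landing page; anything unmapped falls
# back to the generic homepage. Paths are placeholders.
STATE_PATHS = {
    "Colorado": "/colorado",
    "Texas": "/texas",
}

def landing_path(state: str) -> str:
    """Return the state page if one exists, else the homepage."""
    return STATE_PATHS.get(state, "/")

print(landing_path("Colorado"))  # -> /colorado
print(landing_path("Vermont"))   # -> /
```

Geo redirects of this kind are normally served as 302s, so that crawlers (which mostly fetch from US IPs) are not permanently funneled to one region, and users should be able to override the detected location.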

  • We just removed an entire product category on our website (product pages still exist, but will be removed soon as well). Should we be setting up redirects, or can we simply delete this category and product
    pages and do nothing? We just received this in Google Webmaster Tools: Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error. We have not updated the sitemap yet... Would this be enough to do or should we do more? You can view our website here: http://tinyurl.com/6la8 We removed the entire "Spring Planted" category.

    | DutchG
    0

  • Google indexed a bunch of our URL parameters. I'm worried about duplicate content. I used the URL parameter tool in Webmaster Tools to set it so future parameters don't get indexed. What can I do to remove the ones that have already been indexed? For example, site.com/products and site.com/products?campaign=email have both been indexed as separate pages even though they are the same page. If I use a noindex I'm worried about de-indexing the product page. What can I do to deindex just the URL parameter version? Thank you!

    | BT2009
    0
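One common remedy for this situation is a rel=canonical on the parameterized pages pointing at the parameter-free URL, which consolidates the duplicates without risking the product page itself. A small sketch of deriving that canonical server-side (the tracking-parameter names are examples, not an exhaustive list):

```python
# Strip known tracking parameters from a URL to produce the canonical form,
# which would then be emitted as <link rel="canonical" href="...">.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

TRACKING_PARAMS = {"campaign", "utm_source", "utm_medium"}  # example names

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://site.com/products?campaign=email"))
# -> https://site.com/products
```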

  • Hi Guys, My company has a franchise of a foreign company that uses an accent/foreign letter in its brand name. We have to refer to this franchise with this symbol on our website to meet their standards. I've done some research on this but it's not conclusive, so I was wondering whether anyone here can confirm this for me: will using the letter with this symbol impair our rankings for this franchise name? Obviously, as a UK business, people search for this franchise with a regular letter and not the accented one. I would have thought that Google is clever enough to recognise the meaning of the accented letter by now and therefore it wouldn't affect rankings (much). Furthermore, do you think that it would make any difference to use the HTML entity to represent the accent rather than copying and pasting the symbol onto our website? I would've thought this would help Google pick it up, but it might not make a difference anyway! Any help is appreciated. Thanks, Sam

    | Sandicliffe
    1

  • We noticed today that 4 of the top referring sites are actually porn sites. Does anyone know what that is all about? Thanks!

    | thinkcreativegroup
    1

  • Ok, I'm trying to establish some business rules of syntax for SEO-friendly URLs. I'm doing this for an OpenCart online store which uses an SEO-url field to construct the "friendly URLs". The good news is I have total control over the URLs; the bad news is I had to do some tricky Excel work to populate them. That all said, I have a problem with items that have sizes. This is a crafts store, so many of the items are differentiated by size. Examples: sleigh bells come in 1/2", 3/4", 1", 1 1/2", etc. So far I've tried to stay away from the inch mark (") by spelling it out. Right now it's "inch" but could be "in". The numbers, fractions, sizes, etc. create some ghastly friendly URLs. Is there any wisdom or syntax standard out there that would help me? I'm trying to avoid this: www.mysite.com//index.php?route=craft-accessories/bells/sleigh-bells/sleigh-bells-1-one-half-inch-with-loop I realize that the category (sleigh-bells) is repeated in the product name, but there are several 1 1/2" items in the store. Any thoughts would be useful, even if it's links to good SEO sites that have mastered the myriad of issues with dimensions in URLs. Thanks

    | jbcul
    0
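There is no single standard here, but one workable convention is: spell out the inch mark, flatten fractions with hyphens, and lowercase everything. A sketch of that rule as a slug function (the naming choices are illustrative, not an official syntax):

```python
# Turn a sized product name into a URL slug: the inch mark becomes the word
# "inch", and every run of non-alphanumeric characters collapses to a hyphen.
import re

def size_slug(name: str) -> str:
    s = name.lower().replace('"', " inch")
    s = re.sub(r"[^a-z0-9]+", "-", s)  # spaces, slashes, quotes -> hyphens
    return s.strip("-")

print(size_slug('Sleigh Bells 1 1/2"'))  # -> sleigh-bells-1-1-2-inch
```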

  • We're looking to build some serious content and capitalise on long-tail keyword traffic for our sub-category pages; an example targeted keyword is "designer dining tables". Example of current link: www.website.com/designer-furniture/designer-dining-tables.html Would removing the category paths help? Example result: www.website.com/designer-dining-tables More user-friendly URLs and better for SEO, would you suggest? The only problem is, if we removed the paths, would this have a hit on our traffic? Any advice would be much appreciated. We are using the Magento platform.

    | Jseddon92
    0

  • Hi, I have a question regarding job boards. Many job advertisers will upload the same job description to multiple websites e.g. monster, gumtree, etc. This would therefore be viewed as duplicate content. What is the best way to handle this if we want to ensure our particular site ranks well? Thanks in advance for the help. H

    | HiteshP
    0

  • Hi All, For the last two days I have been seeing a very strange keyword appearing in Google Analytics. Why is such a keyword appearing in GA? Any idea? Please see the keyword in the attachment. Thanks

    | Alick300
    0

  • Hi! I'm looking to include rich snippets on some of my product sites, such as price etc. In addition, it would be nice to include our overall ratings (from Trustpilot) on the different pages. 
    However, I've been looking all over and haven't really found a clear answer as to whether this is even in adherence with the Google guidelines. As it is our company overall, and not the specific products, that is being rated, I have done it like this (on product pages): name of organization, rating count: 248, rating value: 8.2, best rating: 10, plus other product-specific information. Would this be against guidelines?

    | eyephone
    0

  • I have a sitemap-index.xml file in the root. I then have several sitemaps linked to from the index in example.com/sitemaps/sitemap1.xml, example.com/sitemaps/sitemap2.xml, etc. I have seen on other sites that for example a sitemap containing blogs where the blogs are located at example.com/blog/blog1/ would be located at example.com/blog/sitemap.xml. Is it necessary to have the sitemap located in the same folder like this? I would like to have all sitemaps in a single sitemap folder for convenience but not if it will confuse search engines. My index count for URLs in some sitemaps has dropped dramatically in Google Webmaster Tools over the past month or so and I'm not sure if this is having an effect. If it matters, I have all sitemap files, including the index, listed in the robots.txt file.

    | Giovatto
    0
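For what it's worth, the sitemaps.org protocol constrains which URLs a child sitemap may list relative to its own location, but submitting sitemaps via robots.txt (as described above) relaxes that restriction for Google, and a sitemap index may reference children anywhere on the same host. A sketch of the index structure (URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- child sitemaps kept together in /sitemaps/ for convenience -->
  <sitemap><loc>https://example.com/sitemaps/sitemap1.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemaps/sitemap2.xml</loc></sitemap>
</sitemapindex>
```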

  • I just recently signed up and joined the moz.com system. During the initial report for our web site it shows we have lots of duplicate content. The web site is real estate based and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not: each has its own photos, description and address. So why do they appear as duplicates? I would assume that they are all too closely related. Lots for sale, primarily - and it looks like lazy agents have 4 or 5 lots and input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site it shows 40 of them are duplicates.

    | TIM_DOTCOM
    0

  • Ok, so I understand Google doesn't use stop words (like "a" or "the"). Therefore, if I am optimizing for a keyword phrase and find an opportunity for, say: "how to create stuff something" But it actually reads better as (although the above doesn't sound completely out of place): "how to create stuff in something" Which is better for SEO? (Ignore usability/readability in your replies please and assume it reads reasonably either way, as that was just an example.)

    | TheWebMastercom
    0

  • For example, if we have a domain www.bobshardware.com.au and we set up a subdomain sydneysupplies.bobshardware.com.au and then brisbanescrewdrivers.bobshardware.com.au and used those in ad campaigns, each subdomain being redirected back to a single page such as bobshardware.com.au/brisbane-screw-drivers etc. Is there a benefit? Cheers

    | techdesign
    0

  • I’m building a WordPress site with Visual Composer and I’ve hit a point where I need to show a totally different section on a mobile compared to a desktop/tablet. My issue/question comes from the fact that both mobile and desktop rows will have the same content as well as H1/H2/H3 tags. From inspecting the elements I see the mobile only rows are hidden until the page size shrinks through being set to 'display: none' in the CSS (standard visual composer way of handling width & responsiveness) How will Google see this in terms of SEO? I don’t want to come across as if I’m cloaking text and H1 tags on the page (I have emailed the visual composer support but wanted to get an external opinion)

    | shloy23-294584
    0

  • Lately my organic traffic has dropped significantly, as well as my AdSense revenue. The Moz report says, for example, my traffic is down 40%, but I am still #1 for that keyword. Also, in the last week, suddenly my number of indexed pages doubled. We had done some page rewriting and maybe messed that up. We've fixed that though. Webmaster Tools is still picking up all of our old pages and the new ones. Background: We launched our new responsive website in March. March income was about the same as February. April dropped off suddenly (maybe late March - not sure). When we changed the site, we did do 301s for all the old pages to the new ones. Any ideas or advice as to why my traffic and revenue have dropped off so sharply? Never submitted questions before - not sure if I am supposed to put URLs here, so if you just google Home Spelling Words - that's my website. Thanks everyone!!!

    | kimtastic
    0

  • Hey guys, I've got kind of a strange situation going on and I can't seem to find it addressed anywhere.  I have a site that at one point had several development sites set up at subdomains.  Those sites have since launched on their own domains, but the subdomain sites are still showing up in the Google index.  However, if you look at the cached version of pages on these non-existent subdomains, it lists the NEW url, not the dev one in the little blurb that says "This is Google's cached version of www.correcturl.com."  Clearly Google recognizes that the content resides at the new location, so how come the old pages are still in the index?  Attempting to visit one of them gives a "Server Not Found" error, so they are definitely gone. This is happening to a couple of sites, one that was launched over a year ago so it doesn't appear to be a "wait and see" solution. Any suggestions would be a huge help.  Thanks!!

    | SarahLK
    0

  • Hi there, We have a company website based on WordPress. I just noticed that under Settings > Permalinks I can configure the look of the URLs and even remove the trailing slash. We have about 200-300 pages online. If I remove the trailing slash now, will that negatively impact our SEO in any way for existing pages? Thanks!

    | Amr-Haffar
    0
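If the permalink structure is changed, WordPress itself normally 301-redirects the old slashed URLs to the new form via its canonical redirect. For reference, an equivalent .htaccess sketch in case the redirect has to be done at the server level (the directory check avoids breaking real folders):

```apache
RewriteEngine On
# Don't touch real directories
RewriteCond %{REQUEST_FILENAME} !-d
# 301 any URL ending in a slash to its slashless twin
RewriteRule ^(.*)/$ /$1 [R=301,L]
```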

  • I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at this point. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging when a sitemap file the tool is working on becomes too large. So basically, the second one looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but this is not needed, since this comes with both of the above-mentioned tools. I know about DeepCrawl.com also, but this one is paid, and it would be very expensive with this amount of pages and websites (5 million URLs is $1,750 per month; I could get a better deal on multiple websites, but this obviously does not make sense to me - it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites. So, in general, what is the best way to work on something like this that is also time-efficient? Are there any other options? Thanks.

    | blrs12
    0

  • Our web development team have changed our domain prefix from www to non www due to a server change. Our SSL certificate would not be recognised under www and would produce a substantial error message when visiting the secure parts of our website. To prevent issues with old links they have added a permanent 301 redirect from www. to non www. urls until our sitemap catches up. Would this impact our SEO efforts or would it have no impact as a redirect has been placed? Thanks

    | Jseddon92
    0

  • Our products have about 4 PDFs apiece, which really inflates our indexed pages. I was wondering if I could add a noindex directive to the PDFs' URLs? All of the files are on a file server, so they are embedded with links on our product pages. I know I could add a nofollow attribute, but I was wondering if anyone knew whether the noindex would work the same, or if that is even possible. Thanks!

    | MonicaOConnor
    0
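PDFs can't carry a meta robots tag, but the same effect is available via an X-Robots-Tag response header set on the file server. A sketch for Apache, assuming mod_headers is enabled:

```apache
<FilesMatch "\.pdf$">
  # Keep PDFs out of the index while leaving the links to them intact
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```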

  • I have a link directory called Liberty Resource Directory. It's the main site on my dedicated IP; all my other sites are addon domains on top of it. While exploring the new Moz spam ranking I saw that LRD (Liberty Resource Directory) has a spam score of 9/17 and that Google penalizes 71% of sites with a similar score. Fair enough - thin content, bunch of follow links (there are over 2,000 links by now), no problem. That site isn't for Google, it's for me. Question: does that site (and linking to my own sites on it) negatively affect my other sites on the same IP? If so, by how much? Does a simple noindex fix that potential issue? Bonus: How does one go about going through hundreds of pages with thousands of links, built with raw, plain-text HTML, to change things to nofollow? =/

    | eglove
    0

  • I have built a new website and have redirected all my old URLs to their new ones, but for some reason Google is still indexing the old URLs. Also, the page authority for all of my pages has dropped to 1 (apart from the homepage), but before they were between 12 and 15. Can anyone help me with this?

    | One2OneDigital
    0

  • I work for a travel site and we have pages for properties in destinations, and am trying to decide how best to organize the URLs. Basically we have our main domain, resort pages, and we'll also have articles about each resort, so the URL structure will actually get longer:
    A. Keep the location and keyword folders separate:
    domain.com/main-keyword/state/city-region/resort-name
    e.g. domain.com/family-condo-for-rent/orlando-florida/liki-tiki-village
    and domain.com/main-keyword/state/city-region/resort-name/feature
    e.g. domain.com/family-condo-for-rent/orlando-florida/liki-tiki-village/kid-friend-pool
    B. Remove the location and keyword folders and combine them. Note that some of the resort names are long and spaces are being replaced dynamically with dashes:
    domain.com/main-keyword-in-state-city/resort-name
    e.g. domain.com/family-condo-for-rent-in-orlando-florida/liki-tiki-village
    and domain.com/main-keyword-in-state-city/resort-name-feature
    e.g. domain.com/family-condo-for-rent-in-orlando-florida/liki-tiki-village-kid-friend-pool
    Question: is that too many folders, or should I combine or break them up? What would you do with this? Trying to avoid too many dashes.

    | Vacatia_SEO
    0

  • Occasionally I see our 'listings' on Google where the title line shows up with dashes, like sony-professional-hard-drive - TapeandMedia.com. It appears to be the URL shortened and rehashed. This example was after I searched for "Sony PSZ-HA1T" without the quotes. The title for this page is <title>Sony 1TB Professional Portable External Hard Disk Drive (PSZ-HA1T)</title> and the URL is http://www.tapeandmedia.com/sony-1tb-professional-portable-hard-drive.asp Link to image: http://i.imgur.com/FmvAn6c.jpg Other searches (like "Sony 1tb PSZ-HA1T") yield normal-looking SERP titles. Does anyone know why this happens and what I can do to avoid it?

    | BWallacejr
    0

  • Let's say you've got a website and it had quite a few pages that, for lack of a better term, were like an infomercial: 6-8 pages on slightly different topics all essentially saying the same thing. You could all but call it spam. www.site.com/page-1 www.site.com/page-2 www.site.com/page-3 www.site.com/page-4 www.site.com/page-5 www.site.com/page-6 Now you've decided to consolidate all of that information into one well-written page, and while the previous pages may have been a bit spammy, they did indeed have SOME juice to pass through. Your new page is: www.site.com/not-spammy-page You then 301 redirect the previous 'spammy' pages to the new page. Now the question: do I immediately re-submit an updated XML sitemap to Google, which would NOT contain all of the old URLs, thus making me assume Google would miss the 301 redirect/SEO juice? Or do I wait a week or two, allow Google to re-crawl the site and see the existing 301s, and once they've taken notice of the changes submit an updated sitemap? Probably a stupid question, I understand, but I want to ensure I'm following best practices given the situation. Thanks guys and girls!

    | Emory_Peterson
    0
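As an aside, the consolidation described above is just one permanent redirect per retired page, all pointing at the new URL. A small sketch that generates those .htaccess lines (paths are taken from the example in the question):

```python
# Generate one "Redirect 301" rule per retired page, all pointing at the
# consolidated page from the example above.
old_pages = [f"/page-{i}" for i in range(1, 7)]

def redirect_rules(paths, target="/not-spammy-page"):
    return [f"Redirect 301 {path} {target}" for path in paths]

for rule in redirect_rules(old_pages):
    print(rule)
# first line -> Redirect 301 /page-1 /not-spammy-page
```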

  • Hi, I was wondering what the best practice is to preserve link juice when redirecting all the pages of your website to a coming-soon page. The coming-soon page will be at domain.com, not in a subfolder. Should I move the entire website to a subfolder and redirect this folder to the coming-soon page? Thanks

    | bigrat95
    0

  • Hi, I have a webshop with products in different sizes and colours. For each item I have a different URL with almost the same content (title tag, product descriptions, etc). In order to prevent duplicate content I'm wondering what the best way to solve this problem is, keeping in mind: - It's impossible to create one page/URL for each product with filters on colour and size - It's impossible to rewrite the product descriptions to be unique I'm considering the option to canonicalize the rest of the colour/size variations, but the disadvantage is that in case the product is not in stock it disappears from the website. Looking forward to your opinions and solutions. Jeroen

    | Digital-DMG
    0
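For reference, the canonicalization discussed above is a single line in the <head> of each colour/size variant page, all pointing at one representative product URL (the URL below is a placeholder):

```html
<!-- On every colour/size variant of the same product -->
<link rel="canonical" href="https://example.com/product-name/" />
```

Note that a canonical is a hint to search engines, not a redirect: the variant pages stay reachable for users, so an out-of-stock representative variant is a merchandising question rather than a crawling one.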

  • We have a client whose website was hacked, and some troll created thousands of viagra pages, which were all indexed by Google. See the screenshot for an example. The site has been cleaned up completely, but I wanted to know if anyone can weigh in on how we can clean up the Google index. Are there extra steps we should take? So far we have gone into Webmaster Tools and submitted a new sitemap.

    | yoursearchteam
    0

  • Hey there guys, I have heard some recent information from some experts that utilizing commas in headings, meta titles or descriptions is not good for ranking. Can you guys please shed some light on this? Thank you!

    | MrGlobalization
    0

  • Screaming Frog is showing a 503 code for images. If I go and use a header checker like SEOBook it shows 200. Why would that be? Here is an example link- http://germanhausbarn.com/wp-content/uploads/2014/07/36-UPC-5145536-John-Deere-Stoneware-Logo-Mug-pair-25.00-Heavy-4-mugs-470x483.jpg

    | EcommerceSite
    0

  • I have a site with woocommerce. Do I need to block the cart page?

    | EcommerceSite
    0
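If you do decide to block it, a common sketch for the default WooCommerce slugs looks like the following (these are crawl directives only, not index-removal, and the paths assume WooCommerce's standard page setup):

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
```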

  • I've been pushing hard to get our Americas site (DA 34) integrated with our higher domain authority (DA 51) international website. Currently our international website is set up in the following format... website.com/us-en/ website.com/fr-fr/ etc... The problem that I am facing is that I need my development framework installed in its own directory. It cannot be at the root of the website (website.com) since that is where the other websites (us-en, fr-fr, etc.) are being generated from. Though we will have control of /us-en/ after the integration, I cannot use that as the website's main directory, since the Americas website is going to be designed for scalability (eventually adopting all regions and languages), so it cannot be region-specific. What we're looking at is website.com/[base]/us-en. I'm afraid that if base has any length to it in terms of characters it is going to dilute the SEO value of whatever comes after it in the URL (website.com/[base]/us-en/store/product-name.html). Any recommendations?

    | bearpaw
    0

  • Hello, We have a site that is built as an AJAX application. We include the meta fragment tag in order to get a rendered page from PhantomJS. The URL that is rendered to Google from PhantomJS is then www.oursite.com/?escaped_fragment= In the SERP, Google of course doesn't include the hashbang in the URL. So my question: with this setup, do I still need a canonical tag, and if I do, should the canonical tag be the escaped-fragment URL or the regular URL? Much appreciated!

    | RevanaDigitalSEO
    0

  • If you do a search for my name on Google, the first result is the author archive page of my Wordpress blog. I would like to redirect the author page to my  "about me" page but cannot add a 301 as the author page is created dynamically in Wordpress. Anyone know how I can do this?

    | richdan
    0
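Since the author archive is generated dynamically, a pattern-based rule in .htaccess can still catch it; a sketch assuming the default WordPress author permalink structure and a hypothetical "about-me" slug:

```apache
# 301 all author-archive URLs (e.g. /author/richdan/) to the About page
RedirectMatch 301 ^/author/.* https://example.com/about-me/
```

Many SEO plugins can also add per-URL redirects without touching .htaccess, which achieves the same thing.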

  • This has stumped the WordPress staff and people in the Google Webmasters forum. We are in Google News (and have been for years), so new posts are crawled immediately. On Feb 17-18 crawl stats dropped 85%, and new posts were no longer indexed (not appearing in News or search). Data Highlighter attempts return "This URL could not be found in Google's index." No manual actions by Google. No changes to the website; no custom CSS. No site errors or new URL errors. No sitemap problems (resubmitting didn't help). We're on wordpress.com, so no odd code. We can see the robots.txt file. Other search engines can see us, as can social media websites. Older posts still index, but the loss of News is a big hit. Also, I think overall Google referrals are dropping. We can Fetch the URL for a new post, and many hours later it appears on Google and News, and we can then use Data Highlighter. It's now 6 days with no recovery. Everybody is stumped. Any ideas? I just joined, so this might be the wrong venue. If so, apologies.

    | Editor-FabiusMaximus_Website
    0

  • Hi guys! I keep reading conflicting information on this and it's left me a little unsure. Am I right in thinking that a website with a subdomain of shop.sitetitle.com will share the same robots.txt file as the root domain?

    | Whittie
    0

  • Hello Moz community, I'd like to know if you build a "pre-sales SEO audit" when selling your services to a prospect. I think the main ideas of a pre-sales audit are to show your prospect that you understand their industry (trends & competition), understand the opportunities, and know the roadblocks on their website. If so, I'd be interested in discussing the information you put into your pre-sales audit and how you organise it. If you know resources I should read regarding mini SEO audits / pre-sales SEO audits, just paste the link 🙂 Thanks for your answers

    | Sindicic_Alexis
    0
