Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi everyone, I have a global client with lots of duplicate page issues, mainly because they maintain duplicate pages for the US, UK and AUS. They do this because they don't offer all services in all markets and, of course, want to show local contact details on each version. What is the best way to handle this for SEO, as clearly I want the local pages to rank in each country? Cheers

    | Algorhythm_jT
    0

  • Hello, Currently we have a .com English website serving an international clientele, and we do not currently target any countries in Google Search Console. However, the UK is an important market for us and we are seeing very low traffic (almost entirely US). We would like to increase visibility in the UK, but currently for English speakers only. My question is this: would geo-targeting a subfolder have a positive impact on visibility/rankings, or would it create a duplicate content issue if both pieces of content are in English? My plan was: 1. Create a geo-targeted subfolder (website.com/uk/) that copies our website (we currently cannot create new unique content). 2. Go into GSC and geo-target the folder to the UK. 3. Add the following to the /uk/ page to try to negate duplicate issues. Additionally, I can add a rel=canonical tag if suggested; I just worry that, as an already international site, this will create competition between pages. However, as we are currently only targeting a location and not a language at this very specific point, would adding a ccTLD be advised instead? The threat of duplicate content worries me less here, as this is a topic Matt Cutts has addressed and said is not an issue. I prefer the subfolder method to ccTLDs because it allows for more scalability, as in the future I would like to target other countries and languages. Ultimately, right now the goal is to increase UK traffic. Outside of UK backlinks, would any of the above URL geo-targeting help drive traffic? Thanks

    | Tom3_15
    0
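
Assuming the tag stripped from the post above was an hreflang annotation, a sketch of the usual markup for two same-language country versions (website.com is the poster's placeholder, and the annotations must appear on both versions reciprocally):

```html
<!-- On https://website.com/ (default/US English version) -->
<link rel="alternate" hreflang="en-us" href="https://website.com/" />
<link rel="alternate" hreflang="en-gb" href="https://website.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://website.com/" />

<!-- The same three tags also go on https://website.com/uk/ -->
```

With valid reciprocal hreflang, Google treats the two English versions as alternates rather than duplicates, so a cross-folder canonical would not normally be needed.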

  • Hello, I have a question concerning maintenance and pruning content on a large site that has a ton of pages that are either expired or recurring. Firstly, there are ~12,000 pages on the site. Large sections of the site have individual landing pages for time-sensitive content, such as promotions and shows. They have TONS of shows every day, so the number of pages to manage keeps exponentially increasing. Show URLs: I'm auditing the show URLs and looking at pages that have backlinks. Those, I am redirecting to the main show pages.
    -However, there is a significant number of show URLs from a few years ago (2012-2015) that DON'T get traffic or have any backlinks or ranking keywords. Can I delete these pages entirely from the site, or should I go through the process of 410-ing them (and then deleting them, or can you let 410s sit)? They are in the XML sitemap right now, so they get crawled, but are essentially useless. I want to cut off the dead weight, but I'm worried about deleting a large number of pages from the site at once. For show URLs that are obsolete but still rank well for keywords and get some traffic, is there any recommended option? Should I bother adding them to a past-shows archive section, or not, since they bring in a LITTLE traffic? Or axe them, since it's such a small amount of traffic compared to what the main pages get? There are also URLs that are orphaned and obsolete right now but will recur. For instance, when an artist performs, they get their own landing page; they may acquire some backlinks and rank, but then that artist doesn't come back for a few months. The page just sits there, orphaned and in the XML sitemap. However, regardless of backlinks/keywords, the page will come back eventually. Is there any recommended way to maintain this kind of situation? Again, there are a LOT of URLs in this same boat. Promotional URLs: I'm going through the same process for promotions and, thankfully, the scale of the issue is much smaller. However, same question as above: they have some promotional URLs, like NYE Special Menu or Lent Specials landing pages for each of their restaurants. These pages are only valid for a short amount of time each year and are otherwise obsolete. I want to reuse the pages each year, though, and don't want them to just sit there in the XML sitemap. Is there ever an instance where I might want to 302 redirect them, and then remove the 302 for the short amount of time they are valid?
    I'm not AS concerned about the recycled promotional URLs; there are far fewer URLs in that category. However, as you can probably tell, this large site has this problem of recurring content throughout, and I'd like to get a plan in place to clean it up and then create rules to maintain it. Promotional URLs that recur are fewer, so if they are orphaned it's not the end of the world, but there are thousands of show URLs with this issue, so I really need to determine the best play here. Any help is MUCH appreciated!

    | triveraseo
    0

  • Hello, which is better for ranking with a 40 DA domain: 301 redirecting the domain to my website, or hosting the domain and creating posts with links to my website? If I do the 301 redirect, will the crawl errors of the old 40 DA domain show up on my new website or not? Also, how many links can I get from one PBN website,
    and which is better: getting links to the home page or to posts? Best regards,

    | cristophare79
    0

  • Hello all, So I'm doing some technical SEO work on a client website and wanted to crowdsource some thoughts and suggestions. Without giving away the website name, here is the situation: The website has a dedicated /resources/ page. The bulk of the Resources are industry definitions, all encapsulated in colored boxes. When you click on a box, the definition opens in a lightbox with its own unique URL (e.g. /resources/?resource=augmented-reality). The information for these colored lightbox definitions is pulled from a normal resources page (e.g. /resources/augmented-reality/). Both of these URLs are indexed, leading to a lot of duplicate indexed content. How would you approach this? **Things to consider:** The website is built on WordPress with a custom theme.
    -I have no idea how to even find settings for the lightbox (will be asking the client today).
    -Right now my thought is to simply disallow the lightbox URL in robots.txt and hope Google will stop crawling it and eventually drop it from the index.
    -I've considered adding a canonical on the lightbox URL pointing to the main resource page, but the lightbox appears to be dynamically created, so there is no obvious place to add it (outside of FTP, I imagine?). I'm most rusty with stuff like this, so I figured I'd appeal to the masses for some assistance. Thanks! -Brad

    | Alces
    0

  • Just a quick question re implementation of JSON-LD breadcrumbs. You are here: Acme Company → Electronics → Computers → Laptops. In this example, Laptops is my current page, shown without a link in the visible on-page breadcrumb. When implementing JSON-LD BreadcrumbList, should Laptops be included in the schema snippet, or should it commence from Computers back to home?

    | MickEdwards
    0
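
For reference, Google's breadcrumb structured-data documentation shows the current page included as the final ListItem, with its `item` URL optional on that last entry. A sketch for the trail in the question (the example.com URLs are hypothetical):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Acme Company", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Electronics", "item": "https://www.example.com/electronics/" },
    { "@type": "ListItem", "position": 3, "name": "Computers", "item": "https://www.example.com/electronics/computers/" },
    { "@type": "ListItem", "position": 4, "name": "Laptops" }
  ]
}
```

So the schema can include Laptops even though the visible breadcrumb leaves it unlinked; the Rich Results Test will accept either form.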

  • Hey guys, I've been having an issue for the past few months: I keep getting "/feed" broken links in Google Search Console (screenshot attached). The site is a WordPress site using the Yoast SEO plugin for on-page SEO and the sitemap. Has anyone else experienced this issue? Did you fix it? How should I redirect these links?

    | Extima-Christian
    0
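
WordPress generates /feed endpoints automatically for posts and archives, which is the usual source of these GSC reports. If redirecting them is the chosen fix, a hedged .htaccess sketch (note the caveat in the comment):

```apache
# 301 any URL ending in /feed or /feed/ to its parent page.
# Caveat: this disables ALL WordPress feeds, including the main RSS feed,
# so only use it if no feed output is wanted at all.
RedirectMatch 301 ^/(.+)/feed/?$ /$1/
```

An alternative is simply leaving the feeds in place; they return valid content and GSC reports on them are usually harmless.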

  • Has anyone used Botify? Is this type of software necessary for a site with under 5K pages?

    | SoulSurfer8
    0

  • SEO/Moz newbie here! My organisation's website (dyob.com.au) uses an API integration to pull through listings that are shown in the site search. There is a high volume of these, all of which contain only a title, image and contact information for the business. I can see these pages coming up in my Moz account with issues such as duplicate content (even though they are different) or missing descriptions. We don't have the capacity to fill these pages with content. Here's an example: https://www.dyob.com.au/products/nice-buns-by-yomg I am looking for a recommendation on how to treat these pages. Are they likely to be hurting the site's SEO? We do rank for some of these pages. Should they be noindex pages? TIA!

    | monica.arklay
    0

  • Hi all, We have our docs in a subdomain on GitBook and are looking to move them into a subfolder. GitBook offers the ability to redirect via CNAME: https://gitbookio.gitbooks.io/documentation/platform/domains.html I am getting conflicting information on whether this will work, or whether it will cause duplicate content and hurt our SEO.

    | kate.hassey
    0

  • My website is being indexed under three different versions: https with www, https without www, and the bare domain. For example: https://www.example.com, https://example.com and example.com are all being indexed. How would I begin resolving this? Through my hosting?

    | DigitalRipples
    0
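
Fixing this usually happens at the server level rather than through the host company as such: one 301 rule collapses every variant onto a single canonical origin, and the stray versions drop out of the index as they are recrawled. A sketch assuming Apache with mod_rewrite (www + https chosen as canonical; example.com is a placeholder):

```apache
RewriteEngine On
# If the request is plain http OR the host is missing www,
# 301 to the https://www version of the same path.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
```

Pairing the redirect with self-referencing canonical tags on the chosen version speeds up consolidation.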

  • Hi all, I hope somebody can share their thoughts on the below. A web designer launched my client's new website and I have been tasked with the SEO. I was approached with an immediate problem: www.clientswebsite.co.uk was ranking 9th for the company name after being indexed by Google. The search results above www.clientswebsite.co.uk were mostly related to my client, but not all; for example, a direct competitor was also ranking. I have been working on the SEO for 2-3 weeks and had just managed to reach 3rd position for the company name, and then www.clientswebsite.co.uk disappeared from page 1! Now, instead, an irrelevant sub-page (a contact page) is ranking for the company name on page 2. I have checked, and the home page is still indexed (did a site: check). The only problem software picks up is a redirect chain (http://homepage -> http://www.homepage -> https://homepage); the web developers said it wouldn't impact rankings when I asked them to edit the htaccess file to fix it. I've listed below the SEO tasks I completed whilst attempting to rank for the company name:
    -Set up Analytics and Webmaster Tools, in which I set the preferred domain (www)
    -Added a sitemap
    -Edited meta data, making sure the company name was included
    -Contacted the relevant websites ranking above www.clientswebsite.co.uk and asked them to link to the new website; I was successful with a couple of these
    -Placed www.clientswebsite.co.uk on all of their social media profiles
    -Reformatted headers on the home page, making sure the H1 included my client's company name
    -Found 2 extra versions of my client's home page (not exact copies, but very similar content) that had been published, so I 301 redirected these to the correct home page
    -Activated SSL and forced HTTPS
    I would really appreciate it if anyone could share their thoughts here, whether explanations or possible solutions. Adam

    | SO_UK
    0

  • My site is GiftaLove.com. Desktop version: https://www.giftalove.com/
    Mobile version: https://m.giftalove.com/ How do I enable mobile-first indexing for the desktop and mobile versions of the site? We have not received any notification for either version. Please help resolve my issue.

    | Packersmove
    0

  • Why does Google's search results display my home page instead of my target page?

    | h.hedayati671236541
    0

  • Hi, My e-commerce site is https://www.giftalove.com/ For SEO, what is the best way to handle out-of-stock products?

    | Packersmove
    0

  • Hi all, I have a question regarding changing the size of my sitemaps. Currently I generate sitemaps in batches of 50k. A situation has come up where I need to change that size to 15k in order to be crawled by one of our licensed services. I haven't been able to find any documentation on whether or not changing the size of my sitemaps (but not the pages included in them) will affect my rankings negatively or my SEO efforts in general. If anyone has any insights or has experienced this with their site, please let me know!

    | Jason-Reid
    0
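
For what it's worth, Google's sitemap documentation specifies only maximums (50,000 URLs or 50MB uncompressed per file); there is no documented ranking effect tied to batch size. Smaller 15k batches simply mean more files, which are conventionally tied together with a sitemap index. A sketch (example.com is hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-1.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-2.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-3.xml</loc></sitemap>
  <!-- one entry per 15k-URL batch -->
</sitemapindex>
```

Submitting just the index file to GSC keeps the batching change invisible to search engines, as long as the full URL set remains covered.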

  • We changed our primary domain from vivitecsolutions.com to vivitec.net. Google is indexing our new domain, but still has our old domain indexed too. The problem is that the old site is timing out because of the https. Thoughts on how to make the old indexing go away, or how to properly forward the https version?

    | AdsposureDev
    0

  • Hi all! Currently we are implementing the hreflang tag, and I'm not really sure how to solve this: We sell our products in the Netherlands and Belgium. For the Netherlands we have 1 category page for pebbles (stones), which contains both rounded and non-rounded pebbles. In the Netherlands there is not really a difference between them (people search for pebbles and that's it). The URL: https://www.website.com/nl/pebbles. In Belgium there is a difference (people specifically search for rounded/non-rounded pebbles). Therefore, in Belgium we have 2 pages (we don't have an overall page): https://www.website.com/be/pebbles-rounded
    and https://www.website.com/be/pebbles-non-rounded. My question now is, what to do with the hreflang tags on these pages? Thanks in advance! Best, Remco

    | AMAGARD
    0
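
One constraint worth knowing here: within a set of alternates, each hreflang value can point to only one URL, so the single NL page can be paired with only one of the two BE pages. A hedged sketch that picks the rounded page as the nl-be counterpart (website.com is the poster's placeholder; the choice of which BE page to pair is an assumption):

```html
<!-- On https://www.website.com/nl/pebbles -->
<link rel="alternate" hreflang="nl-nl" href="https://www.website.com/nl/pebbles" />
<link rel="alternate" hreflang="nl-be" href="https://www.website.com/be/pebbles-rounded" />

<!-- On https://www.website.com/be/pebbles-rounded (tags must be reciprocal) -->
<link rel="alternate" hreflang="nl-be" href="https://www.website.com/be/pebbles-rounded" />
<link rel="alternate" hreflang="nl-nl" href="https://www.website.com/nl/pebbles" />

<!-- /be/pebbles-non-rounded has no NL equivalent, so it carries no hreflang
     pairing with the NL page and simply ranks on its own in Belgium. -->
```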

  • Hi, I want to redirect this old link http://www.g-store.gr/product_info.php?products_id=1735/ to this one: https://www.g-store.gr/golf-toualetas.html I have made several attempts but with no result. If anyone can help, I would appreciate it. My website runs on an Apache server with cPanel. Thank you

    | alstam
    0
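
A likely reason simple Redirect attempts fail on a URL like this: the `products_id=1735/` part is a query string, and Apache's Redirect/RedirectMatch directives match only the path. A mod_rewrite sketch for .htaccess (assuming the trailing slash really is part of the query string, as posted):

```apache
RewriteEngine On
# Match the query string separately, then append "?" to the target
# to strip the old query string from the redirected URL.
RewriteCond %{QUERY_STRING} ^products_id=1735/?$
RewriteRule ^product_info\.php$ https://www.g-store.gr/golf-toualetas.html? [R=301,L]
```

The same pattern can be repeated (or generalized with a RewriteMap) for other old product IDs.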

  • I analysed my birthday page, https://www.giftalove.com/birthday, with Compare Link Profiles and found 47,234 total internal links. How did my internal links suddenly increase? Please provide details about my internal links.

    | Packersmove
    0

  • Hi, how do you find WHOIS ownership details for a .co domain? Are they in the standard WHOIS system or a different one? All Best, Dan

    | Dan-Lawrence
    0

  • When you search for software-related terms, Google serves a carousel-style item above the organic listings. I've been assuming this is part of the Google Knowledge Graph. See http://imgur.com/a/TQpM5 The mystery lies in the selection of companies included in this carousel: the brands included are completely unrelated to the brands that rank well in the "normal" organic search listings. Giving that more color, here are the top 5 brands in normal organic search listings for the query "invoice software," noting where they appear in the carousel:
    -Wave: 1st organic, not present in carousel
    -Quickbooks: 2nd organic, 1st in carousel
    -NCHSoftware: 3rd organic, not present in carousel
    -Invoicely: 4th organic, not present in carousel
    -Freshbooks: 5th organic, not present in carousel
    Inversely, here are the top 5 in the carousel, noting where they appear in the normal organic listings, again for "invoice software":
    -Quickbooks: 1st in carousel, 2nd organic
    -Zoho Office: 2nd in carousel, 9th organic
    -Invoicera: 3rd in carousel, 15th organic
    -Hiveage: 4th in carousel, 30th organic
    -Apptivo: 5th in carousel, not present in the first 10 pages of search results
    Here's an annotated image of the same thing: http://imgur.com/a/0Pa6j
    I represent a brand (I'd rather not mention which) that ranks very well in the normal organic search listings, but doesn't rank accordingly in the knowledge graph carousel. WHAT WE'VE TRIED: implemented Schema markup, updated Wikidata information, updated Wikipedia information. We've reviewed the code on the pages that DO appear in the carousel and haven't found any common threads. Several weeks have passed since those initiatives; we've been re-indexed in that time, but are still not included in the carousel. I've seen people suggest resources like this:
    https://searchenginewatch.com/sew/how-to/2299454/4-google-carousel-optimization-tips
    but those tips don't address this problem. Also, things like claiming your business or other local business validation (which we've done) don't seem material to a non-local product search (but stranger things have happened, I know...). A couple of SEOs have suggested AMP, but I've yet to see any authoritative info suggesting a link between AMP and desktop results. Any help appreciated! I will report back on my findings. This is a follow-up to a question I previously posted here: https://moz.rankious.com/_moz/community/q/kinds-of-organic-search-results-google

    | RobM416
    0

  • Fair warning, this is going to be long, but necessary to explain the situation and what has been done. I will take ANY suggestions, even if I have tried them already. We have a sister site in Australia, targeting Australian traffic. I have inherited what seems to be an incredible rat's nest. I've fixed over two dozen issues, but still haven't seemed to address the root cause. NOTE: Core landing pages have weak keyword targeting; I don't expect much here until I fix this. The main issues I'm trying to resolve first are the unusual US-based targeting and the inability of the homepage to rank for anything. The site is www[dot]castleford[dot]com[dot]au. Here's the rundown on what's going on. Problems:
    -The site ranks for four times as many keywords in the US as it does in Australia.
    -The site ranks on the first page for a grand total of 5 AU keywords.
    -The homepage, while technically optimized on-page for "content marketing agency", and with content through MarketMuse, has historically ranked between 60-100, despite a fairly strong DA and fairly weak competitors based on Ahrefs and Moz keyword difficulty. Oddly, the ranking has risen to 5-7 for three-day spurts over the past year.
    -Infrequent indexing of the homepage (used to be every 2-3 weeks; I've gotten that down to 1 week).
    Sequence of events:
    -November 2017: they made some changes to their URLs, some on the blog and some on the top-nav landing pages. Redirects seem okay.
    -November 2017: substantial number of lost referring domains; not many seem to be quality.
    -January 2018: total number of AU ranking keywords more than halved.
    -May/June 2018: added a followed sitewide inbound link from an external site they created: 20k inbound links with the same anchor text to the homepage, on a site with 24k total inbound links.
    -July-September 2018: total number of US ranking keywords halved.
    -November 10: I walked into this mess.
    What's been done:
    -Reduced page load time substantially (it was around 20 seconds).
    -Created a sitemap (100-entry batching) and submitted it to GSC.
    -Improved the MarketMuse score for the homepage.
    -Changed the language from "en-US" to "en-AU".
    -Fetch and render: content is all crawlable and indexed properly.
    -Changed site architecture for top-nav core landing pages to establish a clear hierarchy.
    -Created all versions of the site in GSC: non-www and www http, and non-www and www https.
    -Site crawl: normal amount of 404s, nothing stands out as substantial.
    -http to https redirect okay.
    -Robots.txt updated and okay.
    -Checked GSC international targeting; confirmed AU.
    -No manual links penalty.
    I'm clearly stumped and could use some insights. Thanks to everyone in advance, if you can find time.

    | Brafton-Marketing
    0

  • Hi All, I've got an issue with multi-region contact pages. For example, Google favours the UAE and other regions' contact pages for French-region searches, when I only want /fr/contact to appear there. I've used rel=canonical and set up the website to point to the correct regions.

    | WattbikeSEO
    0

  • Hey, We recently launched a US version of a UK-based ecommerce website on the us.example.com subdomain. Both websites are on Shopify, so canonical tags are handled automatically, and we have implemented hreflang tags across both websites. Suddenly our rankings in the UK have dropped, and after looking in Search Console for the UK site I've found that a lot of pages are no longer indexed in Google because the user-declared canonical is the hreflang URL for the US version. Below is an example. https://www.example.com/products/pac-man-arcade-cabinet is the product page. Its hreflang tags are: rel="alternate" href="https://www.example.com/products/pac-man-arcade-cabinet" hreflang="en-gb" (the UK tag) and rel="alternate" href="https://us.example.com/products/pac-man-arcade-cabinet" hreflang="en-us" (the US tag). Then, in Google Search Console, the user-declared canonical is https://us.example.com/products/pac-man-arcade-cabinet, but it should be https://www.example.com/products/pac-man-arcade-cabinet. The UK website has been assigned to target the United Kingdom in Search Console and the US website has been assigned to target the United States. We also do not have access to the robots.txt file, unfortunately. Any help or insight would be greatly appreciated.

    | PeterRubber
    0

  • I see in a forum posting from 2016 that Squarespace had issues with adding custom code via body tags, and I'm trying to troubleshoot some schema I've added via GTM, using JSON-LD and Yoast's converter tool, on a Squarespace website. Is the general consensus still to add the JSON-LD script directly into the head? And if so, where?

    | ogiovetti
    1

  • Hello, I have a broken plugin creating hundreds of wp-content directory pages that are being indexed by Google. I cannot access the source code of these pages to add a noindex tag to them. The page URLs all contain the plugin name. In order to resolve the issue, I wrote a JavaScript solution to dynamically add a noindex tag to any URL containing the plugin name. Would this noindex be respected by Google, and is there a way to immediately check that it is respected? Currently I cannot delete the plugin due to issues with its PHP. If you would like to view the code: https://codepen.io/trodrick/pen/Gwwaej?editors=0010 Thanks!

    | Tom3_15
    0
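
On the checking part: Google does honor a robots meta tag injected by JavaScript, but only once the page is rendered, so there is a delay versus a server-side tag; the URL Inspection tool's rendered HTML is the quickest way to confirm the tag appears. A self-contained sketch of the injection approach (the plugin slug is hypothetical; the poster's actual code is in the linked CodePen):

```javascript
// Hypothetical slug of the broken plugin; appears in every affected URL path.
const PLUGIN_SLUG = 'broken-plugin';

// Decide from the URL path alone whether this page should be noindexed.
function shouldNoindex(url, slug) {
  return new URL(url).pathname.includes(slug);
}

// In a browser context, inject <meta name="robots" content="noindex">
// into the head of any matching page.
if (typeof document !== 'undefined' && shouldNoindex(window.location.href, PLUGIN_SLUG)) {
  const meta = document.createElement('meta');
  meta.name = 'robots';
  meta.content = 'noindex';
  document.head.appendChild(meta);
}
```

Keeping the match on the pathname (rather than the full URL) avoids accidentally noindexing pages that merely mention the slug in a query string.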

  • Google Search Console is showing some pages as "Submitted URL has crawl issue", but they look fine to me. I set them as fixed, but after a month they were finally re-crawled and Google states the issue persists. Examples: https://www.rscpp.co.uk/counselling/175809/psychology-alcester-lanes-end.html
    https://www.rscpp.co.uk/browse/location-index/889/index-of-therapy-in-hanger-lane.html
    https://www.rscpp.co.uk/counselling/274646/psychology-waltham-forest-sexual-problems.html There's also a "Submitted URL seems to be a Soft 404": https://www.rscpp.co.uk/counselling/112585/counselling-moseley-depression.html I also have more which are "pending", but again I couldn't see a problem with them in the first place. I'm at a bit of a loss as to what to do next. Any advice? Thanks in advance.

    | TommyNewmanCEO
    0

  • Sometime between Nov 8th and Nov 11th, our rankings on Google Images dropped substantially. Our web rankings are holding OK. Any ideas?

    | OCFurniture
    0

  • I have remodeled an old HTML site using WordPress. I see some instructions in WordPress that say I can add an .html extension to some of the pages, but it looks pretty complicated. Is there any benefit in going through that hassle, or should I just ask my web guy to rewrite via htaccess, so that https://sacramentotop10.com/Weddings/Dresses.html becomes https://sacramentotop10.com/weddings/dresses?

    | julie-getonthemap
    0

  • Hey guys, I'm getting a weird error when I submit my sitemap to Google. It says I'm getting a 409 error in my post-sitemap.xml file (https://cleargear.com/post-sitemap.xml), but when I check it, it looks totally fine. I am using Yoast SEO to generate the sitemap.xml file. Has anyone else experienced this? Is this a big deal? If so, does anyone know how to fix it? Thanks

    | Extima-Christian
    0

  • Hi All, The site I am working on is built on WordPress. The plugin Revolution Slider was downloaded; while no longer utilized, it remained on the site for some time. This plugin began creating hundreds of URLs containing nothing but code on the page, and I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/ I have done the following to prevent these URLs from being created and indexed: 1. Added a directive in my htaccess to 404 all of these URLs. 2. Blocked /wp-content/uploads/revslider/ in my robots.txt. 3. Manually de-indexed each URL using the GSC removal tool. 4. Deleted the plugin. However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest any next steps? Thanks!

    | Tom3_15
    0

  • Hi Folks, I need guidance about using a PWA on a desktop site. As I understand it, PWAs are basically used on mobile sites to engage visitors more and let them use your site like an app. Would it be good SEO practice to use a PWA on a desktop site (an e-commerce site) by calling everything through JavaScript, letting Google's crawler cache only the site logo and hiding everything else?

    | Rajesh.Prajapati
    1

  • Hello, Several tools I'm using are returning errors due to "broken canonical links", but I'm not too sure why that is. E.g.
    Page URL: domain.com/page.html?xxxx
    Canonical link URL: domain.com/page.html
    Returns an error. Any idea why? Am I doing it wrong? Thanks,
    G

    | GhillC
    1

  • I have a large body of product support documentation, and there are similar pages for each version of the product, with minor changes as the product changes. The two oldest versions of this documentation get the best rankings and are powering Google snippets; however, this content is out of date. The team responsible for the support documentation wants the current pages to rank higher. I suggested 301 redirects, but they want to maintain the old page content for clients still using the older versions of the product. Is there a way to move a page's power to a more updated version of the page without wiping out the old content? I'm considering recommending canonical tags, but I'm not sure this will get me all the way there either, as there are some differences between the pages, especially as the product has changed over time. Thoughts?

    | rachelholdgrafer
    0

  • Hi, I’m looking at a site at the moment that has a lot of products. For some of their category pages they have a ‘View All’ feature available. The URLs use this structure: domain.com/category/sub-category/product and domain.com/category/sub-category/view-all (the latter currently has noindex applied). Should the ‘View All’ page be available for indexing? The individual sub-categories and products are indexable. My immediate reaction is no, so long as the individual sub-categories are.

    | daniel-brooks
    0

  • I'd like some expert guidance. I've searched for a theme that does what I want and finally found something I like, but I'm wondering what you all think I should do to increase its searchability. The plugin has all the listings and styling. All I need to do is paste the code into the WordPress site and voila! I have a page. Using the widget lets me allow upvotes, provide a map, etc. But it means the content is inside the widget instead of on the page. What would you modify if you wanted to keep the theme and widget but get the best results? http://best-of-sacramento.com/dentists This is my staging site.

    | julie-getonthemap
    1

  • Hi Everyone, A crawler from our SEO tool detects a 403 error on a link from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible. What could be the problem? Is this error caused by the server, the crawl bot, or something else? I would love to hear your thoughts.
    Jens

    | WeAreDigital_BE
    0

  • Since most UTM codes/URLs are longer than 70ish characters, is this hurting my SEO? If it is, how can I solve the problem while still using a UTM code? Thanks!

    | Cassie_Ransom
    0

  • I've searched around quite a bit for a solution here, but I can't find anything. I apologize if this is too technical for the forum. I have a Wordpress site hosted on Nginx by WP Engine. Currently it resolves requests to URLs either with or without a trailing slash. So, both of these URLs are functional: <code>mysite.com/single-post</code> and <code>mysite.com/single-post/</code> I would like to remove the trailing slash from all posts, forcing mysite.com/single-post/ to redirect to mysite.com/single-post. I created a redirect rule on the server: ^/(.*)/$ -> /$1 and this worked well for end-users, but rendered the admin panel inaccessible. Somewhere, Wordpress is adding a trailing slash back on to the URL mysite.com/wp-admin, resulting in a redirect loop. I can't see anything obvious in .htaccess. Where is this rule adding a trailing slash to 'wp-admin' established? Thanks very much

    | james-tb
    0
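
A plausible cause of the loop in the question above: /wp-admin maps to a real directory, so the web server's own directory handling 301s /wp-admin back to /wp-admin/, which the strip-slash rule then reverses. Excluding the admin paths from the rule avoids the ping-pong. A hedged Nginx sketch (WP Engine doesn't expose raw Nginx config, so in practice this would go through their redirect rules or support):

```nginx
server {
    # ... existing configuration ...

    # 301 trailing-slash URLs to the bare URL, but skip /wp-admin and
    # /wp-login paths: /wp-admin is a real directory, and the server's
    # directory redirect re-adds the slash, which would create a loop.
    location ~ "^/(?!wp-admin|wp-login)(.+)/$" {
        return 301 /$1;
    }
}
```

The negative lookahead requires Nginx built with PCRE, which is standard on managed WordPress hosts.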

  • When we pull up Google Search Console, in the Index Coverage section, under the Excluded category, there is a sub-category called ‘Duplicate page without canonical tag’. The majority of the 665 pages in that section are from a test environment. If we were to include in the robots.txt file a wildcard to cover every URL that starts with the particular root URL ("www.domain.com/host/"), could we eliminate the majority of these errors? That solution is not one of the 5 or 6 recommended solutions that the Google Search Console Help section suggests. It seems like a simple, effective solution. Are we missing something?

    | CREW-MARKETING
    1
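
A wildcard isn't actually needed for this: robots.txt rules are prefix matches, so a plain Disallow covers everything under the path. The rule itself, assuming the test environment lives under /host/ as described:

```text
User-agent: *
Disallow: /host/
```

One caveat worth hedging: blocking crawling stops new visits but doesn't remove the already-known URLs from the ‘Excluded’ report, and it also prevents Google from seeing any noindex on those pages, which is likely why Google's help text doesn't list it. Putting the test environment behind authentication or a noindex is the more standard fix.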

  • Quick question. As an example, a client has a site named site.com and then changed their business name and business URL to site2.com. I was informed that site.com has a manual action (pages affected by manual actions can see reduced display features, lower ranking, or even removal from Google Search results). The old site is obviously not active, as it 301s to site2.com. My questions: Does a 301 transfer link power? So if a site is penalized, will the links that 301 cause problems for the new site2.com? I've requested that they remove the 301; will that fix it? Do we need to request a review? Thank you

    | Kdruckenbrod
    0

  • Hi Guys, I previously asked a question that was helpfully answered on this forum, but I have just one last question. I'm migrating a site tomorrow from http to https. It was mentioned that I may need to "add canonical tags to the http pages, pointing to their https equivalent prior to putting the server level redirect in place. This is to ensure that you won't be causing yourself issues if the redirect fails for any reason." This is an e-commerce site with a number of links; is there a quick way of doing this? Many Thanks

    | ruislip18
    0

  • Hello All, Our website currently has a general solutions subdirectory, which then links to each specific solution, following the path /solutions/ => /solutions/solution1/. As our solutions can be quite complex, we are adding another subdirectory to target individuals by profession. I would like to link from our profession pages to the various solutions that help them. As both subdirectories will be top-level pages in the main menu, would linking from professions to solutions be poor architecture? In this case the path would look like: /professions/ => /professions/profession1/ => /solutions/solution1/. Thanks!

    | Tom3_15
    0

  • In regards to this article, https://moz.rankious.com/_moz/blog/how-to-get-search-console-data-api-python, I've gotten all the way to the part where I need to authenticate the script. I give access to GSC and the localhost code comes up. The article says to grab the portion between = and #, but that doesn't seem to be the case anymore. This is what comes up in the browser: http://localhost/?code=4/igAqIfNQFWkpKyK6c0im0Eop9soZiztnftEcorzcr3vOnad6iyhdo3DnDT1-3YFtvoG3BgHko4n1adndpLqjXEE&scope=https://www.googleapis.com/auth/webmasters.readonly When I put portions of it in, it always comes back with an error. Help!

    | Cnvrt
    0
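
The redirect shown above is a standard OAuth 2.0 response: the value the script needs is the `code` query parameter, i.e. everything between `code=` and `&scope`. The poster's script is Python, but the extraction logic is the same in any language; a small JavaScript sketch with a made-up code value:

```javascript
// Extract the OAuth authorization code from the localhost redirect URL.
// The "code" query parameter is what the authentication flow expects.
function extractAuthCode(redirectUrl) {
  return new URL(redirectUrl).searchParams.get('code');
}

// Example with a hypothetical code value:
// extractAuthCode('http://localhost/?code=4/abc123&scope=x') -> '4/abc123'
```

Note that `searchParams.get` URL-decodes the value, which matters if the code ever contains percent-encoded characters.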

  • My site is prettycool.co.uk and primarily we sell fascinators. The problem is I can't get the word "fascinators" to be listed by Google. We are on the 1st page for most colours, i.e. pink fascinators, blue fascinators, etc., but for the term "fascinators", even if we fetch, we are listed for a couple of hours and then disappear. I've checked for keyword stuffing, but our site sells fascinators and we need to have this word on our site, and other sites have a lot more references to the term and are listed on the 1st or 2nd pages. We used to be listed on page 1 for many years, but in the last 2 or 3 years we dropped back to page 4, and now nothing. Any help or suggestions would be fantastic!

    | Rutts
    0

  • Hi All I'm planning a http to https migration for a site with over 500 pages. The site content and structure will be staying the same, this is simply a https migration. Can I just confirm the answer to this fundamental question? From my reading, I do not need to create 301 redirect for each and every page, but can add a single generic redirect so that all http references are redirected to https. Can I just double check this would suffice to preserve existing google rankings? Many Thanks

    | ruislip18
    0
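
On the question above: correct, because the paths are unchanged, one pattern-based rule covers all 500+ pages and preserves the per-URL 301 signal; individual redirects are only needed when the URLs themselves change. A sketch assuming Apache (.htaccess, placed before other rewrite rules):

```apache
RewriteEngine On
# If the request arrived over plain http, 301 to the same host and
# path on https. One rule covers every page on the site.
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Updating the sitemap and internal links to https, and adding the https property in Search Console, are the usual companion steps.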

  • We had a spam injection a few months ago.  We successfully cleaned up the site and resubmitted to google.  I recently received a notification showing a spike in 404 errors. All of the URLS have a common word at the beginning injected via the spam: sitename.com/mono
    sitename.com/mono.php?buy-good-essays
    sitename.com/mono.php?professional-paper-writer There are about 100 total URLs with the same syntax, all containing the word "mono". Based on my research, it seems that it would be best to serve a 410. I wanted to know what the line of .htaccess code would be to do that in bulk for any URL that has the word "mono" after the sitename.com/

    | vikasnwu
    0
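
Since RedirectMatch tests only the URL path (any query string after mono.php? is ignored), one line covers all of the injected URLs. A sketch for .htaccess, assuming nothing legitimate on the site starts with /mono:

```apache
# Return 410 Gone for every URL whose path begins with /mono
# (matches /mono, /mono.php, /mono.php?buy-good-essays, etc.)
RedirectMatch 410 ^/mono
```

Anchoring with ^ keeps the rule from firing on unrelated URLs that merely contain "mono" somewhere in the path.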

  • We implemented our website on WordPress, then migrated it to PHP (Yii framework). After a while, we found an issue with internal links: they were increasing severely. When we check our landing pages in Webmaster Tools (for example, the price list), one contains 300 internal links, but the reality is that there are no href tags on that page. It seems that Webmaster Tools attributes most of our site's links to a single page and reports them to us. Is this natural, or has a misconfiguration happened?

    | jacelyn_wiren
    0

  • I have just discovered that the WordPress theme I have been using for some time has nofollow internal links on the blog. Simply put, each post has an image and text link plus a 'Read more' link. The 'Read more' is nofollow, which is also the case on my homepage. The developer is saying duplicate followed links are worse than an internal nofollow. What is your opinion on this? Should I spend time removing the nofollow?

    | Libra_Photographic
    0
