Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hey there, A client rents all kinds of party articles, like plates, bowls, etc. Currently, all his article pages have canonicals to their parent category pages, supposedly to have any page value flow to these category pages (which are much more relevant for SEO). Is there anyone who agrees with this method? I think a noindex,follow would be a better measure to keep Google from indexing all these 'low value' article pages. Besides, a canonical should indicate that page A and page B are (almost) identical, which they most certainly are not in this case. What are your thoughts?

    | Adriaan.Multiply
    0
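
    For reference, a minimal sketch of the two tags being compared, using placeholder URLs rather than the client's real ones. The canonical claims the article page is a duplicate of the category (which it is not here), while noindex,follow keeps the article page out of the index yet still lets crawlers follow its links.

    ```html
    <!-- Current setup: article page canonicalised to its parent category -->
    <link rel="canonical" href="https://www.example.com/party-rentals/plates/" />

    <!-- Suggested alternative: keep the page unindexed but let link equity flow -->
    <meta name="robots" content="noindex, follow" />
    ```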

  • I'm working with a client who has a website that is relatively well optimised, though it has a pretty flat structure and a lot of top-level pages. They've invested in their content over the years and managed to rank well for key search terms. They're currently in the process of changing CMS, and as a result of the new folder structure in the CMS the URLs for some pages look to have changed significantly. E.g. the existing URL is website.com/grampians-luxury-accommodation, which ranked quite well for "luxury accommodation grampians"; the new URL when the site is launched on the new CMS would be website.com/destinations/victoria/grampians. My feeling is that the client is going to lose out on a bit of traffic as a result of this. I'm looking for information, methods or case studies to demonstrate the degree of risk, and to help make a recommendation to mitigate it.

    | moge
    0

  • If I block a URL via the robots.txt - how long will it take for Google to stop indexing that URL?

    | Gabriele_Layoutweb
    0

  • Hi all, I have a client who wants to rank more prominently for "plastic surgeon jupiter fl", a key term in his niche that attracts 11-50 searches per month (but these are potentially big-ticket clients). If you look at the first page of results for that term, I can't make any sense of them. I've checked page speed, Google listing optimization, on-page SEO, link metrics etc. and there seems to be no correlation with good on-page SEO, quality links (or volume of links). Any thoughts? I literally cannot explain why the #1 site shows 2 inbound links via Moz OSE and almost no on-page SEO to speak of, while sites ranking on page 2 have better on-page SEO, more links, and higher quality links (from what I can tell).

    | RickyShockley
    0

  • Hi Nearly 2 years ago our site was moved from a .co.uk domain to .media. Because this TLD isn't supported for DA, it's hard to measure whether link-building campaigns (for example) are having a positive effect. The old site has a DA of 38 (even after 2 years of inactivity) and the new one is 1, but the new one has better Trust and Citation Flow, for example. I'm now investigating whether it's worth moving back to .co.uk and I want to fully understand the risks involved. So far I know of the following potential risks: it's a lot of work, so human error is a real risk; it could create a redirect loop, as the old site has 301 redirects in place to the new one; and it will take several months for metrics to recover. Any thoughts on other risks, how these challenges can be overcome etc. will be welcome. Or do I just set fire to the lot and create a new site with yet another 301 redirect from the .media site? What would you do?

    | AxonnMedia
    0

  • Our company sells a product system that will permanently waterproof almost anything. We market it as a DIY system. I am working on SEO titles and descriptions. This topic came up for discussion: would using "SAVE $1000's.." help or hurt? We are trying to create an effective call to action, but we are wondering if search engines see it as clickbait. Can you

    | tyler.louth
    0

  • Hi there, We have a client with a large eCommerce site with about 1,500 duplicate URLs caused by parameters in the URLs (such as the sort parameter, where the list of products is sorted by price, age etc.) Example: www.example.com/cars/toyota First duplicate URL: www.example.com/cars/toyota?sort=price-ascending Second duplicate URL: www.example.com/cars/toyota?sort=price-descending Third duplicate URL: www.example.com/cars/toyota?sort=age-descending Originally we had advised adding robots.txt rules to block search engines from crawling the URLs with parameters, but this hasn't been done. My question: if we add the robots.txt rules now and exclude all URLs with filters, how long will it take for Google to disregard the duplicate URLs? We could ask the developers to add canonical tags to all the duplicates, but there are about 1,500 of them... Thanks in advance for any advice!

    | Gabriele_Layoutweb
    0
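
    A hedged robots.txt sketch of the parameter exclusion being discussed, assuming "sort" is the only parameter to block. Note that robots.txt only stops crawling; URLs already in the index are not removed by it, which is one argument for the templated canonical alternative instead.

    ```
    User-agent: *
    # Block any URL containing the sort parameter,
    # e.g. /cars/toyota?sort=price-ascending
    Disallow: /*?sort=
    Disallow: /*&sort=
    ```

    The canonical route usually does not mean 1,500 manual edits either: one template change emitting <link rel="canonical" href="...clean category URL..."> on every sorted variant covers them all.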

  • Good morning, I have a question about how to use Schema.org on a hotel website. Since the website has many pages, do I add the markup for the hotel's address, contact info, ratings, reviews, etc. on every page (as that information is in the footer of each page), or do I just add it to the homepage once? Thanks in advance

    | mulch
    0
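
    The question mentions microdata; the same properties can also be expressed as JSON-LD, which is often easier to manage. Below is a hedged sketch with entirely made-up hotel details; a common approach is to publish the full block once (e.g. on the homepage or contact page) rather than repeating identical markup in every footer.

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Hotel",
      "name": "Example Hotel",
      "url": "https://www.example-hotel.com/",
      "telephone": "+1-555-555-0100",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example Street",
        "addressLocality": "Example City",
        "addressRegion": "EX",
        "postalCode": "00000",
        "addressCountry": "US"
      },
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.3",
        "reviewCount": "211"
      }
    }
    </script>
    ```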

  • I run a commercial real estate firm in New York City. Lately, I have started to produce 30-second videos about property listings and neighborhoods. I have noticed that the engagement for these videos on Facebook is much higher than for text posts. Should adding these videos to our website (hosted on YouTube) result in increased visitor engagement? Could there be a positive SEO effect, such as more links and higher quality links? Anyone have any experience with this? Thanks, Alan

    | Kingalan1
    0

  • We are looking for the very best way of handling potentially thousands (50k+) of 301 redirects following
    a major site replacement, and I mean total replacement. Things you should know:
    The existing domain has 17 years of history with Google, but rankings have suffered over the past year and yes, we know why (and the bitch is we paid a good-sized SEO company for that ineffective and destructive work).
    The URL structure of the new site is completely different and SEO-friendly URLs rule. This means that there will be many thousands of historical URLs (mainly dynamic ones) that will attract 404 errors as they will not exist anymore. Most are product profile pages and the God Google has indexed them all. There are also many links to them out there.
    The new site is fully SEO optimised and is passing all tests so far - however there is a way to go yet. So here are my thoughts on the possible ways of meeting our need:
    1: Create a 301 redirect for each and every page in the .htaccess file - that would be one huge .htaccess file, 50,000+ lines, and I am worried about the effect on site speed.
    2: Create a 301 redirect for each and every unused folder and wildcard the file names; this would be a single redirect for each file in each folder to a single redirect page,
    so the 404 issue is overcome but the user doesn't land on the precise page they are after.
    3: Write some code to create a hard-copy 301 index.php file for each and every folder that is to be replaced.
    4: Write code to create a hard-copy 301 .php file for each and every page that is to be replaced.
    5: We could just let the pages all die and list them with Google to advise of their death.
    6: We could have the redirects managed by a database rather than .htaccess or single redirect files. Probably the most challenging thing will be to load the data in the first place, but I assume this could be done programmatically - especially if the new URL can be inferred from the old. Maybe I am missing another, simpler approach - please discuss

    | GeezerG
    0
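
    A hedged Apache sketch of option 6: instead of 50,000 RewriteRule lines in .htaccess, the old-to-new pairs live in one lookup map loaded by the server. RewriteMap has to be declared in the server or virtual-host config (not .htaccess), and every path and domain below is a placeholder.

    ```apache
    # Virtual-host config (RewriteMap is not permitted in .htaccess).
    # For ~50k entries, convert the text map to DBM with httxt2dbm for faster lookups.
    RewriteEngine On
    RewriteMap legacyurls "txt:/etc/apache2/maps/legacy-redirects.map"

    # legacy-redirects.map holds one "old-path new-absolute-url" pair per line, e.g.
    #   old-product-profile-123.html   https://www.example.com/products/blue-widget/
    # (Old URLs that differ only by query string need the query string folded into the
    #  lookup key, which takes extra RewriteCond handling not shown in this sketch.)

    RewriteCond ${legacyurls:$1} !=""
    RewriteRule ^/(.+)$ ${legacyurls:$1} [R=301,L]
    ```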

  • Are low text-to-HTML ratios still a negative SEO ranking factor? Today I ran a SEMrush site audit that showed 344 out of 345 pages on our website (www.nyc-officespace-leader.com) have a text-to-HTML ratio that ranges from 8% to 22%. This is characterized as a warning on SEMrush. This error did not exist in April, when the last SEMrush audit was conducted. Is it worthwhile to try to externalize code in order to improve this ratio? Or to add text (a major project on a site of this size)? These pages generally have 200-400 words of text. Certain URLs, for example www.nyc-officespace-leader.com/blog/nycofficespaceforlease, have more text, yet they still show a text-to-HTML ratio of only 16%. We recently upgraded to WordPress 4.2.1. Could this have bloated the code (CSS etcetera) to the detriment of the text-to-HTML ratio? If Google has become accustomed to more complex code, is this a ratio that I can ignore? Thanks, Alan

    | Kingalan1
    0

  • Hey Mozzers, long time no post. Just a quick one for you regarding URLs. This is an example of a URL on a site: https://www.thisismyurl.co.uk/products/spacehoppers/special-spacehopper.html Many of these pages are getting flagged for having a URL that is too long. The target of this page is "special spacehoppers". Should I be concerned about the URL being too long, given my keyword is at the end? Would this be a suitable idea? https://www.thisismyurl.co.uk/p/spacehoppers/special.html 1) Would changing "products" to "p" be worthwhile? It would remove length from nearly all URLs but would require a site-wide redirect. 2) Would removing the "spacehoppers" bit from the URL be worth it? Yes, it would shorten the URL, but it would also remove the exact keyword from the URL, which could be detrimental to rankings.

    | ATP
    0

  • Hi all, I'm looking to implement sitelinks search box markup in Google Tag Manager in JSON-LD format. This would be popped into the Custom HTML tag and would look a little something like: The above option is great if you have one query string for your search term, but what if you had a URL that triggered two query strings - for example: https://www.example.com/search?q=searchterm&category=all Would you need to amend the code to something like the below: Any help would be much appreciated! Cheers, Sean

    | seanginnaw
    0
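
    The code samples in the question did not survive, so here is a hedged reconstruction of the WebSite/SearchAction JSON-LD for a search URL that carries a second, fixed query string; the domain and parameter names simply mirror the example URL above. Only the user-supplied term needs a placeholder mapped by query-input - a constant parameter such as category=all can be hard-coded into the target.

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "WebSite",
      "url": "https://www.example.com/",
      "potentialAction": {
        "@type": "SearchAction",
        "target": "https://www.example.com/search?q={search_term_string}&category=all",
        "query-input": "required name=search_term_string"
      }
    }
    </script>
    ```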

  • Hi Has anyone had experience of updating their text-to-code ratio if it's too high, and whether this has much impact on SEO performance? I am trying to prioritise tasks and wondered if this is something which should be higher on my list. Thank you 🙂

    | BeckyKey
    0

  • I'm doing some competitor analysis for a client. I'm looking at the client's title tags and meta descriptions for specific search results, in comparison to their main competitor. I'm trying to establish if the client is ranking higher due to better relevance, or just because they have higher PA and DA. It appears to be the latter. Observations: For both the client and their competitor, their home pages appear in the results much more frequently than specific landing pages The meta description Google chooses to display in the search results for the home page does not always match the ACTUAL meta description for the page and appears to vary depending on the specific search query Questions: Does Google create meta descriptions on the fly? Is this an example of Google using semantic search? And if so, why are we bothering to type customised meta descriptions for specific pages, if Google is just going to recreate them anyway? Is Google displaying results of the home pages simply because they cannot find pages more relevant (ie. if we produced landing pages more relevant to these specific search queries, would Google rank them higher)?

    | muzzmoz
    0

  • I have a client who has just seen his average ranking position improve from around 39 to 34 over about two months; then it appears to have dropped back to position 40+ in the space of a week. I believe he's made a lot of changes to targeted keywords, so I'd like to think it's simply because his old targeted keywords are dropping and new keywords still have to build their rankings. But I'm also worried in case he has over-optimised and might be getting penalised. Any advice on where to start digging?

    | muzzmoz
    0

  • Our company has a paid subscription-based site that only allows us to add HTML in the WYSIWYG section, not in the backend of each individual page. Because we are an e-commerce site, we have many duplicate page issues. Is there a way for us to add or hide the canonical code in the WYSIWYG section instead of us having to make all of our pages significantly different?

    | expobranders
    0

  • Website: www.wheelchairparts.com
    Keyword: wheelchair parts My website is #1 or #2 on almost every search engine besides Google. Google has us bouncing between the bottom of page 2 and top of 3. However we are on page one for "wheelchairparts". I need to get a link building campaign going for this site. I feel it's more difficult for ecommerce websites and nothing seems to fit in with Rand's Mozcon 2016 Link Building talk except hacks. I need to find a flywheel. Either way, my question is what can I do other than link building to get on page 1 of Google for the term "wheelchair parts"? Thanks in advance! - Mike Bean

    | Mike.Bean
    1

  • Let's say there is a website (domain) and a couple of sub-domains (around 6). If we optimise all sub-domains for the "keyword" we want our website to rank for - like using the "keyword" across all page titles of the sub-domains and in places where it looks natural, as brand mentions - will this scenario help the website rank better for that same "keyword"? How much do these sub-domains really influence the website's rankings? For example, if the sub-domains have broken links, will this affect the website's SEO efforts?

    | vtmoz
    0

  • I have a site in English hosted under .com with English info, and then different versions of the site under subdirectories (/de/, /es/, etc.). Due to budget constraints we have only managed to translate the most important info on our product pages for the local versions. We feel, however, that displaying the detailed product info in English (on a clearly identified tab) may be of use to the many users who can actually understand English, and having that info may help us get more conversions. The problem is that this detailed product info is already used on the equivalent English page as well. This basically means two things: we are mixing languages on pages, and we have around 50% duplicate content on these pages. What do you think the SEO implications of this are? By the way, proper meta titles and meta descriptions as well as hreflang tags are in place.

    | lauraseo
    0
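
    The question notes hreflang is already in place; purely for reference, a hedged sketch of what that annotation set looks like for one product page across the English root and the /de/ and /es/ subdirectories (placeholder paths), since correct hreflang is the main safeguard against the partially duplicated language versions competing with each other.

    ```html
    <link rel="alternate" hreflang="en" href="https://www.example.com/product-x/" />
    <link rel="alternate" hreflang="de" href="https://www.example.com/de/product-x/" />
    <link rel="alternate" hreflang="es" href="https://www.example.com/es/product-x/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/product-x/" />
    ```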

  • Even when I search for the exact H1 heading from our homepage, the homepage does not show up at the top of the results. Other websites with only a partial match of the search query are ranking above us; why is this happening? Another website using the same text as a normal paragraph is ranking on top, but not the H1 from our homepage. How come normal text on an unrelated website ranks above the H1 heading on our own homepage?

    | vtmoz
    0

  • When we cannot rank for multiple keywords, can we try creating landing pages for some long-tail keywords and putting all such landing pages in the footer menu to rank for those search queries? Will this help, or will it look spammy to Google?

    | vtmoz
    0

  • Hi, Need some help.
    So this is my website: https://www.memeraki.com/
    If you hover over any of the products, there's a quick view option that opens up a popup window for that product.
    That popup is triggered by this URL: https://www.memeraki.com/products/never-alone?view=quick
    In the URL you can see the parameter "view=quick", which is in fact responsible for the popup. The problem is that Google, and even the Moz crawler, is picking up this URL as a separate webpage, resulting in crawl issues like missing tags.
    I've already used Webmaster Tools to block the "view" parameter URLs on my website from indexing, but it's not fixing the issue.
    Can someone please provide some insights as to how I can fix this?

    | ImranZafar
    0
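
    A hedged sketch of one way to keep the quick view without exposing a second crawlable URL: trigger the overlay from a non-link element so crawlers only ever see the clean product URL. (Element names and classes here are hypothetical; if the ?view=quick responses must stay, serving them with a canonical pointing at the clean product URL achieves much the same.)

    ```html
    <!-- Normal, crawlable link to the product page -->
    <a href="/products/never-alone">Never Alone</a>

    <!-- Quick view triggered by a button + JavaScript, not by a ?view=quick URL -->
    <button type="button" class="quick-view" data-product-handle="never-alone">
      Quick view
    </button>
    ```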

  • I have a page which is already ranking pretty well for a relatively competitive keyword.
    Google also ranks us on the first page for a synonym of the keyword we optimize the page for (even though the synonym does not appear on our page). I am now considering replacing some occurrences of the keyword on the page with different synonyms, in the hope that our ranking may further improve for these synonyms.
    However, I am concerned that Google may penalize me for keyword stuffing if I use a wide range of synonyms of one keyword on the page. My plan is only to replace some occurrences of the keyword with synonyms. I am a bit nervous here, since the page is already ranking quite well in a competitive niche. Any thoughts?

    | lcourse
    0

  • What is the best way to use articles from a "thought leader" to build high-quality links to my website? I have heard that it is possible to pay bloggers to post business articles that link back to a website, and that, assuming these blogs have domain authority, this is a good technique to improve ranking. Is this in fact true, and if so, where would I find blogs to post our content? The purpose would be to promote a real estate brokerage website. Any suggestions? Is this possible, advisable, the best use of quality content? Alternatively, where else can we post engaging content to create links back to our site? Social media? The nature of the content would be such topics as how to find the best value in Manhattan office or loft space rentals, etcetera. Thanks, Alan

    | Kingalan1
    0

  • Currently we have direct links to the top 100 country and city landing pages on the index page of the root domain.
    I would like to add, on the index page for each country, a "more cities" link which then dynamically loads (without reloading the page and without redirecting to another page) a list of links to all cities in that country.
    I do not want to dilute "link juice" for my top 100 country and city landing pages on the index page.
    I would still like Google to be able to crawl and follow these links to the cities that I load dynamically later. In this particular case a typical site hierarchy of country pages with links to all cities is not an option. Any recommendations on how best to implement this?

    | lcourse
    0
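
    A hedged sketch of one progressive-enhancement pattern, assuming a crawlable URL (here a hypothetical /germany/all-cities/ fragment) can exist somewhere for the full city list: the visible link points at real HTML that crawlers can follow, while JavaScript intercepts the click and injects the list inline for users.

    ```html
    <!-- Crawlable fallback: a real href to plain HTML listing every city -->
    <a href="/germany/all-cities/" class="js-more-cities" data-country="germany">
      More cities
    </a>

    <script>
      // Enhancement only: fetch the same HTML and show it in place,
      // so users never leave the page but crawlers still get followable links.
      document.querySelectorAll('.js-more-cities').forEach(function (link) {
        link.addEventListener('click', function (event) {
          event.preventDefault();
          fetch(link.href)
            .then(function (response) { return response.text(); })
            .then(function (html) {
              link.insertAdjacentHTML('afterend', html);
              link.remove();
            });
        });
      });
    </script>
    ```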

  • Hello everyone, I am working to create sub-category pages on our website virtualsheetmusic.com, and I'd like to have your thoughts on using a combination of images and text as anchor text in order to maximize keyword relevancy. Here is an example (I'll keep it simple): let's take our violin sheet music main category page located at /violin/, which includes the following sub-categories: Christmas, Classical, Traditional. So, the idea is to list the above sub-categories as links on the main violin sheet music page, and if we had to use simple text links, they would be something like: "Christmas", "Classical", "Traditional". Now, since what we really would like to target are keywords like "christmas violin sheet music", "classical violin sheet music", "traditional violin sheet music", I would be tempted to make the above links as follows: "Christmas violin sheet music", "Classical violin sheet music", "Traditional violin sheet music". But I am sure that would be too overwhelming for users, even if the best CSS design were applied to it. So, my idea would be to combine images with text, in a way that puts those long-tail keywords inside the image ALT attribute, so as to have links like these: Christmas, Classical, Traditional. That would allow a much easier way to work the UI, and at the same time keep relevancy for each link. I have seen some of our competitors doing that and they have top-notch results on the SEs. My questions are: 1. Do you see any negative effect of doing this kind of links from the SEO standpoint? 2. Would you suggest any better way to accomplish what I am trying to do? I am eager to know your thoughts about this. Thank you in advance to anyone!

    | fablau
    1
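
    A hedged sketch of the combined image-plus-text link described, with the long-tail phrase in the image alt attribute and the short label as the visible text; the file paths are hypothetical.

    ```html
    <a href="/violin/christmas/">
      <img src="/images/categories/christmas-violin.jpg"
           alt="Christmas violin sheet music" />
      Christmas
    </a>
    ```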

  • Hi Guys I am going to be putting some PowerPoint presentations up over time. I have a couple of questions regarding SlideShare. If I add links to the SlideShare, are these crawlable by Google etc.? If I place the PowerPoint presentation on both our website and SlideShare, would this be counterproductive, i.e. duplicate content? Love to hear your suggestions.

    | Cocoonfxmedia
    0

  • Hi Moz Community, I am working at a branding company. Currently, we found our ranking for our main keyword (branding company) is dropping a little. I am suspicious of the new blog posts that we published lately. We wrote different kinds of content, such as company culture, how to inspire employees, the importance of leadership and some other topics that are actually not as related to our main keyword - branding company. My question is, will these irrelevant posts hurt our ranking for the main keyword to some extent? Thank you. Best, Raymond

    | raymondlii
    0

  • Could an unusually large number of links from Pinterest cause issues? Would Google categorise them as spammy links or site-wide links? I have a small site with around 800-1,000 URLs, but Webmaster Tools shows 5,321 links from Pinterest.com and 1,467 from Pinterest.se. Please see the attachment.

    | riyaaaz
    0

  • If Website A is ranking in 19th position in Google for a specific keyword, and Website B is ranking in 30th position for the same keyword, what would be the impact of a 301 redirect? Would Website A drop to 30th position because of the 301, or would its existing position improve because of link juice?

    | riyaaaz
    0

  • Hi Moz Community, According to Google Search Console, the main keyword for our website has a low click-through rate, even though we rank well for that keyword (top 3). Currently, our homepage's title tag is "Brand Name: Primary Keyword". I am thinking about adding a secondary keyword or another keyword variation to differentiate our company from others in order to possibly increase the click-through rate. Will this affect the current ranking for the primary keyword? Also, is the click-through data in Google Search Console accurate? Thank you! Best, Raymond

    | raymondlii
    0

  • I have two websites with identical content: Haya and Ethnic Code. Both websites have similar products. I would like to get rid of Ethnic Code, and I have already started to de-index it. My question is, will I get any SEO benefit, or will it be harmful, if I 301 redirect only the URLs below? https://www.ethniccode/salwar-kameez -> https://www.hayacreations/collections/salwar-kameez https://www.ethniccode/salwar-kameez/anarkali-suits -> https://www.hayacreations/collections/anarkali-suits

    | riyaaaz
    0

  • I'm migrating sub-domains to sub-folders, but this question is likely applicable to most URL migrations. For example: subdomain1.example.com to example.com/subdomain1 and any child pages. Bear with me, as it may just be me, but I'm having trouble understanding whether internal links (menu, contextual etc. and potentially the sitemaps) should be kept as the pre-migration URL (with a 301 in place to the new URL) to give Google a chance to process the redirects, or if they should be updated straight away to the new URL to provide a 200 response, as so many guides suggest. The reason I ask is that unless Google specifically visits the old URL from their index (and therefore processes the 301), the page is likely to be found by following internal links on the website or similar, which, if they're updated to reflect the new URL, will return a 200. I would imagine that this would be treated as a new page, which is concerning as it would have a canonical pointing toward itself and the same content as the pre-migrated URL. Is this a problem? Do we need to allow proper processing of redirects for migrations, or is Google smarter than this and can work it out if they visit the old URL at a later date and put two and two together? What happens in between? I haven't seen any migration guides suggest leaving 301s in place rather than amending links to 200 as soon as possible in all instances. One thought is I guess there's also the Fetch as Google tool within Search Console which could be used with the old URLs - could this be relied on? Apologies if this topic has been covered before, but it's quite difficult to search for without returning generic topics around 301 redirects. Hope it makes sense - appreciate any responses!

    | AmyCatlow
    0

  • Hello, For the past 2 weeks our website has been losing positions in Google. After years on the first page, we have dropped to the 3rd page for our main keyword. It seems that all the positions we lost were ranking with the homepage. Now we are on the 3rd page, but with a less important page. How is it possible that only the homepage disappeared? Is there any explanation for that? I hope there is an explanation, so we can fix the trouble. Kind regards, Tine

    | TineDL
    0

  • I am performing a site audit and looking at the "Index Status Report" in GSC. This shows a total of 17 URLs have been indexed. However when I look at the Sitemap report in GSC it shows 9,000 pages indexed. Also, when I perform a site: search on Google I get 24,000 results. Can anyone help me to explain these anomalies?

    | richdan
    0

  • We're working to improve the ranking of one of our product landing pages. The page that currently ranks #1 has a very simple, short layout with the main keyword many times on the page with otherwise very little text. One thought we had was to make a more comprehensive page including more info on the features and benefits of the product. The thought being that a longer form page would be more valuable and potentially look better to Google if the other SEO pieces are on par. Does that make sense to do? Or would it be better to keep the product page simple and make some more related content on our blog linking back to that landing page? Thanks in advance to any help you can provide!

    | Bob_Kastner
    0

  • I was exploring my company's visibility in Google News results, and I noticed the author byline in a recently published article was being pulled into the page title in the SERP. See the attached image for a screenshot. It makes it sound awkward: "How to Find the Best Cannabis Experience and High for You Patrick..." - as if we're explaining it to some guy named Patrick? We have the byline set up the same way in all other posts, but this is the first time I've seen this happen. Has anyone seen/had this happen, and if so, do you have any ways to prevent it? Thanks in advance for any insights! Here's the post URL: https://www.leafly.com/news/cannabis-101/how-to-find-best-cannabis-experience-high

    | davidkaralisjr
    0

  • I'm planning on using 301 redirects to spin out a subdirectory of my current website to be its own separate domain. For instance, I currently have a website www.website.com and my writers write tech news at www.website.com/news. Now I want to 301 redirect www.website.com/news to www.technews.com. Will this have any negative impact on SEO? What are some steps that I can take to minimize these impacts?

    | Chris_Bishop
    1
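
    A hedged .htaccess sketch of the redirect described, mapping /news and everything under it to the matching path on the new domain; the domains are the placeholders from the question, and the rule assumes the new site keeps the same slugs.

    ```apache
    # .htaccess on www.website.com (per-directory context, so no leading slash)
    RewriteEngine On
    # /news            -> https://www.technews.com/
    # /news/some-story -> https://www.technews.com/some-story
    RewriteRule ^news(/(.*))?$ https://www.technews.com/$2 [R=301,L]
    ```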

  • Hi I've always used schema markup in previous companies; however, product pages on the site I'm currently working on are marked up with data-vocabulary. I can't find much about data-vocabulary, so is it best to move away from this? I need to update the markup on our product pages - so I want the best markup for this. Thank you

    | BeckyKey
    0
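
    Google dropped support for data-vocabulary.org structured data in 2020, so moving the product pages to schema.org is the usual call. Below is a hedged, minimal Product JSON-LD sketch with entirely made-up values to show the shape of the replacement markup.

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Product",
      "image": "https://www.example.com/images/example-product.jpg",
      "description": "Short description of the example product.",
      "sku": "EX-123",
      "offers": {
        "@type": "Offer",
        "url": "https://www.example.com/products/example-product",
        "price": "19.99",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>
    ```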

  • As we are a classified ads site, our ads expire after some time, and we 301 redirect the expired ad page to its parent category. The image URLs on the ad page are redirected too, so they are not getting indexed in Google Images. What is the best solution for getting images indexed in this situation: 301 redirect the images, keep the images live, or something else?

    | divar
    0

  • It happened to our website. We have seen major ranking fluctuations for our website because of one back-link. What kind of links can those be? Why is Google not stopping them, even though they claim that such back-links will be taken care of?

    | vtmoz
    0

  • We have a website we are working on that was ranking well in Google but has completely dropped in rankings since a hosting upgrade. When the hosting upgrade was made, the developer added an incorrect robots.txt file that stopped the site from being crawled, hence the lost rankings. We have since sorted out that issue, so the robots.txt is OK. However, ranking results have yet to be reclaimed. We are unsure why these rankings haven't rebounded, as it has been a while now. The site is https://www.brightonpanelworks.com.au. We have since also attempted to add a sitemap to help the site be better crawled and to regain rankings; however, it appears that sitemap generators are having problems creating a sitemap for this site, and we are not sure why. We are also not sure whether this may relate to why Google has not picked up on pages and ranking results have not been restored. If you have any ideas as to how we can reclaim rankings to the strong positions they were in previously, that would be much appreciated. We believe we may be missing something here that is not allowing webpages to be picked up and ranked by Google.

    | Gavo
    0

  • Hi Guys. I am running a cleanup of the on-page schema we use and will be moving the on-page elements into Tag Manager. I have all the metas and schema for the products boxed off. My question today is: what schema should I use for category pages? Granted, there is JSON-LD for aggregate reviews, but I can't see or work out what to use for the category pages that hold the lists of products. Any assistance appreciated. Alex

    | JBGlobalSEO
    1
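
    One common pattern for category/listing pages is a schema.org ItemList that simply points at the product URLs, and it sits in a GTM Custom HTML tag just like the product markup. A hedged sketch with placeholder URLs:

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "ItemList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "url": "https://www.example.com/widgets/blue-widget" },
        { "@type": "ListItem", "position": 2, "url": "https://www.example.com/widgets/red-widget" },
        { "@type": "ListItem", "position": 3, "url": "https://www.example.com/widgets/green-widget" }
      ]
    }
    </script>
    ```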

  • I typically see browsers refresh their cache within 48 hours at the longest. We pushed some changes to production about a week ago and Chrome still has the old version cached. I'm seeing some similar posts and wonder if Google is up to something and we are starting to "cache" on (pun intended)?

    | emilydavidson
    0

  • I'm noticing that URLs that were once indexed by Google are suddenly getting dropped without any error messages in Webmaster Tools. Has anyone seen issues like this before? Here's an example:
    http://www.thefader.com/2017/01/11/the-carter-documentary-lil-wayne-black-lives-matter

    | nystromandy
    0

  • Looking for some advice 🙂 I have a domain that has been registered since 1999 and currently hosts my website - the problem is that my business has moved in a different direction and my URL is no longer associated with my main product offering. For example, in the past I was xyzgarden.com; however, now something like xyzhomedecor.com is much more appropriate. How should I handle this so that I am not at a disadvantage for SEO? Thanks!

    | MainstreamMktg
    0

  • Hello all, I'm looking at something a bit wonky on one of the websites I manage. It's similar enough to other websites I manage (built on a template) that I'm surprised to see this issue occurring. The XML sitemap submitted tells Google there are 229 pages on the site. Starting at the beginning of December, Google really ramped up the intensity of its crawling of the site. At its high point Google crawled 13,359 pages in a single day. I mentioned I manage other similar sites - this is a very unusual spike. There are no features like infinite scroll that auto-generate content and would cause Google some grief. So the follow-up questions to my "why?" are "how is this affecting my SEO efforts?" and "what do I do about it?". I've never encountered this before, but I think limiting my crawl budget would be treating the symptom instead of finding the cure. Any advice is appreciated. Thanks! *edited for grammar.

    | brettmandoes
    0

  • Hi Guys Just seeking some advice. We know Google is very keen on site speed, and one of the best ways to manage this is to cache images, use CDNs, etc. However, what I am finding is that we have rapid site speed, but any new updates take a few refreshes, or we have to wait for the ISP to clear its DNS, before the updates show. I have put the meta tag for non-caching in place, and in cPanel I have developer mode active on the caching settings, which in theory will not store anything in the cache for 6 hours. Does anyone know of anything else which can force the browser cache to be flushed when a WordPress site updates an image, post, page or the database? I think possibly this will only be of limited good, as other users' browsers may have similar issues.

    | Cocoonfxmedia
    0

  • Hello, We are working on migrating a website to a new web server. In addition to the primary website domain, there are several other variations that are owned. Is it okay if we point all of our domains to the same IP address as our primary domain and then set up 301 redirects to the primary domain? Are there any risks in doing this? There may be about 100 domains. Many of them are different country TLDs for the same primary .com domain, others include misspellings of the primary .com, and some are not so related to the primary domain. Thank you in advance for your response!

    | srbello
    1
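
    A hedged Apache sketch of the pattern described, with a placeholder primary hostname: all the secondary domains resolve to the same server, and any request not made on the primary host is 301-redirected to the equivalent path on it.

    ```apache
    # .htaccess served for every secondary domain
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.primary-example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.primary-example.com/$1 [R=301,L]
    ```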
