
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • So you've been MOZing and SEOing for years and we're all convinced of the 10x factor when it comes to content and ranking for certain search terms... right? So what do you do when some older sites that don't even produce content dominate the first page of a very important search term? They're home pages with very little content and have clearly all dabbled in pre-Panda SEO. Surely people are still seeing this and wondering why?

    | wearehappymedia
    0

  • Hi - I'm assuming image URL best practice follows the same principles as non-image URLs (not too many files and so on) - I notice a lot of web devs putting photos on subdomains, so I wonder if I'm missing something (I usually avoid subdomains like the plague)!

    | McTaggart
    1

  • I am working on the website and feel like we have a lot more original content, and all of it is above the fold; however, the website doesn't seem to rank higher compared to other sites targeting similar keywords. My URL is http://www.cypressindustries.com/. Any suggestions for improvement, or which areas should I be focusing on?

    | HasitR
    0

  • Hi Mozzers, I'm working at AskMe.com, which is a local search engine in India, i.e. if you're standing somewhere and looking for the pizza joints nearby, we pick up your current location and share the list of pizza outlets nearby, along with ratings, reviews, etc. about these outlets. Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for the city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here "Saket") in a city. The URL looks a little different if you're searching for something which is not a category (or not mapped to a category, in which case we 301 redirect you to the category page): it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for pizza huts in Saket, Delhi, as "pizza huts" is neither a category nor is it mapped to any category. We also deal in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to create a better user experience and a one-stop shop for our customers. Now we're working on a URL restructuring project, and my question to all you SEO rockstars is: what would be the best possible URL structure we could have? Assume we have kick-ass developers who can manage any given URL structure on the backend.

    | _nitman
    0

  • One of my clients is wondering whether they should move their stand-alone business website to a subfolder of their brand website. For example, from http://www.johnlewisforbusiness.com to http://www.johnlewis.com/business. Do you guys think it's a good idea from an SEO point of view? Can you recommend any articles on this? What is the expected loss of current value from changing domain and migrating URLs?

    | Adido-105399
    0

  • Does getting links from popular websites by commenting under a real name or brand name still work? I know that using anchor text in the name field when commenting on other sites goes straight to spam.

    | welcomecure
    0

  • I know that we should get reputable backlinks for a website, but for the websites I'm doing SEO on, nobody wants to exchange backlinks. How can I get reputable backlinks related to my niche?

    | ramansaab
    0

  • Hi guys, I have duplicate category pages across an ecommerce site. http://s30.postimg.org/dk9avaij5/screenshot_160.jpg For the currency-based pages I was wondering: would it be best (or easier) to exclude them in the robots.txt or use a rel=canonical? If using the robots.txt (which would be much easier to implement than rel=canonical) to exclude the currency versions from being indexed, what would the correct exclusion be? Would it look something like: Disallow: */?currency/ Google is indexing the currency-based pages too: http://s4.postimg.org/hjgggq1tp/screenshot_161.jpg Cheers,
    Chris

    | jayoliverwright
    0
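
    For reference, a minimal sketch of the two approaches discussed in the question above. The parameter name currency is an assumption inferred from the disallow pattern quoted in the question; the real URL pattern would need to be checked against the site. Note that robots.txt blocks crawling but does not remove URLs that are already indexed, while rel=canonical consolidates the variants onto one URL:

        # robots.txt - block crawling of any URL carrying a currency parameter (assumed name)
        User-agent: *
        Disallow: /*?currency=
        Disallow: /*&currency=

        <!-- or, on each currency variant, point search engines at the default page -->
        <link rel="canonical" href="http://www.example.com/category-page/" />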

  • Hi Moz community, I have started receiving a load of 404 errors that look like this: This page: http://paulminors.com/blog/page/5/ is linking to: http://paulminors.com/category/podcast/paulminors.com which is a broken link. This is happening with a load of other pages as well. It seems that "paulminors.com" is being added to the end of the linking page's URL. I'm using WordPress and the SEO by Yoast plugin. I have searched for this link in the source of the linking page but can't find it, so I'm struggling to diagnose the problem. Does anyone have any ideas on what could be causing this? Thanks in advance, Paul

    | kevinliao
    0
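
    One common cause of this exact symptom, for what it's worth: an anchor whose href omits the scheme (http://) is treated as a relative path and resolved against the current page's directory. A hypothetical illustration:

        <!-- broken: on a page under /category/podcast/, this resolves to
             http://paulminors.com/category/podcast/paulminors.com -->
        <a href="paulminors.com">Home</a>

        <!-- fixed: absolute URL including the scheme -->
        <a href="http://paulminors.com/">Home</a>

    It can be worth grepping the theme and plugin template files, not just the rendered source of one page, for an href written this way.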

  • Hi guys, just wondering if anyone knows of any tools to scan a site for duplicate content (shared with other sites on the web). Looking to quickly identify product pages containing duplicate content/duplicate product descriptions for ecommerce websites. I know Copyscape can check up to 10,000 pages in a single operation with Batch Search, but just wondering if there is anything else on the market I should consider looking at? Cheers, Chris

    | jayoliverwright
    0

  • I've got a new client that just fired their former SEO company, which was building spammy links like crazy! Using GSC and Majestic, I've identified 341 linking domains. I'm only a quarter of the way through the list, but it is clear that the overwhelming majority are from directories, article directories and comment spam. So far less than 20% are definitely links I want to keep. At what point do I keep directory links? I see one with a DA of 61 and a Moz spam score of 0. I realize this is a judgement call that will vary, but I'd love to hear some folks give DA and spam numbers. FWIW, the client's DA is 37.

    | rich.owings
    0

  • Hi, we just did a massive deep crawl (using the tool deepcrawl.co.uk) on this site: http://tinyurl.com/nu6ww4z http://i.imgur.com/vGmCdHK.jpg It reported a lot of URLs as either 508 or 500 errors. After the crawl finished, we put the URLs reported as 508 or 500 directly into Screaming Frog and they all came back with status code 200. Could it be that Deep Crawl hammered the site and the server couldn't handle the load, or something like that? Cheers, Chris

    | jayoliverwright
    0

  • I have 2 websites. One website's links come from spammy techniques (wrong guy hired) and it still has massive links, so I started a new website on a fresh domain. Now that the new website (only white hat methods used) has started to show positive movement, I feel it's the right time to shut the other website down. Since I have a lot of content on my first site (the one with the spammy links), can I reuse that content on my new site after I shut the first site down?

    | welcomecure
    0

  • I am working on a site which has a good amount of content pages and blog posts, all of which have been indexed in Google for a long time. Would it be good practice to internally link from those old indexed pages to new pages, or to a few important pages?

    | welcomecure
    0

  • I've created multiple high-quality Google Hangout videos (now stored as YouTube videos) with a client. Does it make sense to download these videos and re-post them to third-party sources like Vimeo, DailyMotion, etc., or is this considered duplicative content with no additional Google value? I know I have some excellent content in these videos and would like to hear from someone with experience promoting raw video footage outside of the YouTube format. Have you had success? Thanks!

    | mgordon
    0

  • My company is based in the US, but our customer base is 50% international. The majority of our international customers are from English-speaking countries like the UK, AU, NZ, etc. We currently rank well for 2 of our industry's core keywords in the US, but are not even on the radar in the UK or AU. I do generate international backlinks, although not as many as US backlinks (approximately 25% intl, 75% US). Should I purchase localized URLs like .co.uk or .com.au and point those at my .com? Any guidance the community could provide would be greatly appreciated.

    | batchbook
    0

  • Hello guys,
    I have a question for those of you who have migrated from HTTP to HTTPS. We are planning to migrate our customer's site to always-on SSL. In other words, we want to redirect all site pages to HTTPS, except for the blog. Currently, the whole site is using the HTTP protocol (except the checkout page).
    After the change, our customer's site should look like this: https://www.domain.com
    http://www.domain.com/blog/ The reasons we do not want to migrate the blog to HTTPS are as follows: the blog does not collect any sensitive user information, as opposed to the rest of the site; and we all know that on-site algorithms like Panda have a sitewide effect. If Panda doesn't like part of the blog (any thin or low-quality content), we do not want this to reflect on the rankings of the entire website. Bearing in mind that, for Google, HTTP and HTTPS are two different protocols, a possible blog penalty should not affect the website, which will use HTTPS. Point 2 is the reason I am writing here, as this is just a theory. I would like to hear more thoughts from the experts here. Also, I would like to know your opinion regarding this mixed use of protocols: could this change lead to a negative effect for any of the properties, and why? To me, there should be no negative effect at all. The only disadvantage is that we will have to monitor the blog and the site separately in Webmaster Tools. Thank you all, and looking forward to your comments.

    | newrankbg
    0

  • Hello all, I am building a website that lists homeschool events and field trips across various states (locker-time.com) and I have a few questions on setting it up correctly. Both the events and field trips are searchable by distance. For clarification, events are associated with a specific date and time, and field trips are not. I currently have a link that says homeschool events, and you enter your zip to find things close by. Is it better to create a separate page for each state I am targeting instead? So the link would be homeschool events, and then a sub-link that says homeschool events in GA, and the GA page brings up all the events in GA, still searchable by zip. Or does it matter? I was thinking that if it's a separate page, I could put keyword-rich copy on top, but then clicking on the menu and choosing the appropriate sub-menu is an additional step for users on the site, and as the number of states increases, that sub-menu could get pretty big. The search results page lists the post titles of any events or field trips found, and the links go to a page on my website with more information, such as the location and details on the event / field trip, and a link to their website. I am wondering, for SEO purposes, is this the right way to do it? Or I could set up the results page to show an excerpt and some listing info and then link directly to their website. Does it matter? I was thinking a page on my own website, since then I could add images (but that might end up sucking up all my hosting space). As I am adding these listings to my website, I have simply copied/pasted the details of the event. Now that I'm thinking about it, original content is best, so should I stop doing that and rewrite the descriptions in my own words? Since the events are date-specific and are removed from the site once they pass, does it matter as much for the events? The field trips do not have dates associated with them, so I can probably work on creating my own descriptions for those; I'm just not sure if I should bother with the events, which are more short-term. Thanks in advance for ANY advice or suggestions. I'm so looking forward to getting this all set up correctly! I find working on this SEO stuff such fun! Jeanette

    | fatcreat
    0

  • There are a number of blog articles on my site that have started receiving the "This site may be hacked" warning in the SERP. I went hunting for security issues in the Search Console, but it indicated that my site is clean. In fact, the average position of some of the articles has increased over the last few weeks while the warning has been in place. The problem sounds very similar to this thread: https://productforums.google.com/forum/#!category-topic/webmasters/malware--hacked-sites/wmG4vEcr_l0 but that thread hasn't been touched since February. I'm fearful that the Google Form is no longer monitored. What other steps should I take? One query where I see the warning is "Brand Saturation" and this is the page that has the warning: http://brolik.com/blog/should-you-strive-for-brand-saturation-in-your-marketing-plan/

    | Liggins
    0

  • Hi, we moved to a new domain back in March 2014, redirected most pages with a 301, and submitted a change-of-address request through Google Webmaster Tools. A couple of pages were left as 302 redirects, as they had rubbish links pointing to them and we had previously had a penalty. Google was still indexing the old domain and our rankings hadn't recovered. Last month we took away the 302 redirects and just did a blanket 301 approach from old domain to new, the thinking being that as the penalty had been lifted from the old domain, there was no harm in sending everything to the new domain. Again, we submitted the change of address in Webmaster Tools as the option was available to us, but it's been a couple of weeks now and the old domain is still indexed. Am I missing something? I realise that the rankings may not have recovered partly due to the disavowing / disregarding of several links, but I am concerned this may be contributing.

    | Ham1979
    0
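
    For reference, a minimal sketch of the blanket 301 described above, assuming Apache with mod_rewrite; the domain names are placeholders:

        # .htaccess on the old domain: 301 everything to the new domain, preserving the path
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]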

  • Which one is better: Yoast or Premium SEO Pack, setting aside that one is paid and the other is free? When we are doing SEO, $39 doesn't make a huge difference compared to the work we do, right? Which of the two would you suggest?

    | welcomecure
    0

  • Hi, "Null" is appearing as top keyword in Google search console > Google Index > Content Keywords for our site http://goo.gl/cKaQ4K . We do not use "null" as keyword on site. We are not able to find why Google is treating "null" as a keyword for our site. Is anyone facing such issue. Thanks & Regards

    | vivekrathore
    0

  • Hello, I know a site which has the following characteristics, has not been hit by Google Penguin at all, and ranks in the first position on the first page in google.com.pk: it has 1.3K total backlinks; it has 34 referring domains according to Ahrefs; the exact-match keyword that ranks first in google.com.pk is used in 1.2K backlinks; its DA is 18 and PA is 31; its Citation Flow is 19 and Trust Flow is 13; and the targeted exact-match keyword is present in the title and description as well. Now tell me, what is happening here?

    | tanveerayakhan
    0

  • I have an e-commerce client that is migrating platforms. The current structure of their existing website has led to what I believe is mass duplicate content. They have something north of 150,000 indexed URLs; however, 143,000+ of these have query strings, and the content is identical to the pages without any query string. Even so, the site does pretty well from an organic standpoint compared to many of its direct competitors. Here are my questions: (1) I am assuming that I should go into WMT (Google/Bing) and tell both search engines to ignore query strings. (2) In a review of backlinks, it does appear that there is a mish-mash of good incoming links to both the clean and the dirty URLs. Should I add a rel=canonical via a script to all the pages with query strings before we make our migration, and allow the search engines some time to process? (3) I'm assuming I can continue to watch the indexation of the URLs, but should I also tell the search engines to remove the dirty URLs? (4) Should I do Fetch in WMT? And if so, in what sequence should I do 1-4, and how long should I wait between doing the above and undertaking the migration?

    | ExploreConsulting
    0
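
    A sketch of point (2) above: each query-string variant declares the clean URL as canonical, so the link equity pointing at the dirty URLs is consolidated onto the clean ones. The URLs are placeholders:

        <!-- served on http://www.example.com/product?sessionid=123&sort=price -->
        <link rel="canonical" href="http://www.example.com/product" />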

  • Hi guys, so basically we have a site which has both HTTPS and HTTP versions of each page. We want to consolidate them due to potential duplicate content issues with the search engines. Most of the HTTP pages naturally have most of the links and more authority than the HTTPS pages, since they have been around longer; e.g. the normal HTTP homepage has 50 linking root domains while the HTTPS version has 5. So we are a bit concerned about adding a rel=canonical tag and telling the search engines that the preferred page is the HTTPS page, not the HTTP page (where most of the link equity and social signals are). Could there potentially be a ranking loss if we do this, and what would be best practice in this case? Thanks, Chris

    | jayoliverwright
    0
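
    For context, the usual mechanics when the HTTPS version is chosen as preferred are a canonical tag on the HTTP pages, a sitewide 301 to HTTPS, or both; a 301 passes most of the link equity along with the redirect. A minimal sketch, assuming Apache with mod_rewrite:

        <!-- on each HTTP page -->
        <link rel="canonical" href="https://www.example.com/page/" />

        # or in .htaccess: redirect every HTTP request to its HTTPS equivalent
        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]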

  • Hello, someone is asking me why we don't rank in the Google Mexico search engine. I mentioned that we don't have a Google Mexico-targeted site, but a USA site, so we may rank, but not as well as if we had a Mexico site. Is there any way to improve rankings, or any tips? Thanks! Laura Robinson

    | lauramrobinson32
    1

  • I need to find the best solution for moving my viverezen.org blog to the new domain naturazen.org, because somebody stole my brand. I have now registered the brand NaturaZen and am going to use this website as the main one, with the old viverezen just pointing to the new website. I don't want to lose authority and, more importantly, I don't want to lose the 500 visits I get every day. Both domains are with the same hosting company. What is the best SEO solution you can suggest? I thought of pointing the hosting at the new domain naturazen and 301 redirecting all links on viverezen, but I am probably wrong. Thanks for your help

    | VivereZen
    0

  • Hi, I'm working on a site that was last optimized some years ago. It has a fair number of pages where the URL, H1, title tag, and image alt text are exact matches. Although this comes back as A+ in Moz's on-page grader, it seems a bit much. What do you think: is all this too heavy an SEO fingerprint for Google?

    | 94501
    0

  • Hi Moz community, I have a website with a large database of 800+ important pages, and I want Google to know when people visit and stay on these pages. However, these pages are only accessible once people create an account with a password and sign in. I know that since these pages are password-protected, Google doesn't index them, but when our visitors stay for a while on our site browsing through our database, does this data get included in our CTR and bounce rate by Google? It is really important for SEO purposes that Google knows people are staying on our site for a while, so I wanted to know whether CTR gets measured even though these pages aren't crawled. Thanks for the help!!

    | danstern
    0

  • Hi there, we are going to switch our local domain oldsite.at to newsite.com in November. As our IT department wants to use newsite.com for email traffic until then, the domain newsite.com has to be publicly accessible, and it currently shows the default Apache page without useful content. The old domain has quite some trust; the new domain is a first-time registered domain (not known by search engines yet and not published anywhere; it was parked until now). I am aware of the steps to take for the switch itself, but: what do we do with the newsite.com domain until everything is prepared for the switch? I suppose users or search engines could find the domain, and as there is no useful information available, it may already be harming us. My idea was to 307 redirect newsite.com to oldsite.at, but the concern is that this could cause problems as soon as we switch the domain and redirect with a 301 from oldsite.at to newsite.com. Do you have any objections or other recommendations? Thank you a lot in advance.

    | comicron
    0
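
    A sketch of the temporary-redirect idea from the question, placed in the Apache config (or .htaccess) of newsite.com until the switch; a 302 works as well as a 307 here, and either can later be replaced by the permanent 301 in the opposite direction once oldsite.at moves over. The domains are the placeholders from the question:

        # holding rule on newsite.com: send all visitors to the live site for now
        Redirect 307 / http://www.oldsite.at/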

  • I am planning to allow guest posts on my website's blog, where authors and writers can submit articles. All content will be manually approved by me, but I would like to know whether guest posts will cause any harm to my site or whether they are a good way to generate content. Also, the guest posts will have do-follow links, and an author section will be included. So firstly, is this a good step or not? And secondly, what checks should I do before approving a guest post?

    | welcomecure
    0

  • Hello Mozzers, I had one small question to ask regarding app deep linking. I noticed websites like http://www.huffingtonpost.com/ and http://www.trulia.com/ only include app deep links within the desktop (www) versions of their websites, but do not include them on their mobile (m.) versions. Is this the best way to implement app deep links? Shouldn't websites include app deep links on both the mobile and desktop versions of their site? Any help or tips will be highly appreciated. Thank you Mozzers in advance.

    | Vsood
    2
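
    For context, a hedged sketch of how app deep links were typically declared per page under Google's App Indexing scheme of the era; the package name, iTunes ID, and paths below are hypothetical, and nothing in the format itself restricts the tags to desktop pages:

        <link rel="alternate" href="android-app://com.example.app/http/example.com/some-page" />
        <link rel="alternate" href="ios-app://123456789/http/example.com/some-page" />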

  • I am going to be merging two sites. One is a niche site, and it is being merged with the main site. I am going to be doing 301 redirects to the main site. My question is: what is the best way to redirect section/category pages in order to maximize SEO benefit? I will be redirecting product pages to product pages; this question only concerns sections/categories. Option 1: direct each section/category to the most closely matched category on the main site. For example, vintage-t-shirts would go to vintage-t-shirts on the main site. Option 2: point as many section/category pages as possible to a larger category on the main site with selected filters. We have filtered navigation on our site, so if you wanted to see vintage t-shirts, you could go to the vintage t-shirt category, OR you could go to t-shirts and select "vintage" under the style filter. In the example above, the vintage-t-shirt section from the niche site would point to the t-shirts page with the vintage filter selected (something like t-shirts/#/?_=1&filter.style=vintage). With option 2, I would be pointing more links to a main category page on the main site, so I would likely have that page rank higher, because more links are pointing to it. I may also have a better overall user experience, because if the customer decides to browse another style of t-shirt, they can simply unselect the filter and make other selections. Questions: which of these options is better as far as (1) SEO and (2) user experience? If I go with option 2, the drawback is that the page titles will all be the same (i.e. vintage-t-shirts pointing to the page with the filter selected would have "t-shirts" as the page title instead of the more targeted page title "vintage t-shirts"). I believe a workaround would be to pull filter values from the URL and append them to the page title, so that the page title for the URL t-shirts/#/?_=1&filter.style=vintage would be something like "vintage, t-shirts." Is this the appropriate way to deal with it? Any thoughts, suggestions, or shared experiences would be appreciated.

    | inhouseseo
    0

  • Hey All, so I know this isn't a standard robots.txt question; I'm aware of how to block or wildcard certain folders, but I'm wondering whether it's possible to block all URLs with a certain word in them. We have a client that was hacked a year ago, and now they want us to help remove some of the pages that were being auto-generated with the word "viagra" in them. I saw this article and tried implementing it, https://builtvisible.com/wildcards-in-robots-txt/, and it seems that I've been able to remove some of the URLs (although I can't confirm yet until I do a full pull of the SERPs on the domain). However, when I test certain URLs inside of WMT, it still says that they are allowed, which makes me think that it's not working fully, or not working at all. In this case, these are the lines I've added to the robots.txt: Disallow: /*&viagra Disallow: /*&Viagra I know I have the option of individually requesting URLs to be removed from the index, but I want to see if anybody has ever had success wildcarding URLs with a certain word in their robots.txt. The individual URL route could be very tedious. Thanks! Jon

    | EvansHunt
    0
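
    Building on the directives quoted above: the * wildcard matches any sequence of characters, so anchoring the pattern to a literal & will miss URLs where the word appears elsewhere in the path or query string. A broader sketch (robots.txt matching is case-sensitive, hence both spellings):

        User-agent: *
        # matches "viagra" anywhere in the URL path or query string
        Disallow: /*viagra
        Disallow: /*Viagra

    One caveat worth noting: robots.txt only blocks crawling, so URLs already in the index stay there until they drop out or are removed via WMT, which may explain the mixed test results.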

  • Hey everyone! In the interest of trying to be brief, here's the situation in my favorite form of communication, bullet points: the client has two sites, one in English and one in Japanese; each site is on a separate URL, with no sub-domains or sub-pages; each main page on the English version of the site has a link to the homepage of the Japanese site; the site has decent rankings overall, with room for improvement from page 2 to page 1; and no hreflang tags are currently used in links to the Japanese version from the English version. Given that the site isn't really suffering for most rankings, would hreflang be helpful to implement on the English version? Ideally, I'd like each link to be updated to point to the corresponding subject matter on the Japanese site, but in the interim it seems like identifying to Google that the page on the other side is in a different language might be helpful both to the user and maybe to help those rankings on page two creep a little higher to page one. Thanks for reading, I appreciate your time.

    | Etna
    0
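
    A minimal hreflang sketch for the setup described above; hreflang works across separate domains as long as the annotations are reciprocal. The URLs are placeholders:

        <!-- on an English page, also carried in mirror image on the Japanese equivalent -->
        <link rel="alternate" hreflang="en" href="http://www.example.com/page/" />
        <link rel="alternate" hreflang="ja" href="http://www.example.jp/page/" />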

  • Dear All, we have taken a product called Webacelator from our hosting company UKFast, and our IP address is changing. UKFast asked us to point DNS to a different IP in order to route traffic through Webacelator, which will enhance browsing speed. I am concerned: will this change affect our rankings? Your responses are highly appreciated.

    | tigersohelll
    0

  • About 20% of the 500 products in our shop have a manufacturer product description, which appears repeatedly on maybe 10 other sites and also on eBay and Amazon. The issue is just with our own brand. We have another brand website where we publish the same product descriptions as well (Google knows that we own both sites). The product descriptions were published on the shop website before anywhere else, and we rank number one for the products despite quite strong competition; we outrank our brand website as well. Which of the following options would you opt for: keeping the description and just adding some unique content, or replacing the complete product description with a new one?

    | lcourse
    0

  • Hi All, trying to figure out the best option here. I have a website that used to utilize a separate mobile site (m.xyz.com) but now uses responsive design. What is the best way to deal with the old mobile site? De-index it? 301 redirect it back to the main site for the rare case someone finds the m. site somewhere? Thanks! Ricky

    | RickyShockley
    0

  • Hi everybody, I noticed that a lot of websites prefer to use the first words of the page's content as the meta description.
    I, on the other hand, thought that Google would prefer the meta description to be a peek at what's inside.
    Can anyone explain which is better? Thanks 🙂

    | roeesa
    0
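
    For illustration, the two approaches side by side; the content below is invented. Hand-written summaries tend to earn better click-through, and Google rewrites any description it considers unhelpful either way:

        <!-- first-words-of-content approach -->
        <meta name="description" content="Posted on June 3rd by admin. In this article we will take a look at..." />

        <!-- hand-written preview approach -->
        <meta name="description" content="A practical guide to X: what it is, why it matters, and three ways to apply it." />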

  • I have a website that is not very pretty but has great rankings. I want to redesign the website, lose as few rankings as possible, and still clean up the navigation. What are the best practices? Thanks in advance.

    | JHSpecialty
    0

  • Hello, I am planning to upload a video to YouTube with a specific long-tail keyword as the title and the same keyword mentioned in the description. I believe that way it might show up in Google in the section where videos appear. Could this work well? Do you have any advice or comments on this sort of idea? Thanks a lot.

    | bidilover
    0

  • I have a client that is OBSESSED with KWP ranking (don't go there...I know). This client offers multiple services: dog boarding, dog grooming, dog training, dog daycare, and dog walking. Essentially these are our focus. She ranks on page one for all of these words (locally of course), BUT she wants to rank in positions 1 and 2 for all of them. Here's my rub: with her limited budget, we focus on 1 word (and associated long tails like "dog boarding in the South Loop") and it takes a couple of months to zoom up to positions 1 or 2 (not counting the map pack... she wants ORGANIC). While we're focusing on this 1 word, the others maintain their ranking or slip a few spots (like from 6 to 8). Conversions average about 1 a day; organic traffic is roughly 1,000 hits a month. In your opinion, is it better to split the focus between the 5 target words every month, more slowly building rankings but maintaining them for longer periods of time? Or do it the way we have been: chase dog boarding, then chase training, and so on? It just seems like we are CONSTANTLY chasing something while something else falls. Thanks, Tracy

    | lkilera
    0

  • I have a backlink that I found from http://onucdm.onu.edu/mt/hmlref/2010/10/see_below.html... it is a nofollow, so it doesn't seem to be doing much harm, but it is obviously spam. It is from a .edu, so I'm not sure what I should do. Do you think it's harming my site at all? In general, how do you determine what to disavow?

    | KevinViner
    0

  • Hi Mozzers, I feel I am facing a double-edged sword situation. I am in the process of migrating 4 domains into one and creating the URL redirect mapping. The pages I am having the most issues with are the event pages that are past due but carry some value, as they generally have one external followed link: www.example.com/event-2008 301 redirects to www.newdomain.com/event-2016, www.example.com/event-2007 301 redirects to www.newdomain.com/event-2016, www.example.com/event-2006 301 redirects to www.newdomain.com/event-2016. Again, these old events aren't necessarily important in terms of link equity, but they do carry some, and at the same time adding multiple 301s pointing to the same page may not be a good idea, as it could increase page load time and affect the new site's performance. If I add a 404, I will lose the bit of equity in those. Noindex,follow may work, since it won't index the old domain or the page itself, but I'm still not 100% sure about it. I am also not sure how a canonical would work, since it would keep the old domain live. At this point I am not sure which direction to follow. Thanks for your answers!

    | Ideas-Money-Art
    0
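
    For reference, the yearly event redirects above can be collapsed into one pattern rule on the old domain, so each old URL is a single 301 hop rather than a chain; a redirect served by the old domain adds nothing to the new site's own page load. A sketch using the placeholder paths from the question, assuming Apache:

        # .htaccess on example.com: map every old yearly event URL to the current event page
        RewriteEngine On
        RewriteRule ^event-20(0[0-9]|1[0-5])$ http://www.newdomain.com/event-2016 [R=301,L]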

  • I'd like to get some input from the Moz community about the domain name I use on a travel website I run as a hobby. I got heavily whacked by an update in September 2012, which some have said was because my site is an EMD. Others said it was because I had poor-quality backlinks (but in fact I hardly had any). With the benefit of hindsight, I'd love to know what really happened. The website is www.traveltipsthailand.com (now www.asiantraveltips.com) and the "brand" I use is "Travel Tips Thailand". The traffic penalty I incurred was around 80%, and despite a LOT of work overhauling the site and trying to build some better-quality links, I don't believe it has really recovered much. It ranks for non-competitive, low-traffic key phrases (which means it's not penalised as such), but struggles to rank anywhere meaningful for any phrase likely to drive traffic to the site. At this stage I really just want to know whether to persist with the site (it's heartbreaking, to be honest) or drop it and build something new from scratch. I monitor the site's progress using Moz Pro, so I can see all the search ranking, authority, and backlink data.

    | Gavin.Atkinson
    0

  • Our site uses WordPress. The code is somewhat heavy; the text-to-code ratio for the home page is only 16%. Our developer suggests that we modify the code so that the important text appears at the top of the page (without changing the design) so that Google can index it more easily. My developer feels this would be more beneficial for SEO, and believes that reducing the code outright could create HTML errors. The home page is www.nyc-officespace-leader.com. Is this approach sound? My developer describes it in the following manner: "Let me say that I don't believe the text-to-code ratio has a significant impact on SEO per se, but of course reducing code will reduce page weight, so it may help to improve ranking. See the homepage, for example; this is the top landing page of your site, therefore it is very relevant to optimize. You can see from the attached that the first block has very little content and too much code. There is almost nothing to do about it: visually it is a very good block; in terms of SEO it isn't. I do not recommend taking it off just for SEO, as that would leave pages with lots of text and a lack of images, and people may go away. On the other hand, in most cases where we want to improve the text-to-code ratio, there is a risk of unexpected bugs, because the code is being changed and this may affect functionality. I would suggest spending time improving the sort order of the important content inside the code, so we may end up with a similar text-to-code ratio, but the important content we need Google to index will be at the very top of the source code. In very technical terms, Google will find the key content faster, and that should help improve the crawling process, as search engines read HTML code linearly. This change will not necessarily affect the HTML; we can achieve it with the style sheet (CSS code) instead, reducing the chance of major bugs. Either way, it is our choice; we need to evaluate potential problems, code issues, and content impact, and we also need to apply the changes and wait at least 3-4 weeks to start seeing results. It is a long task. Let me know your thoughts about this; we will estimate a task to improve the code without affecting the web design."

    | Kingalan1
    0
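
    A minimal sketch of what the developer is proposing: keep the important text first in the HTML source and restore the visual layout with CSS, here via flexbox order (the class names are hypothetical):

        <div class="page">
          <main class="content">Important, crawlable text comes first in the source...</main>
          <div class="hero">Visually-first banner block</div>
        </div>

        <style>
          .page { display: flex; flex-direction: column; }
          .hero { order: -1; } /* paint the banner above the content without moving it in the HTML */
        </style>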

  • Hi! I'm checking a site that has something like a News section, where they publish posts, quite similar to a blog.
    They have a canonical URL pointing to page=1. I was thinking of implementing rel=next/prev plus a view-all page, and setting the view-all page as the canonical. But as this is not a category page on an ecommerce site, and it would have more than 100 posts within a year, it made me think that maybe the best solution would be the following: implement rel=next/prev and
    keep page 1 as the canonical version. I don't want to make users wait for such a big page to load (a view-all page with more than 100 elements would be too much, I think). What do you think about this solution? Thank you!

    | teconsite
    0
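
    A sketch of the rel=next/prev markup discussed above, as it would appear on page 2 of the series; for what it's worth, Google's guidance at the time paired rel=next/prev with a self-referencing canonical on each page rather than a canonical pointing to page 1. The URLs are placeholders:

        <!-- on http://www.example.com/news/page/2/ -->
        <link rel="canonical" href="http://www.example.com/news/page/2/" />
        <link rel="prev" href="http://www.example.com/news/" />
        <link rel="next" href="http://www.example.com/news/page/3/" />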

  • Which theme is more SEO-friendly and faster loading, both on desktop and mobile? http://demo.mythemeshop.com/blogging/2014/03/26/age-steel/ Or http://demo.tagdiv.com/newsmag/td-post-cruise-2015-swim-trend-blurred-lines/

    | Hall.Michael
    0

  • Hello, I have a quite specific long-tail keyword (a 4-part keyword) for which I would like to rank as high as possible, and other keywords would come automatically. I know there's a lot to doing it properly, but do you have any good tips to help me out? I have 4-5 different pages related to the keyword's product; would it be smart to optimize them all for the one keyword, or to optimize just one of those pages and leave the others with other information? I believe this is an important thing to decide. I know I could add the exact long-tail keyword, since it's related to the content, to titles, H1 headers, alt tags, file names, and the URL, but would it be smart to do that for the exact long-tail keyword on all those pages, or just one? This is a very important subject for my business, and any advice will be most highly valued. Many thanks

    | bidilover
    0

  • Hello, I have noticed that for a keyword with high competition, the top image search results include a not-that-popular Pinterest post, a Reddit post, Explorergram, YouTube, etc. The keyword is "24k gold iphone", and I am wondering if I could somehow create a Pinterest or Reddit post (or something similar) with images of my product and have it rank high for that keyword, since my own website does not rank well in image search for some reason... https://www.google.fi/search?q=24k+gold+iphone+6&source=lnms&tbm=isch&sa=X&ved=0CAcQ_AUoAWoVChMI1f2LkpTxxgIVhI8sCh1SGwjy&biw=978&bih=550#tbm=isch&q=24k+gold+iphone Thanks a lot

    | bidilover
    0
