Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Complex URL Migration
Hi there, I have three separate questions which are all related. Some brief background: my client has an adventure tourism company that takes predominantly North American customers on adventure tours to three separate destinations: New Zealand, South America and the Himalayas. They previously had these sites on their own domains, with the destination in the URL (e.g. sitenewzealand.com). Two of the three domains had good age and lots of incoming links. This time last year a new web company was brought in and convinced them to pull all three sites onto a single domain, with the sites under subfolders (e.g. site.com/new-zealand). They built a brand new site on a Joomla platform. Unfortunately the new sites have not performed and halved the previous call-to-action rates. Organic traffic was not adversely affected by this change, however it hasn't grown either. I have been overhauling these new sites with a project team and we have made usability/marketing changes that have the conversion rate nearly back to where it originally was, while keeping the new design (and the CMS) in place. We have recently made programmatic changes to the Joomla system to push the separate destination sites back onto their original domains. My first question is around whether this was technically a good idea. Question 1: Does our logic below add up, or is it flawed? The reasons we decided to migrate the sites back onto their old domains were: We assumed that, with the majority of searches containing the actual destination (e.g. "New Zealand"), all other things being equal the domain www.sitenewzealand.com is likely to attract a higher click-through rate than www.site.com/new-zealand. Having "newzealand" in the actual URL would provide a rankings boost for target keyword phrases containing "new zealand".
We also wanted to create the consumer perception that we are specialists in each of the destinations we service, rather than having a single site which positions us as a "multi-destination" global travel company. Two of the old sites had solid incoming links, and very few new links have been acquired for the single domain over the past 12 months. It was also assumed that with the sites on their own domains, the theme of each site would be completely destination-specific, rather than a single multi-destination site diluting that destination relevance. We assume this will also help us rank better for destination-specific search phrases (which account for 95% of all target keyword phrases). The downsides of this approach were that we were splitting content across three sites instead of one, with a presumed associated drop in overall authority. The other major one was the disruption that a relatively complex domain migration could cause. Opinions on the logic we adopted for deciding to split these domains out would be highly appreciated. Question 2: We migrated the folder-based destination-specific sites back onto their old domains at the start of March. We were careful to thoroughly prepare the .htaccess file to ensure we covered all the new redirects needed, and to point the old redirects directly at the new pages. The structure and content of each site remained the same across the destination-specific folders (e.g. site.com/new-zealand/hiking became sitenewzealand.com/hiking). To split out the sites while keeping a single instance of Joomla, we wrote custom code to dynamically rewrite the URLs. This worked as designed. Unfortunately, however, a Joomla component which dynamically creates the Google sitemaps had not had any code changes, got confused, and started serving up a heap of URLs which never previously existed.
This resulted in each site having 1,000 to 2,000 404s. It took us three weeks to work this out and put a fix in place. This has now been done; we are down to zero 404s for each site in GWT and have proper Google sitemaps submitted (all done 3 days ago). In the meantime, our organic rankings and traffic began to decline around 5 days after the migration, and after 10 days had dropped from around 700 daily visitors to around 300. It has remained at that level for the past 2 weeks with no sign of recovery. Now that we have fixed the 404s and submitted accurate sitemaps to Google, how long do you think it will take to see an upward trend again, and how long is it likely to take to get back to similar organic traffic levels compared to pre-migration (if at all)? Question 3: The owner of the company is understandably nervous about the overall situation. He is wishing right now that we had never made the migration. If we decided to roll back to what we previously had, are we likely to cause further recovery delays, and would traffic come back to previous levels in a reasonably quick time frame? A huge thanks to everyone for reading what is quite a technical and lengthy post, and a big thank you in advance for any answers. Kind Regards
| activenz | Conrad
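For reference, the folder-to-domain redirects described above are typically handled with mod_rewrite rules on the old combined domain. A minimal sketch, assuming Apache and using hypothetical domain names based on the poster's examples:

```apache
# .htaccess on the old combined domain (site.com) -- domain names are illustrative
RewriteEngine On

# Send each destination folder to its own domain, preserving the rest of the path
RewriteRule ^new-zealand/(.*)$ http://www.sitenewzealand.com/$1 [R=301,L]
RewriteRule ^south-america/(.*)$ http://www.sitesouthamerica.com/$1 [R=301,L]
RewriteRule ^himalayas/(.*)$ http://www.sitehimalayas.com/$1 [R=301,L]
```

As the poster notes, any pre-existing redirects should point straight at the final destination URL so visitors and crawlers never pass through a chain of hops.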
Canonical Rel .uk and .au to .com site?
Hi guys, we have a client whose main site is .com but who has a .co.uk and a com.au site promoting the same company/brand. Each site is verified locally with a local address and phone but when we create content for the sites that is universal, should I rel=canonical those pages on the .co.uk and .com.au sites to the .com site? I saw a post from Dr. Pete that suggests I should as he outlines pretty closely the situation we're in: "The ideal use of cross-domain rel=canonical would be a situation where multiple sites owned by the same entity share content, and that content is useful to the users of each individual site." Thanks in advance for your insight!
| wcbuckner
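For anyone in the same situation, the cross-domain canonical Dr. Pete describes is just a standard canonical tag whose href points at the other domain. A sketch, with a placeholder URL:

```html
<!-- In the <head> of the duplicated article on the .co.uk and .com.au sites -->
<link rel="canonical" href="http://www.example.com/universal-article/" />
```

Note this consolidates ranking signals on the .com page; it is worth weighing against hreflang if the country pages are meant to rank separately in their own markets.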
Geo-targeting
Hi all, If I had a global domain but with local country pages on it, i.e. xxxx.com/uk/xxxx, xxxx.com/usa/xxxxx, xxxxx.com/au/xxxx, what's the best way to ensure that the relevant country gets the relevant pages? I.e. the /uk/ pages show in the UK, /usa/ pages in the USA, /au/ pages in Australia, etc. Is this a Google Webmaster Tools setting? Thanks!
| Diana.varbanescu
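For subfolder-based country targeting like this, Google Webmaster Tools does let you geo-target each subfolder individually (adding /uk/, /usa/ etc. as separate profiles), and hreflang annotations signal which version belongs to which country. A sketch using the poster's placeholder URLs:

```html
<!-- In the <head> of each country version of a page -->
<link rel="alternate" hreflang="en-GB" href="http://xxxx.com/uk/page/" />
<link rel="alternate" hreflang="en-US" href="http://xxxx.com/usa/page/" />
<link rel="alternate" hreflang="en-AU" href="http://xxxx.com/au/page/" />
```

Each version should carry the full set of alternates, including a reference to itself.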
[E-commerce] Duplicate content due to color variations (canonical/indexing)
Hello, We currently have a lot of color variations of multiple products with almost the same content. Even with our canonicals set, Moz's crawling tool flags them as duplicate content. What we have done so far: choosing the best-selling color variation (our "master product"), and adding a rel="canonical" to every variation (with our "master product" as the canonical URL). In my opinion, this should be enough to address the issue. However, given that Moz still flags it as duplicate, I was wondering if there is something else we should do? Should we add "noindex,follow" to our child products and "index,follow" to our master product? (That sounds like a heavy change to me.) Thank you in advance
| EasyLounge
.net has replaced our .com rankings - what the heck?
We have a www.domain.net that forwards to www.domain.com. About 5 days ago, when searching for our brand term, I noticed that www.domain.net took the top position, and most of our www.domain.com rankings have dropped. Any ideas what could cause something like this to happen?
| crapshoot
Joomla duplicate content
My website report says http://www.enigmacrea.com/diseno-grafico-portafolio-publicidad and http://www.enigmacrea.com/diseno-grafico-portafolio-publicidad?limitstart=0 have the same content, so I have duplicate pages. The only difference is the ?limitstart=0 parameter. How can I fix this? Thanks in advance
| kuavicrea
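One common fix for this particular parameter is a 301 that strips ?limitstart=0 at the server level, since that variant is byte-for-byte identical to the clean URL. A sketch, assuming Apache with mod_rewrite:

```apache
# Redirect ?limitstart=0 duplicates back to the clean URL
RewriteEngine On
RewriteCond %{QUERY_STRING} ^limitstart=0$
RewriteRule ^(.*)$ /$1? [R=301,L]
```

The trailing ? in the target drops the query string. A rel=canonical on the paginated views pointing at the clean URL is a gentler alternative if the redirect breaks pagination.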
How to properly implement HTTPS?
We are looking at implementing HTTPS for our site. I have done a little research but can't find anything recent, http://moz.com/community/q/duplicate-content-and-http-and-https is the most recent thing I found. Does everything in the answers still apply? Should I just do a 301 redirect to all new https? Or add a canonical tag?
| EcommerceSite
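If the whole site moves to HTTPS, the usual pattern is a single sitewide 301 rather than canonical tags alone. A sketch, assuming Apache with mod_rewrite:

```apache
# Force HTTPS with one 301 hop
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Canonical tags on the HTTPS pages should then also reference the HTTPS URLs, so the two signals agree rather than contradict each other.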
Stolen website content
Hello, recently we had a lot of content written for our new website. Unfortunately my partner and I have gone our separate ways, and he has used all my unique content on his own website: all our product descriptions, about us etc. He simply changed the name of the company. He has agreed to take the content down, so that I can now put this content on our new website, which is currently being designed. Will Google see this as duplicate content, as it has been on a website before, even though the content has been removed from the original website? I was worried as the content is no longer "fresh", so to speak. Can anyone help me with this?
| Alexogilvie
Best way to handle page filters and sorts
Hello Mozzers, I have a question about the best way to handle filters and sorts with Googlebot. I have a page that returns a list of widgets. I have a "root" page about widgets, and then filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns 3 red widgets at the top and 7 non-red widgets at the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so obviously Google views each of these as a separate URL. Right now we don't do anything special for Google, but I have noticed in the SERPs that sometimes if I search for "Widgets", my "Widgets" and "Widgets - Blue" pages both rank close to each other, which tells me Google basically (rightly) thinks these are all just pages about Widgets. Ideally, though, I'd just want to rank with my "Widgets" root page. What is the best way to structure this setup for Googlebot? I think it's maybe one or more of the following, but I'd love any advice: put a rel canonical tag on all of the pages with parameters and point it to the "root"; use the Google parameter tool and have it not crawl any URLs with my parameters; put a meta robots noindex on the parameter pages. Thanks!
| jcgoodrich
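The first option in that list, a canonical from every filtered/sorted URL back to the root, can be sketched like this (URLs hypothetical):

```html
<!-- In the <head> of /widgets?color=red, /widgets?sort=size, etc. -->
<link rel="canonical" href="http://www.example.com/widgets" />
```

This keeps the parameter pages usable for visitors while consolidating ranking signals on the root page.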
Noindex Valuable duplicate content?
How could duplicate content be valuable, and why question noindexing it? My new client has a clever African safari route builder that you can use to plan your safari. The result is hundreds of pages that have different routes. Each page inevitably has overlapping content / destination descriptions; see these examples: http://isafari.nathab.com/routes/ultimate-tanzania-kenya-uganda-safari-july-november http://isafari.nathab.com/routes/ultimate-tanzania-kenya-uganda-safari-december-june To the point: I think it is foolish to noindex something like this. But is Google's algo sophisticated enough not to be triggered by something like this?
| Rich_Coffman
Should I redirect my xml sitemap?
Hi Mozzers, We have recently rebranded with a new company name, and of course this necessitated relaunching our entire website onto a new domain. I watched the Moz video on how they changed domain, copying what they did pretty much to the letter. (Thank you, Moz, for sharing this with the community!) It has gone incredibly smoothly. I told all my bosses that we might see a 40% reduction in traffic / conversions in the short term. In the event (and it's still very early days) we have in fact seen a 15% increase in traffic, and our new website is converting better than before, so an all-round success! I was just wondering if you thought I should redirect my XML sitemap as well? So far I haven't, but despite us doing the change of address thing in Webmaster Tools, I can see Google processed the old sitemap XML after we did the change of address. What do you think? I know we've been very lucky with the outcome of this rebrand, but I don't want to rest on my laurels or get tripped up later down the line. Thanks everyone! Amelia
| CommT
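If the old sitemap URL still gets requests, one low-risk option is to 301 it to the new one so any straggling crawls resolve cleanly. A sketch, assuming Apache and placeholder domain names:

```apache
# On the old domain: point the retired sitemap at the new one
Redirect 301 /sitemap.xml http://www.newdomain.com/sitemap.xml
```

Keeping the new sitemap submitted in Webmaster Tools for the new domain then covers the forward-looking side.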
My Website Has a Google Penalty, But I Can't Disavow Links
I have a client who has definitely been penalized: rankings dropped for all keywords, and hundreds of malicious backlinks show up when checked with WebMeUp. However, when I run the backlink portfolio in Moz, or any other tool, they don't appear anywhere, and all the links are dead when I click the actual URLs. That being said, I can't disavow links that don't exist, and they don't show up in Webmaster Tools, but I KNOW this site has been penalized. Also, I noticed this today (attached). Any suggestions? I've never come across this issue before.
| 0102345
SEO Implications of Moving Blog to Subdomain
Hello, We are having some issues upgrading our stack and maintaining Wordpress for our blog. So we are thinking about splitting them up. What are the SEO implications of moving our blog to a subdomain? Our blog URL structure is currently something like https://www.aplossoftware.com/blog/p/2470/fund-accounting/yearend-closing-checklist/. We would like to change to something like https://blog.aplossoftware.com/p/2470/fund-accounting/yearend-closing-checklist/
| stageagent
Any reasons why social media properties are ranking higher than the site's own name?
The site below has social media properties and other sites coming up before its own listing, even for an exact search of the site name. Any ideas why this is happening? Link Any input is appreciated.
| SEO5Team
Multiple Author Rich Snippets On A Q&A Forum Page
Hi, I work on a site that has a robust q&a forum. Members post questions and other members answer the questions. The answers can be lengthy, often by experts with Google+ pages and almost always by multiple member/commenters answering a particular question. Much like Moz's forum here. In order to get rich snippets results in search for a single Q&A page, what would happen if each of, for instance, 10 commenters on a page, were tagged as author? After all, the q/a forum pages have many authors, each as author of their own comments. Or, should I pick one comment out of many and call that member/commenter the author or something else? If it matters, the person asking the question in the forum is almost always not the expert providing a ton of detailed content. Also, a question might be 8 words. One answer might be 25 to 500 or more and their might be 5 to 10 different answers. Thanks! Cheers... Darcy
| 94501
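One way to express multiple answer authors on a single page is schema.org's Question/Answer vocabulary, where each answer carries its own author. A rough microdata sketch (the names and text are placeholders; whether search engines choose to display rich snippets for it is up to them):

```html
<div itemscope itemtype="http://schema.org/Question">
  <h1 itemprop="name">Question title here</h1>
  <div itemprop="suggestedAnswer" itemscope itemtype="http://schema.org/Answer">
    <div itemprop="text">First expert's answer...</div>
    <span itemprop="author" itemscope itemtype="http://schema.org/Person">
      <span itemprop="name">Commenter One</span>
    </span>
  </div>
  <!-- repeat one suggestedAnswer block per commenter -->
</div>
```

This sidesteps having to pick a single "author" for the page: the question and each answer are separate items with their own attribution.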
Graphs - Interactive HTML5 or Image?
Hi, Every once in a while I have a need to add a graph / chart to my site. Google offers a nice HTML5 chart builder, and so do other web apps. The question is: which should I use? A pinnable and sharable image graph, or an interactive HTML5 graph? Thanks
| BeytzNet
Credit Links on Client Websites
I know several people have asked this, but a lot of those threads were back in 2012, before many of the Google changes. My question is the same though: with all the changes to Google's algorithm, is it okay to put your link at the bottom of your clients' websites, like "Web Design by" etc.? Part of the reason is to drive traffic, but also, if someone is actually interested in who designed the website, they will click it. But now, reading about how badly links can hurt you, it makes me second-guess whether this is okay. My gut feeling says no.
| blackrino
How do you reduce duplicate content for tags and categories in Wordpress?
Is it possible to avoid a duplicate content error without limiting a post to only one category or tag?
| Mivito
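A common pattern is to leave the posts themselves indexable but mark tag and category archives noindex,follow, so a post can live in many taxonomies without the archive pages competing in the index. The archive pages would then emit something like this (most WordPress SEO plugins can output it per archive type):

```html
<!-- Output only on tag/category archive pages -->
<meta name="robots" content="noindex,follow">
```

The follow directive keeps link equity flowing through the archives to the posts even though the archives themselves stay out of the index.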
URL Parameter Being Improperly Crawled & Indexed by Google
Hi All, We just discovered that Google is indexing a subset of our URLs with our analytics tracking parameter embedded. For the search "dresses" we are appearing in position 11 (page 2, rank 1) with the following URL: www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp?cm_mmc=Email--Anthro_12--070612_Dress_Anthro-_-shop You'll note that "cm_mmc=Email" is appended. This is causing our analytics (CoreMetrics) to mis-attribute this traffic and revenue to Email vs. SEO. A few questions: 1) Why is this happening? This is an email from June 2012 and we don't have an email-specific landing page embedded with this parameter. Somehow Google found and indexed this page with these tracking parameters. Has anyone else seen something similar happening?
2) What is the recommended method of "politely" telling Google to index the version without the tracking parameters? Some thoughts on this:
a. Implement a self-referencing canonical on the page.
- This is done, but we have some technical issues with the canonical due to our ecommerce platform (ATG). Even though the page source code looks correct, Googlebot is seeing the canonical with a JSession ID.
b. Resubmit both URLs in the WMT Fetch feature, hoping that Google recognizes the canonical.
- We did this, but given the canonical issue it won't be effective until we can fix it.
c. URL handling change in WMT
- We made this change, but it didn't seem to fix the problem
d. 301 or noindex the version with the email tracking parameters
- This seems drastic and I'm concerned that we'd lose ranking on this very strategic keyword. Thoughts? Thanks in advance, Kevin
| kevin_reyes
Leverage Browser Caching: Do I need Last-Modified?
Per Page Speed recommendations, I specified the Expires header in my .htaccess file. Do I need to add code for Last-Modified too? I thought I read somewhere that it will put the date next to the meta description in the SERPs, which might cause the result to seem outdated after a while. Are there any problems that could crop up if these aren't implemented correctly?
| kimmiedawn
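For what it's worth, Apache normally sends Last-Modified for static files automatically based on the file's timestamp, so usually only the Expires side needs explicit configuration. A minimal sketch, assuming mod_expires is available (lifetimes are illustrative):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

The main risk of getting this wrong is over-caching: a far-future Expires on files that change frequently means returning visitors may see stale CSS or JS until the cache expires, which is why versioned filenames are often paired with long lifetimes.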
What are the ranking factors for "Google News"? How can we compete?
We have a few sport news websites that are picked up by Google News. Once in a blue moon, one of our articles ranks for a great keyword and shows in one of the 3 listings that Google News has in SERPS. Any tips on how we can we optimise more of our articles to compete in these 3 positions?
| betnl
Does Bing support cross-domain canonical tag?
Hi folks, We are planning to implement a cross-domain canonical tag for a client, and I'm looking for information on Bing supporting the cross-domain canonical tag. Does anyone know if there was a public announcement made by Bing or any representative about support for this tag? BTW, the best info I've found is a Q&A here on Moz about it (http://moz.com/community/q/does-bing-support-cross-domain-canonical-tags), but I'm looking for information from Bing on the topic.
| fabioricotta-84038
How to fix issues from 301s
Case: We are currently in the middle of a site migration from .asp to .NET and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits. Issue: My assumption is that many of the internal links (from pages which are now 301ed as well) to these primary landing pages still point to the old versions in Google's cache, and thus have not passed importance and internal juice to the new versions. There are no navigational links or entry points to the old supporting pages left, and I believe this is what is driving the decline. Proposed resolution: I intend to create a series of HTML sitemaps of the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (not as sitemaps, just normal pages) with the selection to index all linked pages. My intention is to force Google to pick up all of the 301s, thus enforcing the authority channels we have set up. Question 1: Is the assumption that the decline could be because of missed authority signals reasonable? Question 2: Could the proposed solution be harmful? Question 3: Will the proposed solution be adequate to resolve the issue? Any help would be sincerely appreciated. Thank you in advance, David
| FireMountainGems
Should we provide dofollow outbound links from our website?
Hello, I am a little confused about providing dofollow links from our website. We have a social shopping website where users can create catalogs of their favorite products by bookmarking them from other websites, so our website might have thousands of outbound links. The confusion is whether we should make these links "nofollow" or "dofollow". As per my understanding, dofollow links will pass juice to other websites, but on the other hand it might benefit us as well: sellers might come and bookmark their products to get dofollow links. I read somewhere that if we have quality outbound links around a topic, Google treats us as a hub for that topic. But I am not clear whether we only get that advantage when the links are dofollow? Please help.
| saurabh1905
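Mechanically, nofollowing the user-submitted bookmarks is just a rel attribute on each outbound anchor. A sketch with a placeholder URL:

```html
<a href="http://sellersite.example.com/product-123" rel="nofollow">Product name</a>
```

Since the links are user-generated and uncurated, nofollow is also the conservative choice for avoiding association with any spammy destinations users might bookmark.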
Rich Snippets stopped showing up in SERPS
I had some rich snippets (recipes and stars) showing for my site, but in the last few days they have gone. Has anyone had this happen, and if so, what did you do to get them back? An example URL: http://www.gourmed.gr/syntages/pestrofa-sto-tigani-synodeyomeni-me-sauvignon-2003-karypidis Everything seems OK in the Google Structured Data Testing Tool. Any thoughts on why?
| canonodigital
Mobile version of my sites: What is better?
What is the best approach to make my sites mobile-ready, in terms of SEO? Is it better to create a subdomain called "m.mydomain.com" and redirect mobile users to that domain with a lite version of the sites? Or is it better to keep the same domain as the desktop version, "mydomain.com", and use a WordPress theme that fits all devices, for example the Twenty Fourteen WordPress theme, which adapts to each device? I see that most big sites use an "m.mydomain.com" subdomain for the mobile version; however, I don't see the sense in creating a subdomain when you can just use an adaptive WP theme on the main domain. Any insight please? Thanks!
| BloggerGuy
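If the separate m. subdomain route is chosen, Google's documented setup for separate mobile URLs is a pair of annotations linking the two versions, so the duplicate content is understood as one page (URLs here are the poster's placeholders):

```html
<!-- On the desktop page, www.mydomain.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.mydomain.com/page" />

<!-- On the mobile page, m.mydomain.com/page -->
<link rel="canonical" href="http://www.mydomain.com/page" />
```

A responsive theme avoids this bookkeeping entirely, since there is only one URL per page, which is a large part of its appeal.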
How to tell the date a link was created
Does anybody know of a website that can let you know when an external link was created to a site? Or any other way of finding this info out. Thanks
| RobSchofield
Interlinking from unique content page to limited content page
I have a page (page 1) with a lot of unique content which may rank for "Example for sale". On this page I interlink to a page (page 2) with very limited unique content, but a page I believe is better for the user, with the anchor "See all Example for sale". In other words, the 1st page is more like a guide with items for sale mixed in, whereas the 2nd page is purely a "for sale" page with almost no unique content, but very engaging for users. Questions: Is it risky that I interlink with "Example for sale" to a page with limited unique content, as I risk not being able to rank either of these 2 pages? Would it make sense to "noindex, follow" page 2, as it has limited unique content and is actually a page that exists across the web on other websites in different formats (it is real estate MLS listings), while still keeping the "Example for sale" link leading to page 2, without risking losing ranking of page 1 for the "Example for sale" keyword phrase? I am basically trying to work out the best solution to rank for "Keyword for sale", and the dilemma is that page 2 is best for users but is not a very unique page, while page 1 is very unique and OK for users but mixes writing, pictures and properties for sale.
| khi5
Outbrain Select SEO Implications
A member of our marketing team wants to use Outbrain Select to curate content to augment the original content we have on our site (http://www.outbrain.com/select/how). This content would be taken from other pages and shown as though it is on our site, via JavaScript. Obviously this page would be set up as noindex. This seems to me like something Google would frown on. Does anyone know the SEO implications of using a tool like this? I'm concerned Google will see links to a blank noindex page and find it suspect.
| LyntonWeb
Wordpress site, MOZ showing missing meta description but pages do not exist on backend
I've got a client's WordPress website, and Moz keeps showing missing meta descriptions. When I look at the pages, these are nonsense pages; they do exist somewhere, but I am not seeing them on the backend. Questions: 1) How do I fix this? Maybe it's a rel canonical issue? 2) Why is it referring to "nonsense" pages? When I go to such a page there is nothing on it except maybe an image or the headline; it's very strange. Any input out there is greatly appreciated. Thank you
| SOM24
Using pictures from another domain
We are building several sites for several clients which will use images from the manufacturer. Our dev team wants to insert the manufacturer's URL for the images instead of downloading each image and hosting it on our server. There are thousands of images, so downloading them to our server would be time-consuming, and we are looking for a shortcut; however, I'm concerned this will cause other issues. Is using manufacturersdomain.com/12345.jpg going to cause SEO issues? Will this trigger Google penalties? Since we are not able to control the image file names, we cannot optimize them. We will add alt text and a title tag for each image, but the file names are random characters. How important is the file name for SEO?
| Branden_S
One word Keywords
Hey, as you know, as SEOs we always optimize for keywords of at least 2 words. Say I'm trying to optimize a page for terms like "man clothing, man london clothing, man great collection, man stylus collection", and as you can guess, I optimize the page for these keywords by putting them in the title, heading tags and body.
So my question is: what if Google takes the "man" phrase from my 2-word keywords and treats it as my keyword? (I mean, what if Google thinks my keyword is "man", because as you can see "man" appears in all of the keywords.)
And what if Google then thinks the density of "man" is around 20%, which is an astronomical number? Sorry for my bad English.
| atakala
Best way to start a fresh site from a penalized one
Dear all, I was dealing with a penalized domain (Penguin, Panda): hundreds of spammy links (disavowed with no success), thin content partially resolved, and so on. I think the best way forward is to start a fresh new domain, but we want to use some of the well-written content from the old (penalized) site. To do this, I will mark the source (penalized) pages as NOINDEX and move the content to the new fresh domain. Question: do you think this is a safe approach, or do you know another strategy? I'd appreciate your point of view. Thank you
| SharewarePros
Any recommended hosting company?
Any recommended hosting company, or a good package to buy (shared hosting, VPS hosting, or dedicated hosting)? Which one should I buy to help with website ranking?
| AlexanderWhite
Low Page Authority on existing blog articles. Any ideas to improve it?
I'm managing a blog that has a lot of articles with Page Authority 1. I have already checked with the On-Page Grader that these articles are Grade A, so their on-page SEO structure is solid. I would like ideas on how to raise the Page Authority of these existing, already-written articles: what changes can effectively be made to get their Page Authority higher? Thanks in advance and regards, Jorge Pascual
| goperformancelabs
TLDs vs ccTLDs?
*Was trying to get this question answered in another thread but someone marked it as "answered" and no more responses came. The question is about best practices for TLDs vs ccTLDs. I have a .com TLD with DA 39 that redirects to the localized ccTLDs .co.id and .com.sg, which have DA 17. All link building has been done for the .com TLD. In terms of content, it sometimes overlaps, as the same content shows up on both ccTLDs. What is best practice here? It doesn't look like my ccTLDs are getting any juice from the TLD. Should I just combine my ccTLDs into my TLD as subdomains? Will I see any benefits? Thanks, V
| venkatraman
Homepage not ranking in Google AU, but ranking in Google UK?
Hey everyone, My homepage has not been ranking for its primary keyword in Google Australia for many months now. Yesterday, when I was using a UK proxy and searching via Google UK, I found my homepage/primary keyword ranked on page 8 in the UK. In Australia my website ranks on page 6, but with other pages of my website (and it always changes from page to page). Previously my page was popping up at the bottom of page 1 and on page 2. I've been trying many things and waiting weeks to see if they had any impact, for over 4 months, but I'm pretty lost for ideas now, especially after what I saw yesterday in Google UK. I'd be very grateful if someone who has had the same experience has suggestions on what I should try. I did a small audit on the page, and because the site is focused on one product and features the primary keyword, I took the following steps to try to fix the issue: I noticed the developer had added H1 tags in many places on the homepage, so I removed them all to make sure I wasn't getting an over-optimization penalty. Cleaned up some of my links because I was not sure if this was the issue (I've never had a warning within Google Webmaster Tools). Changed the title tags/H tags on secondary pages so they don't feature the primary keyword as much. Made some pages 'noindex' to try to remove the emphasis on the secondary pages. Resubmitted my XML sitemaps to Google. Just recently claimed a local listing in Google (still need to verify) and fixed up citations of my address/phone numbers etc. (however it's not a local business; it sells Australia-wide). Added some new backlinks from AU sites (only a handful though). The only other option I can think of is to replace the name of the product on secondary pages with a different abbreviation, to make sure the keyword isn't featured there.
Some other notes on the site: When I do a 'site:url' search, my homepage comes up at the top. The site sometimes ranked for a secondary keyword on the front page in specific locations in Australia (but going to a localised city page); I've noindexed these as a test to see if something with localisation is messing it around. I have links from AU, but I also have links from .com and elsewhere. Any tips or advice would be fantastic. Thanks
| AdaptDigital
Unknown factors affecting our SEO effort
Good morning / afternoon / evening all, We are continually working on our website, www.movingeverywhere.co.uk. It suffered drastic drops in rankings with the last 2 Google algorithm updates, which we have been working to resolve. This has involved: Redesigning the website (responsive now), increasing speed, reducing code, better UX and a generally better all-round experience for the user. Signing up to Moz and resolving any issues which have been highlighted (hopefully fixed the last ones today). Investigating our inbound link profile to try to weed out any bad incoming links or any links that were damaging the site. Increasing our social network profile and reach. We have done competitor analysis and we are beating all of our competitors on on-site factors as per Moz results, but it appears we are missing something, which means we are not reaping the fruits of our efforts at the moment. The site is WordPress, and we read there could be a canonical issue with WordPress sites. We are asking the Moz community for any guidance and assistance to try to diagnose any negative factors affecting the SEO effort on the site. Thank you for your time and help.
| wtfi
Internal page links and possible penalties
If one looks at a page on our client's website (http://truthbook.com/urantia-book/paper-98-the-melchizedek-teachings-in-the-occident, for example), there are a huge number of links in the body of the page. All internal links are normal links; all external links are rel="nofollow" class="externallink". We have two questions: 1. Could we be penalized by Google for having too many links on these pages? Would this show in our Webmaster Tools reports? 2. If we are being penalized, can we keep the links (and avoid the penalty) if we made the internal links rel="nofollow" class="externallink" as well? We need these internal links to help people use these pages as an educational tool, which is why these pages also have audio and imagery. Thank you
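For reference, the markup difference in question is just the rel attribute. A minimal sketch, using one of the page's own internal paths and a made-up external URL:

```html
<!-- Internal link: currently left followable -->
<a href="/urantia-book/paper-98-the-melchizedek-teachings-in-the-occident">Paper 98</a>

<!-- External link: marked nofollow, as the site currently does -->
<a href="https://example.com/source" rel="nofollow" class="externallink">External source</a>
```

The class attribute is purely cosmetic; only rel="nofollow" changes how search engines treat the link.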
| jimmyzig
What's up with this website?
cybercig.co.uk is languishing around positions 150-200 in the rankings, very rarely making it above 70, but it also ranks on the first page for "Refillable Electronic Cigarette". Any ideas what's happening? It doesn't have a huge number of links, but I'd have thought it would've been much higher. I'd love to hear opinions 🙂
| jasondexter
Should I use subdomains to organise content and directories?
I'm working on a site that has directories for service providers and content about those services. My idea is to organise the services into groups, e.g. Web, Graphic, and Software Development, since they are different topics. Each subdomain (hub) has its own sales pages, directory of service providers, and blog content. E.g. the web hub has web.servicecrowd.com.au (hub home), web.servicecrowd.com.au/blog (hub blog), and http://web.servicecrowd.com.au/dir/p (hub directory). Is this overkill, or will it help in the long run when there are hundreds of services like dog grooming and DJing? It seems better to have separate subdomains and unique blogs for groups of services and content topics.
| ServiceCrowd_AU
Any solutions for implementing 301s instead of 302 redirects in SharePoint 2010?
We have an issue with Google indexing multiple copies of each page in our sitemap (www.upmc.com). We've tried using rel="canonical", but it appears that Googlebot is not honoring our canonicals. Specifically, any of the pages Google indexes that end without a file extension such as .aspx are 302 redirected to an .aspx page. Example: the following URLs all respond as 302 redirects to http://www.upmc.com/services/pages/default.aspx - http://www.upmc.com/services/, http://www.upmc.com/services, http://www.upmc.com/Services/, http://www.upmc.com/Services. Has anyone been able to correct this inherent issue with SharePoint so that the redirects are at least 301s?
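Since SharePoint 2010 issues those 302s at the application level, one common workaround is to intercept the request earlier with the IIS URL Rewrite module and issue the 301 there, before SharePoint sees it. A hedged web.config sketch - it assumes the URL Rewrite module is installed, the rule name is made up, and the pattern covers only the /services example from above:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Permanently redirect the extensionless section URL to the canonical .aspx page -->
      <rule name="ServicesCanonical301" stopProcessing="true">
        <!-- ignoreCase catches /Services and /Services/ as well -->
        <match url="^services/?$" ignoreCase="true" />
        <action type="Redirect" url="/services/pages/default.aspx" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

You would need one rule (or a broader pattern) per section that exhibits the 302 behavior.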
| Jessdyl
Magento SEO firm
I'm looking for an SEO company that has substantial experience with the Magento shopping cart system. I've gone through Moz.com's Recommended List, but I'm unsure who specializes in Magento. Thanks.
| UncleXYZ
SEO for interior page
Is it possible to be penalized on an interior page but not the whole website? Here's why I ask: I have a page, www.thesandiegocriminallawyer.com/domestic-violence.html, that is not ranking well (page 21 of Google) while the rest of the site ranks well (between pages 1 and 3). I checked the link profile in Open Site Explorer, Ahrefs, and Majestic SEO but can't find any problems. I have also checked the HTML code, CSS, and keyword optimization, but can't find any problems there either. Can anyone give me insight into why this might be happening? Of course, I'm working under the assumption that this page SHOULD rank higher for "San Diego Domestic Violence Attorney" - at least higher than page 21.
| mrodriguez14400 -
Is it a bad idea to use our meta description as a short description of a product on that product page?
Does this count as duplicating content even though the meta description has no effect on search results?
| USAMM
3 WordPress sites and 1 Tumblr site coming under 1 domain (4 subdomains) on WPMU: proper redirect?
Hey guys, witnessSF.org (WP), witnessLA.org (Tumblr), witnessTO.com (WP), witnessHK.com (WP), and witnessSEOUL.com (new site, no redirects needed) are being moved over to sf.ourwitness.com, la.ourwitness.com, and so forth - all under one large WordPress MU instance. Some have hundreds of articles/links, others a bit less. What is the best method to take? I understand there are easy blanket redirects and the completely manual one-link-at-a-time approach. Even WP to WP, the permalinks are changing from domain.com/date/post-name to domain.com/post-name. Here are some options: 1) Just redirect all previous witnessla.org/* URLs to la.ourwitness.com/ (automatically pointing all pages to the home page - easiest, but not the best). 2) Download the top redirected URLs from Google Analytics (about 50 URLs have significant rankings and traffic in LA's sample) and manually redirect just those to their correct new locations (most bang for the buck for the articles that rank). 3) The best of both worlds may be possible - perhaps automated? I prefer working with .htaccess over a redirect plugin for speed reasons. Please advise. Thanks guys!
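For the WP-to-WP moves, option 3 can be approximated with two rewrite rules per old domain in .htaccess: one that strips the date from the permalink, and a catch-all for everything else. A rough sketch for witnesssf.org - the /YYYY/MM/DD/ date pattern is an assumption based on the "domain.com/date/post-name" structure described, so adjust it to the actual permalink format:

```apache
RewriteEngine On

# Old dated permalinks: /2013/05/01/post-name -> new subdomain's /post-name
RewriteRule ^(\d{4})/(\d{2})/(\d{2})/(.+)$ http://sf.ourwitness.com/$4 [R=301,L]

# Everything else (home page, categories, etc.): map path-for-path
RewriteRule ^(.*)$ http://sf.ourwitness.com/$1 [R=301,L]
```

Any posts whose slugs changed during the move would still need individual rules on top of this, which is where the Analytics top-50 list from option 2 comes in.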
| vmialik
301 redirect for page 2, page 3 etc of an article or feed
Hey guys, we're looking to move a blog feed we have to a new static URL page. We are using 301 redirects, but I'm unsure of what to do regarding page 2, page 3, etc. of the feed. How do I make sure those URLs are being redirected as well? For example: moving FloridaDentist.com/blog/dental-tips/ to a new page URL, FloridaDentist.com/dental-tips. We are using a 301 on that old URL to the new one. My question is what to do with the other pages, like FloridaDentist.com/blog/dental-tips/page/3. How do we make sure that page is also 301'd to the new main URL?
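One way to handle the paginated URLs without listing each one is a single pattern-based rule instead of one-to-one 301s. A hedged .htaccess sketch, assuming Apache and the URL structure in the example:

```apache
RewriteEngine On

# Matches /blog/dental-tips, /blog/dental-tips/, and /blog/dental-tips/page/N,
# sending them all to the new static URL with a permanent redirect
RewriteRule ^blog/dental-tips(/page/\d+)?/?$ /dental-tips [R=301,L]
```

If the new page keeps its own pagination, you could instead capture the page number and map it across (e.g. `/dental-tips/page/$1`) rather than collapsing everything onto the main URL.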
| RickyShockley
Competitors Showing in Branded Search w/in Google FR
Hi all, when searching for our brand in Google France, I noticed that some of our major competitors show up beneath our Knowledge Graph listing. My managers and I are wondering if this is something Google just does as associated search or if there's a way we can work around it. Thanks, and please see the attached image 🙂
| CSawatzky
Revisiting the dangers of PR Newswires
Hi, I've just been researching backlinks and newswires. One newswire told me they put nofollow links on news releases, but some of their network would reproduce the news release without the nofollow links. I suspect the network includes some not-so-brilliant websites, so even if I just use the URL rather than anchor text for backlinks, I'm thinking there is probably still a risk. The other seemed to have more control over its network, from what they said - they didn't auto-syndicate and used nofollow - but I still suspect there's a risk that your news release will end up on not-so-good websites. Has anybody out there recently experienced problems with the backlinks produced by newswire services? Beyond that, your general input would be welcome too.
| McTaggart
Indexing Dynamic Pages
http://www.oreillyauto.com/site/c/search/Wiper+Blade/03300/C0047.oap?make=Honda&model=Accord&year=2005&vi=1430764 How is O'Reilly getting this page indexed? It shows up in organic results for [2005 honda accord windshield wiper size].
| Kingof50