
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Moz Pro

Discuss the Moz Pro tools with other users.

Subcategories

  • Chat keyword research strategy and how Keyword Explorer helps you do your best work.

  • Cover all things links and the industry-leading link data discoverable in Link Explorer.


  • Hey guys, I am about to begin optimising the site www.stuartandersonkitchens.co.uk for keywords such as "Edinburgh kitchens" and "quality kitchens Edinburgh", as the client is based in Edinburgh. I am also going to be doing some work behind the bones of the site, optimising alt text, meta data, microdata, etc. I am also thinking about creating Facebook, Twitter and YouTube profiles, adding him to a few directories (DMOZ etc.), some of them local, and running a small AdWords campaign using the freebie credits. I will be altering some of the content to make it more amenable to the engines. Is there any other advice anyone would like to give me on where I could improve in order to have him seen on the search engines? It would be great to hear your take on this. Thanks again, Craig

    | fenwaymedia
    0

  • I'm using the Keyword Difficulty tool to help me create a list of 5 keywords (out of approx. 50-60) to optimise pages for on a site. However, I don't want to just choose the top 5 if 3 of them are too competitive and not worth targeting. From anyone's experience, for a small, new web company with no pages optimised at this point, is there a keyword difficulty score I should treat as a hard limit? For instance, within a group of keywords, only targeting those with a difficulty score of 60 or below, because anything higher would be too difficult to optimise pages for at this stage of the site's development. Thanks in advance for your help, Michelle 🙂

    | artlivemedia
    0

  • In the crawl diagnostics for my campaign, the duplicate content warnings have been increasing, but when I look at the sample pages that SEOMoz says have duplicate content, they are completely different pages from the page identified. They have different Titles, Meta Descriptions and HTML content and often are different types of pages, i.e. product page appearing as having duplicate content vs. a category page. Anyone know what could be causing this?

    | EBCeller
    0

  • Both of them track keywords. In the Rank Tracker, you add each keyword manually and associate it with a URL. On the On-Page Optimization page, are the URLs generated automatically based on searches and traffic?

    | ehabd
    0

  • I noticed the scheduled Linkscape update was on August 1. My link report hasn't been updated since July 6. Did the index update occur on August 1? If not, when is it expected to occur? Thanks

    | larahill
    0

  • Hi all, I'm currently working through the laundry list of errors and warnings on our company's 24 websites. Due to the ridiculous number of on-page links and the sheer volume of products on our sites, much of the descriptive text is similar, following a strict pattern to best mention our USPs and the like. Of course we use a CMS, which means that all the pages look the same and draw this information from the style sheet. Anyway, to the problem at hand: I have been tasked with reducing the "error" count in the SEOmoz admin panel, the problem being that SEOmoz is reporting duplicate page content when the pages are different but similar products, for example 35, 45 and 55 litre refrigeration units. Is there a way I can specify what counts as duplicate content, or make the duplicate content report more restrictive, so that everything HAS to be the same for this error to show? Any help is much appreciated, thanks in advance.

    | cmuknbb
    0

  • Hi guys, I am thinking about running a competition on my website in order to drive traffic to it; however, the regulations that govern competitions on Facebook are convoluted, to say the least. I am under the impression that I can advertise the competition on Facebook by directing people to an external site. Does anyone have experience with this who can guide me, so I don't burn a well-established social profile with over 2,200 connections? Thank you for any help you can give me. Kind regards, Craig

    | fenwaymedia
    0

  • I have set up the campaign for the root domain; however, I'm receiving issues for both subdomains, demo and www. Our site is set to redirect all non-www traffic to www, except demo, which has its own rules. Is that normal? Should I contact the support team? Thanks

    | MilosMilcom
    0

  • I created a profile in April this year on CrunchBase for my company http://www.crunchbase.com/company/wallpapered but it is not appearing in the "inbound links" of Open Site Explorer. All the other companies I have checked in OSE have their CrunchBase profile in their inbound links (many share the same Page authority as mine). Any suggestions would be really helpful. Thanks

    | roberthseo
    0

  • Hello, I have had this service for my free month. I had over 150 errors on my website, which I fixed. My problem is that in doing this my traffic is actually worse, and I have now fallen completely off Google. From the start of my website I would slowly creep to within the top 3 pages of Google for the search terms "apron" and "aprons", then suddenly fall to somewhere around page 20. No one, even people I hired, has been able to explain this. I hear "Google changes, it takes time", etc., etc. It's been 5 years, and other, newer companies are on the top page in their first year. After working with this site for a month, I have looked as far as page 65 for "apron" and "aprons" and still can't find my site. And I know some people say Alexa is a gimmick, but now, instead of rising to 10 million and falling back down to 23 million, Alexa says there is no info regarding my website whatsoever. Any advice? I was excited to have this service, but I can't get my partner to agree to keep it if my month-long result is worse than I ever was before. And please don't say it takes time; 5 years is more than enough. Thank you in advance!

    | Gardengirl
    0

  • Hi, I've added a campaign to my account, with the first crawl taking around a week. The second crawl started 3 days 17 hours ago and is still running. Is this something that others have experienced? The campaign is tracking 5 keywords, and the site has 17 pages. Steve

    | stevecounsell
    0

  • Hi, my nickname JMHHACKER shows up with a domain score of 86, mozRank of 6, MozTrust of 6, 375k external links, 1.3 million total external links, 3.6 million total links, 43k linking root domains and 15k C-blocks. What on earth does all of that mean? It's all new to me.

    | Jmhhacker
    0

  • What is the point of collecting mozpoints?  I read that you are able to purchase features, but what other perks are there with collecting mozpoints?

    | ReadyArtwork
    0

  • As the title states - We've recently developed two sites for clients - within the last 4 months or so. With the Google PR update, both sites are sitting as PR 5 sites. I've tried to have a look in the OSE for the backlink profile of both websites, but I see nothing.  Even Majestic SEO's fresh index doesn't provide much info. The DA of each site is 11-16. I would really love to see what's generating the link juice to these sites. Any ideas? The two sites are: https://bfore.co.za
    http://ictjournalafrica.net

    | Mark.RedGiant
    0

  • The "Crawl Diagnostics Summary" reports these as duplicates, but in truth they are the same pages with and without the www. What can I do to get rid of this error?

    | arteweb2
    0

  • Hi all, Rogerbot has been reporting errors on our websites for over a year now, and we correct the issues as soon as they are reported. However, I have 2 questions regarding the recent crawl report we got on the 8th. 1) Pages with a "noindex" tag are being crawled by Roger and are being reported as duplicate page content errors. I can ignore these, as Google doesn't see these pages, but surely Roger should ignore pages with "noindex" instructions as well? Also, these errors won't go away in our campaign until Roger ignores the URLs. 2) What bugs me most is that resource pages that have been around for about 6 months have only just been reported as duplicate content. Our weekly crawls have never picked up these resource pages as a problem; why now, all of a sudden? (Makes me wonder how extensive each crawl is.) Has anyone else had a similar problem? Regards, GREG

    | AndreVanKets
    0

  • Hiya, I'm looking for some advice. I have a page which the on-page optimization tool shows as an A grade, and Google has indexed it (I have checked via site:). However, it is not being found in search results, even for an exact match on the page title, which is very specific. I believe the page may be being penalised for over-optimisation? Any advice would be great! The URL is www.tots-away.com/child-friendly-holidays-spain/

    | iprosoftware
    0

  • A new campaign of mine is supposed to update its keyword rankings on Wednesdays but as of Thursday morning it didn't happen. I've got the original stats from the first few keywords I added when the campaign was set up, but nothing for the keywords that were added in the next day. Previous experience with this tool was pretty reliable so I'm just wondering if anyone is experiencing the same thing.

    | ninjaprecision
    0

  • Hi all, I've been waiting some days for the third crawl of my sites, but SEOmoz only crawled 277 pages. The following appeared in my crawl report: Pages Crawled: 277 | Limit: 250. My last 2 crawls had a limit of about 10K. Any idea? Kind regards, Simon.

    | Aureka
    0

  • My crawl diagnostics report has suddenly dropped from 10,000 pages to just 250. I've been tracking and working on an ecommerce website with 102,000 pages (www.heatingreplacementparts.co.uk), and the history for this was showing some great improvements. Suddenly the crawl diagnostics report today is showing only 250 pages! What has happened? Not only is this frustrating to work with, as I was chipping away at the errors and warnings, but my graphs for reporting to my client are now all screwed up. I have a Pro plan and nothing has (or should have!) changed.

    | eseyo
    0

  • I am looking for tool suggestions that assist in keeping track of problem URLs, the actions taken on them, and tracking and testing a large number of errors gathered from many sources. What I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google WMT, Bing WMT, Screaming Frog) and import them into a centralized DB that shows me all of the actions that need to be taken on each URL, while at the same time removing duplicates, since each tool finds a significant amount of the same issues. Example case: SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 and example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (so terminates in a 404). When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found. I would also like to see historical information on the URL, e.g. if I have written redirects to it (to fix a previous problem), or if it used to be a broken page (i.e. 4XX or 5XX error) and is now fixed. Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow at updating their issues summary, I would like to not import duplicate issues (so the tool should recognise that the URL is already in the DB and that the issue has been resolved). Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me. Bonus bonus for any tool that is smart enough to check incoming issues and mark them as resolved (for instance, if a URL has a 403 error, it would check on import whether it still resolves as a 403; if it does, it is added to the issue queue, and if not, it is marked as fixed). Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, and the duplicates created by using multiple tools? (A rough sketch of the merge/dedupe idea follows this post.) Thanks!

    | prima-253509
    0
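
    A note on the merge/dedupe idea described above: no off-the-shelf tool is named in this thread, but the core requirement (one record per URL, with issues deduplicated across exports) is easy to prototype. Below is a minimal Python sketch; the "URL" and "Issue" column names are assumptions and will need mapping to each tool's actual export format.

```python
import csv
from collections import defaultdict

def load_export(path, url_col="URL", issue_col="Issue"):
    """Yield (url, issue) pairs from one tool's CSV export.
    Column names are assumptions -- map them per tool."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get(url_col, "").strip().rstrip("/")
            issue = row.get(issue_col, "").strip().lower()
            if url and issue:
                yield url, issue

def merge_exports(paths):
    """Central index: one entry per URL, issues deduplicated across tools."""
    index = defaultdict(lambda: {"issues": set(), "status": "open"})
    for path in paths:
        for url, issue in load_export(path):
            index[url]["issues"].add(issue)  # the same issue from two tools collapses
    return index

if __name__ == "__main__":
    idx = merge_exports(["seomoz.csv", "gwt.csv", "screamingfrog.csv"])
    for url, data in sorted(idx.items()):
        print(url, "->", ", ".join(sorted(data["issues"])))
```

    Persisting the index to SQLite and probing each URL with an HTTP client before re-adding an issue would cover the "mark as resolved on import" bonus, at the cost of a slower import.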

  • How many keywords/domains can I track using this tool on the Pro membership? Looking at the tool itself, it looks like 100 but in the 'help' section it states up to 300?

    | dentaldesign
    0

  • Recently I read a post here on the SEOmoz blog about competitive backlink analysis (http://www.seomoz.org/blog/guide-to-competitive-backlink-analysis). Could anyone please show me how to retrieve the following two metrics from my SEOmoz account? 1) Number of linking root domains using EXACT match anchor text. 2) Percentage of anchors with an EXACT match anchor. Thank you in advance!

    | KostasKostalampros
    0

  • Howdy folks, we've been a PRO member for about 24 hours now and I have to say we're loving it! One problem I am having, however, is with a CSV exported from our crawl diagnostics summary. The CSV contains all the data fine, but I am having problems with it when a URL contains a comma. I am making a little tool to work with the CSVs we download, and I can't parse them properly because URLs sometimes contain commas and aren't quoted the same way as other fields, such as meta_description_tag, are. Is there something simple I'm missing, or is it something that can be fixed? (A tolerant-parsing sketch follows this post.) Looking forward to learning more about the various tools. Thanks for the help.

    | Safelincs
    0
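
    A note on the comma problem above: if the URL field really is the only unquoted column, one tolerant workaround is to fold any surplus fields back into it, using the header to decide how many columns a row should have. A Python sketch, assuming the URL sits in the first column (adjust url_index otherwise):

```python
import csv

def parse_crawl_csv(path, url_index=0):
    """Tolerant parse for exports where the URL column may contain unescaped
    commas. Assumes the URL is the only field that can overflow and that it
    sits at url_index (first column by default) -- both are assumptions."""
    rows = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        expected = len(header)
        for fields in reader:
            if len(fields) > expected:
                overflow = len(fields) - expected
                # glue the spilled pieces back onto the URL field
                url = ",".join(fields[url_index:url_index + overflow + 1])
                fields = fields[:url_index] + [url] + fields[url_index + overflow + 1:]
            rows.append(dict(zip(header, fields)))
    return rows
```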

  • I started running a SERP analysis report two days ago, and it still says it's "in progress." It is 5:40 PM Pacific on August 7 as I type this, and I started running the report on: August 05, 2012 10:03 (copied and pasted from the report). Is this normal? Thanks!

    | ScottShrum
    0

  • Hello, my website is split into 2 main categories: Sci/Tech (which has 4 sub-categories) and Gadgets (which has 2 sub-categories). The crawl diagnostics tool shows "Duplicate Page Title" errors on the Gadgets sub-categories, while there are no errors on Sci/Tech. I don't really know how to get rid of these errors. Does anyone have a solution?

    | MighteeObvious
    0

  • Hi guys, our SEOmoz campaign report is returning a lot of rel=canonical issues similar to this for each page. The non-slash version redirects to the trailing-slash version, but how do I get the ones with search parameters, i.e. '?datefrom&nights', to redirect as well? (A URL-normalisation sketch follows this post.) http://www.lamangaclubresort.co.uk/accommodations/las-brisas-78
    http://www.lamangaclubresort.co.uk/accommodations/las-brisas-78/
    http://www.lamangaclubresort.co.uk/accommodations/las-brisas-78/?datefrom&nights
    http://www.lamangaclubresort.co.uk/accommodations/las-brisas-78/?datefrom=&nights= Any help would be welcome, thanks

    | JohnTulley
    0
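
    On the parameterised URLs above: the usual fix is a rel=canonical tag or a server-side 301, and the exact rule depends on the CMS or web server in use. As a sketch only, here is the normalisation such a rule would implement (drop query parameters with empty values, force a trailing slash), written in Python:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url):
    """Drop query parameters with empty values (e.g. ?datefrom&nights or
    ?datefrom=&nights=) and force a trailing slash on the path -- the mapping
    a 301 rule or rel=canonical tag would encode."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True) if v]
    path = parts.path if parts.path.endswith("/") else parts.path + "/"
    return urlunsplit((parts.scheme, parts.netloc, path, urlencode(kept), ""))

# canonicalize("http://www.lamangaclubresort.co.uk/accommodations/las-brisas-78?datefrom=&nights=")
# -> "http://www.lamangaclubresort.co.uk/accommodations/las-brisas-78/"
```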

  • About how long does it take Google WMT to refresh stats on "Links to Your Site"?  We're dealing with an unnatural link/anchor phrase issue and I'm curious as to the "typical" time it takes for Google to recognize the links removal or Anchor Text change. Any refresh time ideas on OpenSiteExplorer or AHREFS as well would be a plus... Thanks! Dan Using this guide (very helpful thanks SEOmoz!) http://www.seomoz.org/blog/identifying-link-penalties-in-2012

    | MTteam
    0

  • Our site is about 2 million pages deep, 50% of which is stale content. Yes, I know: OMG, #unhygienic. Even if we get approval to get rid of half of it, SEOmoz Pro Elite only crawls 20k pages deep. What can I do to crawl and diagnose the whole site? Are there any tools anyone can suggest? SEOmoz?

    | ilhaam
    0

  • OK, I've been making use of the free Linkscape API (on behalf of a client of mine), trying to get links (and info on those links) to a specific domain/page/etc. NOTE: I've been using it without any issue in the past; however, we are currently facing some weird issues. Let's take this simple query as an example: http://lsapi.seomoz.com/linkscape/links/wikipedia.org?SourceCols=4&TargetCols=4&Sort=page_authority&Scope=page_to_domain What this one supposedly does is get links to "wikipedia.org", right? I'm reading: "The Page_to_* scopes will by default return 25 links per source domain if no limit is specified, so you can see domain diversity. Due to space limitations in our API, a general link query for a given page will return at most 25 pages for every unique domain linking to that page." And I'm saying OK, that's fine. The thing is that (instead of the 1,000 links I had been getting before), I'm now getting just 25 links, not per "source domain" but obviously per "target domain" (= wikipedia.org) - or am I missing something? (Well, probably Wikipedia suddenly has just about 25 links pointing to it... makes sense! 🙂) Please let me know what's going on with the above, because getting just 25 links is close to worthless. (A paging sketch for this endpoint follows this post.) Thanks a lot in advance!

    | drkameleon
    0
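
    For reference on the query above, a hedged sketch of calling the same endpoint from Python and paging through results. The signed-auth parameters (AccessID, Expires, Signature) and the Limit/Offset paging are taken from the Linkscape/Mozscape API documentation of that era and should be treated as assumptions to verify against the current docs:

```python
import base64
import hashlib
import hmac
import time

import requests  # third-party: pip install requests

ACCESS_ID = "member-xxxxxxxx"    # your Linkscape API credentials
SECRET_KEY = "your-secret-key"

def signed_params(expires_in=300):
    """Signed-auth query parameters: a base64 HMAC-SHA1 of the AccessID,
    a newline, and the Expires timestamp, keyed with the secret."""
    expires = int(time.time()) + expires_in
    to_sign = f"{ACCESS_ID}\n{expires}".encode()
    sig = base64.b64encode(hmac.new(SECRET_KEY.encode(), to_sign, hashlib.sha1).digest())
    return {"AccessID": ACCESS_ID, "Expires": expires, "Signature": sig.decode()}

def fetch_links(target, limit=25, offset=0):
    """One page of links for the target, mirroring the query in the post.
    Limit/Offset paging is an assumption based on the docs of the time."""
    params = {
        "SourceCols": 4, "TargetCols": 4,
        "Sort": "page_authority", "Scope": "page_to_domain",
        "Limit": limit, "Offset": offset,
        **signed_params(),
    }
    resp = requests.get(f"http://lsapi.seomoz.com/linkscape/links/{target}", params=params)
    resp.raise_for_status()
    return resp.json()

# e.g. the second page of 25: fetch_links("wikipedia.org", limit=25, offset=25)
```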

  • Hello, I'm having a problem with the data I pull from Open Site Explorer. Every time I download a report as CSV and open it in Excel 2007, all the information looks like this: http://i.imgur.com/rwMxO.png What can I do to extract the information exactly as it appears in Open Site Explorer, with all the fields in the right place? (A delimiter-conversion sketch follows this post.) Thanks guys. Regards, Pedro Pereira

    | PedroM
    0
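
    On the Excel symptom above: everything landing in column A is often a locale mismatch, where Excel expects a semicolon rather than a comma as the list separator. Excel's Text Import wizard (choosing comma as the delimiter) usually fixes it; failing that, a small conversion step works. A Python sketch:

```python
import csv

def convert_delimiter(src, dst, out_delimiter=";"):
    """Re-write a comma-delimited OSE export using another delimiter, for
    Excel locales that expect ';' as the field separator (one common cause
    of every field landing in column A)."""
    with open(src, newline="", encoding="utf-8") as fin, \
         open(dst, "w", newline="", encoding="utf-8") as fout:
        writer = csv.writer(fout, delimiter=out_delimiter, quoting=csv.QUOTE_MINIMAL)
        for row in csv.reader(fin):
            writer.writerow(row)

# convert_delimiter("ose_export.csv", "ose_export_semicolon.csv")
```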

  • Hi everyone, I found this site and it looks like a great place for us to start. We have just opened the doors to our new website, fenwaymedia (that's a dot co dot uk). Now, I have tried to maintain the most minimalistic design throughout, but I believe with this come certain drawbacks - namely SEO - which is why I am here. My thought was that I would draw my clients from the Web Design, Web Marketing and Blog pages, which (starting in 3 hours) I will be constantly updating, but I'm not sure I'm doing the right thing. I am trying to target Edinburgh, Scotland, although we would take work from anywhere. If anyone could spare us a moment, I would greatly appreciate any help, tips or comments that might help us attain some visibility. Thanks in advance and thanks for looking. Kind regards, Craig

    | fenwaymedia
    0

  • From July 30th to the 31st, many of my keyword rankings in Google plummeted 30 spots or more. Has anyone else noticed this on their site? Does anyone have any ideas about the presumed penalty? I can PM my site information if needed for further assistance. As always, thank you for the assistance; any help in this matter is greatly appreciated!

    | BethA
    1

  • Hello, we have a serious issue with 404s and recently saw our Moz page rank fall from 53 to 47. 1) OSE inbound links no longer show any of our LinkedIn posts; did LinkedIn stop passing juice? 2) Does SEOmoz reduce your ranking when there's a sudden increase in 404s? 2a) WP Yoast SEO: I accidentally checked the box in this plugin to "Strip the category base (usually /category/) from the category URL", which basically caused all of our blog post categories and Datafeedr categories to disappear. I didn't realize until too much time had passed that I had accidentally clicked that box. Datafeedr is a plugin for our estore that parses the data feeds from affiliate vendors and allows you to create a saved search that auto-updates old products every 3 days. I had a noindex/follow parameter on the category items, but seeing the number of 404s continue to increase, I temporarily removed this parameter last week to see if it reduces this now static number of 404s. Google Webmaster Tools started showing a ton of soft 404s that kept increasing, while SEOmoz didn't show any of those 404s. I didn't pay much attention to GWT, since Google kept saying it won't affect our rankings and nothing was showing up in SEOmoz. Last week a fraction of those 404s showed up, and I am not sure whether that's what lowered our Moz rank, or whether it was a possible delinking from LinkedIn and a higher-ranking complementary website directly related to our field, itsallaboutyoga. Looking at the Moz graph of "Total Linking Root Domains Over Time", all of our competitors took a similar percentage hit between June and the end of July, so I am thinking it's more broad-based than a fat-fingered mistake. I fixed #2 (I still have to figure out what to do with most of those 404s; I'm thinking of submitting a request to Google vs. thousands of 301s), so in reviewing this sequence of events and using it as a learning experience, where would I assign maximum destructive value, as a percentage? A. Ignoring GWT soft 404s in favor of SEOmoz campaign reports. B. No follow from LinkedIn and the related industry site. C. Datafeedr: thousands of indexed products through Datafeedr that are no longer available, mostly due to the WP Yoast SEO fat-finger error. D. WP Yoast SEO: "Strip the category base (usually /category/) from the category URL". E. A global Google algorithm change. Cheers, Michael

    | MKaloud
    0

  • Apologies if this has already been answered a million times and/or if I'm posting this in the wrong place... I did look, but couldn't find the answer anywhere... I just realized that all this time I have been tracking www.mydomain.com in my campaign for a long time now, while I should probably have been tracking *.mydomain.com. I wanted to change this, but it looks like I can't. If I have to set up a whole new campaign, with all of the same keywords, competitive sites to track, etc., that will take forever. Why can't I just change it so that my campaign starts tracking *.mydomain.com instead of www.mydomain.com from now on? Thank you!

    | ScottShrum
    0

  • I'm new to SEOMoz and trying to find something like SEMRush for finding what keywords competitors use. Is there anything like that within SEOMoz?

    | seanuk
    0

  • Hi, every time I try to create an advanced report with Open Site Explorer, it gets to around 4-6k links and then starts to finalize the report; however, it says I have 750k links pointing at my root domain. I have not used any of the page/domain authority filters. Any ideas why this could be cutting me off? Kyle

    | kyleNeedham
    0

  • Does SEOMoz archive that information somewhere?

    | CreateForLess
    0

  • Hi, before my time, company big.com took over company small.com. They decided to replicate big.com web pages onto small.com, so both websites have identical pages and copy, just different domains. Within the small.com sitemap.xml they list only big.com URLs. They are also using the big.com Google Analytics tracking code on small.com. I have no idea what happened to the original content on small.com or whether they put 301 redirects in place. I am thinking: do a 301 redirect of the small domain to the big domain. A) Agree? The small domain is likely to have valuable historic inbound links which are now going to 404 pages. After I do the 301, should these then appear in the big.com SEOmoz campaign and in the big.com Webmaster Tools for me to fix? B) Views? Or should I set up Webmaster Tools on small.com and fix that first? C) Views? Many thanks in advance guys, sorry it's a long statement! Richard

    | Richard555
    0

  • The Moz keyword difficulty score considers four factors for the top-ranking websites. Are on-page factors included in the Page Authority data?

    | iQuanti
    0

  • Has anyone ever experienced a domain authority of 1, 0 links for web 2.0 websites like tumblr, wordpress, weebly, etc? I know it's normal to see a page authority of 1 if there are no links to a url or if opensite explorer hasn't been updated. Is this some kind of bug with opensite explorer?

    | theanglemedia
    0

  • Hi, I recently set up my SEOmoz account, so any help would be great! On the competitive domain analysis page, the subdomain metrics appear as good as (if not better than) the domain's: mozRank and MozTrust are better on the subdomain. I know the website copy is appearing on both the non-www and the www versions (which I assume is what is referred to as the subdomain on the competitive domain analysis page). Should I now 301 the www site to the non-www, which will then concentrate the SEO, so that SEOmoz only shows the root domain metrics? Many thanks for your help in advance!

    | Richard555
    0

  • Hi guys, I've noticed in my crawl reports that some URL's seem to have inline Javascript in them... The JS doesn't work, but it does cause the link to 404. I'm not sure where the links have come from - it's only affecting really old blog posts made before my time here. I'm contemplating deleting them... Here's an example: | http://www.evoenergy.co.uk/blog/author/aaron/page/76/ window.open('http%3A/www.lime.com/redirect/pubs.acs.org/'); void(0) | Any help would be appreciated! Thanks

    | tomcraig86
    0

  • I've had an issue with too many links on the site: my drop-down menu, secondary footer and footer. The report told me that I had 253 links on each page. I then made my secondary footer dynamic and ran a crawl, and my link count reduced accordingly to 201. Then I made the footer dynamic and ran a crawl, and my link count increased to 1,500. This also happened between each phase but then went away. Oddly enough, my domain authority increased, as did other factors in the crawl report. This too-many-links thing is driving me crazy. Please provide some guidance.

    | CHADHARRIS
    0

  • SEOmoz tool reports show lots of duplicate content where there are HTTP header directives in place on the pages to eliminate dupes. Googlebot obeys them, but Roger the robot doesn't. Are header directives such as X-Robots-Tag and Link (rel=canonical) supported by OSE/Linkscape? I'd like to put my mind and my clients' at ease. (A quick header-check sketch follows this post.) Thanks

    | Mediatorr
    0
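
    Whatever the crawler honours, it's worth confirming the header directives are actually being served. A quick check sketch using the requests library (the URL below is a placeholder):

```python
import requests  # third-party: pip install requests

def check_header_directives(url):
    """Print the HTTP-header equivalents of meta robots and rel=canonical:
    the X-Robots-Tag header and a Link: <...>; rel="canonical" header."""
    resp = requests.get(url, allow_redirects=True, timeout=10, stream=True)
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "(not set)"))
    link = resp.headers.get("Link", "")
    canonical = [p.strip() for p in link.split(",") if 'rel="canonical"' in p]
    print("Canonical Link header:", canonical or "(not set)")
    resp.close()

# check_header_directives("http://example.com/some-page")
```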

  • I see a lot of answers to questions in here that I would consider pretty, umm, well, not all that great. Not that I am one to judge, but when I ask a question I am looking for answers from experienced professionals. How do you decide which answers to take more seriously? I check the person's profile and follow it all the way out to their site and blogs (if listed). Sometimes I see sites that have an SEOmoz score of 10. How can you give advice when your site has no ranking? I know there are going to be exceptions, and that this isn't the one and only indicator, but how do you decide how much confidence you can put in an answer here?

    | MBayes
    3

  • I ran an advanced report to show me all the backlinks pointing to a domain. When I go to many of the domains listed, I can't find the link. I've searched the pages by anchor text in the browser and nothing comes up. Anyone know why this would be?

    | PatioLifeStyle
    0

  • Is there any way to search for the best local keywords for my local salon business in NSW (Australia)? I tried searching for keywords using the Google AdWords keyword tool, selecting the country as Australia, but there is no way to find keywords for a specific area or a few cities in NSW.

    | Visiblics
    0

  • Hi all, I keep getting the following error when trying to add my Facebook page.  It worked fine in the past and has suddenly stopped working: The webpage at https://graph.facebook.com/oauth/authorize?client_id=142287725855094&redirect_uri=http%3A%2F%2Fpro.seomoz.org%2Fcampaigns%2F173488%2Fsocial%2Fcreate%2Ffacebook%2F127833296954.html&scope=read_stream%2Cuser_videos%2Cuser_photos%2Cuser_photo_video_tags%2Cmanage_pages%2Cread_insights has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer. I've tried clearing cache and deleting cookies.  Any other ideas I would try? Thanks!

    | kenc138
    0

  • Hi everyone, I've posted before on this and I'm still not satisfied that the answer is correct. Basically, my ranking report is saying that hardly any of my keywords are ranking in the top 50 of Google UK. Now, I appreciate that rankings can be affected by my search history, but I've tried other computers and other IP addresses, logged out of Google, cleared the cache, etc., and I'm still seeing the terms I want to rank for on the first and second pages of Google UK. What is going on? My boss is wanting to see progress with these terms. I'm confident that we're ranking well, but the report is saying otherwise.

    | columbus
    0
