
Moz Community,
Some of the most talented SEO experts are part of this community, which is why I came here to find assistance. I am a Partner at https://3forty3.com/, an executive search firm based in San Francisco. At 3FORTY3, we work with the top VC firms in Silicon Valley, serving venture-backed and growth-stage clients. We have placed marketing executives at 8 of the top 25 Forbes Cloud 100 companies.
We are looking for an SEO consultant to help us with our website. Our phase 1 budget is $5K-$10K, and it would grow after phase 1. If interested, please email me directly at [email protected].
Thanks,
Jon Schepke
Partner, 3FORTY3 Executive Search
https://www.linkedin.com/in/jonschepke/

Hi All
Thank you in advance for any help.
Previously we were sending all keyword traffic to our homepage, targeting the main keyword 'garden rooms' plus seed keywords such as 'garden studios', 'garden offices', etc.
We created 8 new pages, one for each combination of seed keyword and location (four services across two locations), and these went live on May 12th. The pages are indexed by Google.
The issue is that all searches, except for 'garden annex brighton', still return the homepage rather than the new location/service pages, and now that it's July 27th, it seems enough time has gone by.
We've set up this post to ask: what can we do to reinforce to Google that we want the service pages listed in the SERPs and not the homepage? (A diagnostic sketch follows the list below.)
Here is the list of new pages:
- garden offices brighton
- garden offices sussex
- garden gyms brighton
- garden gyms sussex
- garden annexes brighton
- garden annexes sussex
- garden studios brighton
- garden studios sussex
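The signals that usually decide which page Google picks can be spot-checked from the outside: each service page should have a unique, keyword-focused title and a self-referencing canonical, and the homepage should link to each page with descriptive anchor text. A minimal diagnostic sketch in Python with requests and BeautifulSoup — the URLs below are hypothetical placeholders, not the actual site's paths:

```python
# A diagnostic sketch, not the poster's actual setup: substitute the real
# homepage and the eight new service-page URLs for the placeholders below.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

HOMEPAGE = "https://example.com/"  # hypothetical
SERVICE_PAGES = [
    "https://example.com/garden-offices-brighton/",  # hypothetical paths
    "https://example.com/garden-studios-sussex/",
]

def audit(url):
    """Return the title and canonical URL declared by a page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(missing)"
    link = soup.find("link", rel="canonical")
    return title, (link["href"] if link else "(missing)")

# 1. Each service page should carry a unique, keyword-focused title and a
#    self-referencing canonical (not one pointing at the homepage).
for url in SERVICE_PAGES:
    title, canonical = audit(url)
    print(f"{url}\n  title:     {title}\n  canonical: {canonical}")

# 2. The homepage should link to each service page with descriptive anchor
#    text, so crawlers see the new pages as the intended landing pages.
home = BeautifulSoup(requests.get(HOMEPAGE, timeout=10).text, "html.parser")
targets = {u.rstrip("/") for u in SERVICE_PAGES}
for a in home.find_all("a", href=True):
    href = urljoin(HOMEPAGE, a["href"]).rstrip("/")
    if href in targets:
        print(f"homepage links to {href} with anchor {a.get_text(strip=True)!r}")
```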
Many Thanks

Hello,
I have been publishing a good number of blog posts on my site, Flooring Flow. However, a video viewport error has been appearing on some of my articles.
I have tried fixing it, but the error still shows in Google Search Console.
Can anyone help me fix it?
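Assuming this is GSC's common "Video outside the viewport" report, the usual culprits are a missing responsive viewport meta tag or an embed with a fixed pixel width that overflows on mobile. A quick diagnostic sketch (Python, requests + BeautifulSoup; the article URL is a hypothetical placeholder):

```python
# Check each article for a responsive viewport meta tag and flag embedded
# videos with fixed pixel widths, which can push them outside the viewport
# on mobile. The article URL is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup

ARTICLES = ["https://flooringflow.example/some-article/"]  # hypothetical

for url in ARTICLES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    viewport = soup.find("meta", attrs={"name": "viewport"})
    print(url)
    print("  viewport meta:", viewport.get("content") if viewport else "MISSING")
    for tag in soup.find_all(["video", "iframe"]):
        width = tag.get("width", "")
        if width.isdigit() and int(width) > 480:
            print(f"  fixed-width embed ({width}px): {tag.get('src', '(inline)')}")
```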

I have a competitor website named Richmond Locksmith 24/7. How can I check its outbound links in Moz?
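Moz's Link Explorer is oriented around inbound links; for outbound links, one option is simply to parse the competitor's pages yourself. A minimal sketch, with a hypothetical placeholder URL standing in for the competitor's actual domain:

```python
# List every link on a page that points off-site, i.e., its outbound links.
# The URL is a hypothetical placeholder for the competitor's real domain.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

page = "https://richmond-locksmith.example/"  # hypothetical placeholder
host = urlparse(page).netloc
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    href = urljoin(page, a["href"])
    target = urlparse(href).netloc
    if target and target != host:  # off-site, i.e., an outbound link
        print(href)
```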

I have a number of pages on the 4xx error report that show a page title saying 'Blacklisted' on Moz's critical errors page. What is this, and how do I check whether it is really an issue and fix it? Some of the pages that show this issue rank on page one in Google.
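One plausible explanation when pages "error" for a crawler yet rank fine is that the server blocks bot user-agents, so Moz's crawler sees a 4xx while browsers (and Google) see a 200. A sketch to test that hypothesis — the URL list is a hypothetical placeholder:

```python
# Request each flagged URL twice: once with a browser User-Agent and once
# with a crawler-style one. If the browser gets 200 while the crawler gets
# a 4xx, the server is likely blocking bots rather than the page being
# genuinely broken. The URL list is a hypothetical placeholder.
import requests

FLAGGED = ["https://example.com/flagged-page/"]  # hypothetical
AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler": "rogerbot",  # Moz's crawler identifies itself as rogerbot
}

for url in FLAGGED:
    for label, ua in AGENTS.items():
        r = requests.get(url, headers={"User-Agent": ua}, timeout=10)
        print(f"{url} [{label}] -> {r.status_code}")
```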

Dear all,
We have two URLs:
The main URL which is crawled both by GSC and where Moz assigns our keywords is:
https://andipaeditions.com/banksy/
The second one is called a virtual URL by our developers:
https://andipaeditions.com/banksy/signedandunsignedprintsforsale/
This is currently not indexed by Google.
We have been linking to the second URL, and I am unable to tell whether it is passing link equity (or anything else) on to the main /banksy/ page.
Is it a canonical? /banksy/ is the one being picked up in the SERPs and by Moz, and I worry that the two similar URLs are splitting the signal.
Should I redirect from the second to the first?
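Before deciding, it may help to see how the two URLs currently relate. A small sketch: if the longer URL already declares /banksy/ as its canonical, a 301 would simply make that consolidation explicit; if neither page references the other, the worry about split signals is more plausible.

```python
# Fetch both URLs and print the status code plus any rel=canonical each
# declares, to see how they currently relate before adding a 301.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://andipaeditions.com/banksy/",
    "https://andipaeditions.com/banksy/signedandunsignedprintsforsale/",
]

for url in URLS:
    r = requests.get(url, timeout=10, allow_redirects=False)
    canonical = "(n/a)"
    if r.status_code == 200:
        link = BeautifulSoup(r.text, "html.parser").find("link", rel="canonical")
        canonical = link["href"] if link else "(none declared)"
    print(f"{url}\n  status: {r.status_code}  canonical: {canonical}")
```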
Thank you

Hi. One of the properties in our account has been reporting zero ('0') total visits for the past few weeks. The other properties aren't affected. Is there a reason for this, or is this an issue on the Moz side of things? Thanks!

Moz Pro is telling me that "get free high quality hd wallpapers designed home" is one of the top anchor texts in backlinks to my website, larrybohenwebsolutions.com. This anchor text has nothing to do with my website. How do I find out where this anchor text is used, and how do I fix it?
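One way to track the offending links down is to export the backlink list from Link Explorer as a CSV and filter it by anchor text. A sketch — the filename and column names are assumptions, so match them to the header row of your actual export:

```python
# Filter an exported backlink CSV by the spammy anchor text and print the
# pages hosting those links. Filename and column names are assumptions.
import csv

SPAM_ANCHOR = "get free high quality hd wallpapers designed home"

with open("inbound_links.csv", newline="", encoding="utf-8") as f:  # assumed filename
    for row in csv.DictReader(f):
        if SPAM_ANCHOR in row.get("Anchor Text", "").lower():  # assumed column
            print(row.get("Source URL", row))                  # assumed column
```

From there, the usual options are asking the linking sites to remove the links or, for the worst offenders, submitting a disavow file through Google Search Console.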

Indexing backlinks plays a crucial role in overall SEO strategy and search engine rankings. When search engines like Google crawl and index a website, they also consider the quality and relevance of the backlinks pointing to that site. Here's why indexing backlinks is important:
Visibility in Search Results: Indexing ensures that search engines recognize and attribute value to the backlinks you have acquired. When indexed, these backlinks contribute to the overall link profile of your website and can positively impact your visibility in search results.
Faster Indexation: By submitting your backlinks for indexing, you can speed up the process of search engines discovering those links, which helps them recognize the relevance and authority of your website sooner.
Enhanced Crawling and Ranking: Indexing allows search engines to crawl and evaluate the backlinks pointing to your site. These backlinks are considered signals of trust and authority, which can influence your search engine rankings.
Improved Domain Authority: When high-quality backlinks pointing to your site are indexed, they contribute to your website's Domain Authority (DA). A higher DA indicates a more authoritative and reputable website, which can positively impact your rankings and organic search visibility.
Competitive Advantage: Indexing your backlinks gives you a competitive edge by ensuring that the value and authority of those links are properly recognized and taken into account by search engines. This can help you outrank competitors who may have unindexed or low-quality backlinks.
It's important to note that not all backlinks may require manual indexing, as search engines can discover and index them naturally. However, for specific or newly acquired backlinks that may not be indexed quickly, manual submission or using indexing services can help ensure they are recognized by search engines and contribute to your overall SEO efforts.
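Before worrying about indexing at all, it can be worth confirming each backlink still exists and is even eligible to be indexed: the source page should resolve, should not be noindexed, and should still contain a link to your domain. A minimal verification sketch — the domain and source URLs are hypothetical placeholders:

```python
# Verify that each backlink source page resolves, is not noindexed, and
# still links to your domain; a link failing any of these can never be
# counted, however it is submitted. Placeholders are hypothetical.
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"                             # hypothetical
SOURCES = ["https://blog.example.org/link-roundup/"]  # hypothetical

for src in SOURCES:
    try:
        r = requests.get(src, timeout=10)
    except requests.RequestException as exc:
        print(f"{src} -> unreachable ({exc})")
        continue
    soup = BeautifulSoup(r.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    noindexed = bool(robots and "noindex" in robots.get("content", "").lower())
    still_linked = any(MY_DOMAIN in a["href"] for a in soup.find_all("a", href=True))
    print(f"{src} -> HTTP {r.status_code}, links to us: {still_linked}, noindex: {noindexed}")
```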

Hello everyone, I am very new to SEO and I wanted to get some input & second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome & appreciated!
For those who aren't aware, Shopify as a platform doesn't allow us to send a 410 Gone status code under any circumstances. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively allows me to do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to go to that old URL.
My planned workaround is to automatically detect when a product has been discontinued and add the noindex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the LD+JSON to list the product's availability as Discontinued instead of InStock/OutOfStock.
Then I let the page sit for a few months so that crawlers have a chance to recrawl and remove the page from their indexes. I think that is how that works?
Once 3 or 6 months have passed, I plan on archiving the product, followed by setting up a 301 redirect pointing to our internal search results page. The redirect will send the user to search with a query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error.
I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only impacts their index.
Will this work the way I think it will?
Will search engines remove the page from their indexes if I add the noindex meta tag after they have already indexed it?
Is there a better way I should implement this?
P.S. For those wondering why I am not disallowing the page URL in robots.txt: Shopify won't allow me to call collection or product data from within the template that assembles the robots.txt, so I can't automatically add product URLs to the list.
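For reference, here is a sketch of the detection/tagging half of that plan using the Shopify Admin REST API. The shop domain, token, and the zero-inventory heuristic are all assumptions — the real "discontinued" signal may well be a manual tag or metafield instead. The theme's product template would then check for the tag and emit the noindex meta tag plus the schema.org/Discontinued availability.

```python
# Tag zero-inventory products as 'discontinued' via the Shopify Admin REST
# API, so the theme can conditionally emit a noindex meta tag. Shop name,
# token, and the inventory heuristic are assumptions for illustration.
import requests

SHOP = "your-store.myshopify.com"          # hypothetical
TOKEN = "shpat_..."                        # Admin API access token (placeholder)
API = f"https://{SHOP}/admin/api/2024-01"  # pin whichever API version you use
HEADERS = {"X-Shopify-Access-Token": TOKEN, "Content-Type": "application/json"}

resp = requests.get(f"{API}/products.json", headers=HEADERS, params={"limit": 250})
for product in resp.json()["products"]:
    # Crude discontinuation signal: zero inventory across all variants.
    total_stock = sum(v.get("inventory_quantity", 0) for v in product["variants"])
    tags = [t.strip() for t in product["tags"].split(",") if t.strip()]
    if total_stock <= 0 and "discontinued" not in tags:
        tags.append("discontinued")
        requests.put(
            f"{API}/products/{product['id']}.json",
            headers=HEADERS,
            json={"product": {"id": product["id"], "tags": ", ".join(tags)}},
        )
        print(f"tagged as discontinued: {product['title']}")
```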