Enterprise Local SEO is Different: A Checklist, a Mindset

Posted by MiriamEllis

Image credit: Abraham Williams

If you’re marketing big brands with hundreds or thousands of locations, are you certain you’re getting model-appropriate local SEO information from your favorite industry sources?

Is your enterprise checking off not just technical basics, but hyperlocalized research to strengthen its entrance into new markets?

Before I started working for Moz in 2010, the bulk of my local SEO experience had been with small-to-medium business models. Naturally, the advice I was able to offer back then was limited by the scope of my work. But then came Moz Local, and with it the opportunity to learn about the more complex needs of valued enterprise customers like Crate & Barrel with more than 170 locations, PAPYRUS with 400, or Bridgestone Corporation with 2,000+.

Now, when I’m thumbing through industry tips and tactics, I’m better able to identify when a recommended practice is stemming from an SMB mindset and falling short of enterprise realities, or is truly applicable to all business models. My goal for this post is to offer:

  • Examples of commonly encountered advice that isn’t really best for big brands
  • An Enterprise Local SEO Checklist to help you shape strategy for present campaigns, or ready your agency to pursue relationships with bigger dream clients
  • A state-to-enterprise wireframe for initial hyperlocal marketing research

Not everything you read is for enterprises

When a brand is small (say, a single-location, family-owned retail shop), it’s likely that one person at the company can manage the business’s local SEO with some free education and a few helpful tools. Large, multi-location brands, simply by virtue of their organizational complexity, are different. Before they even get down to the nitty-gritty of building citations, enterprises have to solve for:

  • Standardizing data across hundreds or thousands of locations
  • Franchise relationships that can muddy who controls which data and assets
  • Designating staff to actually manage data and execute initiatives, and building bridges between teams that must work in concert to meet goals
  • Scaling everything from listings management, to site architecture, to content dev
  • Dealing with a hierarchy of reports of bad data from the retail location level up to corporate

I am barely scratching the surface here. In a nutshell, the scale of the organization and the scope of the multi-location brand can turn a task that would be simple for Mom-and-Pop into a major, company-wide challenge. And I think it adds to the challenge when published advice for SMBs isn’t labeled as such. Over the years, three common tips I’ve encountered with questionable or no applicability to enterprises include:

Not-for-enterprises #1: Link all your local business listings to your homepage

This is sometimes offered as a way to boost local rankings, because website homepages typically have more authority than location landing pages do. But in the enterprise scenario, sending consumers from the listing for their chosen location to the homepage, and then expecting them to fool around with a menu or a store locator widget to finally reach the landing page for the location they already selected, doesn’t respect their user experience. It wastes their time. I consider it an unnecessary risk to conversions.

Simultaneously, failure to fully utilize location landing pages means that very little can be done to customize the website experience for each community and customer. Directly-linked-to landing pages can provide instant, persuasive proofs of local-ness, in the form of real local reviews, news about local sponsorships and events, special offers, regional product highlights, imagery and so much more that no corporate homepage can ever provide. Consider these statistics:

“According to a new study, when both brand and location-specific pages exist, 85% of all consumer engagement takes place on the local pages (e.g., Facebook Local Pages, local landing pages). A minority of impressions and engagement (15%) happen on national or brand pages.” – Local Search Association

In the large, multi-location scenario, trading a considerate, well-planned user experience for a hoped-for ranking increase simply isn’t putting the customer first.

Not-for-enterprises #2: Local business listings are a one-and-done deal

I find this advice particularly concerning. I don’t consider it true even for SMBs, and at the enterprise level, it’s simply false. My guess is that this suggestion stems from imagining a single local business that creates its Google My Business listing and builds out perhaps 20–50 structured citations with good data. What could go wrong?

For starters, they may have forgotten that their business name was different 10 years ago. Oh, and they did move across town 5 years ago. And this old data is sitting somewhere in a major aggregator like Acxiom, and somehow due to the infamous vagaries of data flow, it ends up on Bing, and a Bing user gets confused and reports to Google that the new address is wrong on the GMB listing … and so on and so on. Between data flow and crowdsourced editing, a set-and-forget approach to local business listings is trouble waiting to happen.

Now multiply this by 1,000 business locations. And throw in that the enterprise opened two new stores yesterday and closed one. And that they just acquired a new chain and have to rebrand all its assets. And there seems to be something the matter with the phone number on 25 listings, because they’re getting agitated complaints at corporate. And they received 500 reviews last week on Google alone that have to be managed, and it seems one of their competitors is leaving them negative reviews. Whoa – there are 700 duplicate listings being reported by Moz Local! And the brand has 250 Google Questions & Answers queries to respond to this week. And someone just uploaded an image of a dumpster to their GMB listing in Santa Fe…

Not only do listings have to be built, they have to be monitored for data degradation, and managed for inevitable business events, responsiveness to consumers, and spam. It’s hard enough for SMBs to pull all of this off, but enterprises ignore this at their peril!

Not-for-enterprises #3: Just do X

Every time a new local search feature or best practice emerges, you’ll find publications saying “just do X” to implement. What I’ve learned from enterprises is that there is no “just” about it.

Case in point: in 2017, Google rolled out Google Posts, and as Joel Headley of healthcare practice growth platform PatientPop explained to me in a recent interview, his company had to quickly develop a solution that would enable thousands of customers to utilize this influential feature across hundreds of thousands of listings. PatientPop managed implementation in an astonishingly short time, but typically, at the enterprise level, each new rollout requires countless steps up and down the ladder. These could include achieving recognition of the new opportunity, approval to pursue it, designation of teams to work on it, possible acquisition of new assets to accomplish goals, implementation at scale, and the groundwork of tracking outcomes so that they can be reported to prove/disprove ROI from the effort.

Where small businesses can be relatively agile if they can find the time to take on new features and strategies, enterprises can become dangerously bogged down by infrastructure and communications gaps. Even something as simple as hyperlocalizing content to the needs of a given community represents a significant undertaking.

The family-owned local hardware store already knows that the county fair is the biggest annual event in their area, and they’ve already got everything necessary to participate with a booth, run a contest, take photos, sponsor the tractor pull, earn links, and blog about it. For the hardware franchise with 3,000 stores, branch-to-corporate communication of the mere existence of the county fair, let alone gaining permission to market around it, will require multiple touches from the location to C-suites, and back again.

Checklist for enterprise local SEO preparedness

If you’re on the marketing team for an enterprise, or you run an agency and want to begin working with these larger, rewarding clients, you’ll be striving to put a checkmark in every box on the following checklist:

☑ Definition of success

We’ve determined which actions constitute success for our brand, whether that’s an increase in in-store traffic, sales, phone calls, bookings, or some other metric. When we see growth in these KPIs, it will affirm that our efforts are creating real success.

☑ Designation of roles

We’ve defined who will be responsible for all tasks relating to the local search marketing of our business. We’ve equipped these team members with all necessary permissions, granted them access to key documentation, organized workflows, and created an environment for documenting their work.

☑ Canonical data

We’ve created a spreadsheet, approved and agreed upon by all major departments, that lists the standardized name, address, phone number, website URL, and hours of operation for each location of the company. Any variant information has been resolved into a single, agreed-upon data set for each location. This sheet has been shared with all stakeholders managing our local business listings, marketing, website and social outreach.

☑ Website optimization

Our keyword research findings are reflected in the tags and text of our website, including image optimization. Complete, accurate contact information for each of our locations is easily accessible on the site. We’ve implemented proper structured data markup, such as Schema.org markup in JSON-LD format, to ensure that our data is as clear as possible to search engines.
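To make the markup piece concrete, here’s a minimal sketch of location-level structured data for a single landing page. The business name, URL, phone number, and address are purely illustrative placeholders; your own markup would carry each location’s canonical data.

```html
<!-- Hypothetical LocalBusiness markup for one location landing page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Hardware – Santa Rosa",
  "url": "https://www.example.com/locations/ca/santa-rosa",
  "telephone": "+1-707-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Santa Rosa",
    "addressRegion": "CA",
    "postalCode": "95401",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Sa 08:00-18:00"
}
</script>
```

At enterprise scale, a block like this would typically be generated from the same canonical data spreadsheet described above rather than hand-coded page by page.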

☑ Website quality

Our website is easy to navigate and provides a good, usable experience for desktop, mobile and tablet users. We understand that the omni-channel search environment includes ambient search in cars, in homes, via voice. Our website doesn’t rely on technologies that exclude search engines or consumers. We’re putting our customer first.

☑ Tracking and analysis

We’ve implemented maximum controls for tracking and analyzing traffic to our website. We’re also ready to track and analyze other forms of marketing, such as clicks stemming from our Google My Business listings, traffic driven to our website by articles on third-party sources, and content we’re sharing via social media.
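One common way to separate listing-driven visits from ordinary organic traffic is to append UTM parameters to the website URL on each Google My Business listing. A hypothetical example (the domain, path, and parameter values below are placeholders; use whatever naming convention your analytics team has standardized on):

```
https://www.example.com/locations/ca/santa-rosa?utm_source=google&utm_medium=organic&utm_campaign=gmb-listing
```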

☑ Publishing strategy

Our website features strong basic pages (Home, Contact, About, Testimonials/Reviews, Policy), we’ve built an excellent, optimized page for each of our core products/services and a quality, unique page for each of our locations. We have a clear strategy as to ongoing content publication, in the form of blog posts, white papers, case studies, social outreach, and other forms of content. We have plans for hyperlocalizing content to match regional culture and needs.

☑ Store locator

We’ve implemented a store locator widget to connect our website’s users to the set of location landing pages we’ve built to thoughtfully meet the needs of specific communities. We’ve also created an HTML version of a menu linking to all of these landing pages to ensure search engines can discover and index them.
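As a sketch of that crawlable fallback, the HTML menu can be as simple as a plain list of links to every location landing page (the URLs below are hypothetical), kept in regular site navigation or on an “All locations” page so search engines can reach each page without executing the locator widget:

```html
<!-- Hypothetical "All locations" page: plain, crawlable links alongside the JS store locator -->
<nav aria-label="All store locations">
  <ul>
    <li><a href="/locations/ca/santa-rosa">Santa Rosa, CA</a></li>
    <li><a href="/locations/ca/san-francisco">San Francisco, CA</a></li>
    <li><a href="/locations/nm/santa-fe">Santa Fe, NM</a></li>
    <!-- ...one link per location; group by state or region as the list grows -->
  </ul>
</nav>
```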

☑ Local link building

We’re building the authority of our brand via the links we earn from the most authoritative sources. We’re actively seeking intelligent link building opportunities for each of our locations, reflective of our industry, but also of each branch’s unique geography.

☑ Guideline compliance

We’ve assessed that each of the locations our business plans to build local listings for complies with the Guidelines for Representing Your Business on Google. Each location is a genuine physical location (not a virtual office or PO box) and conducts face-to-face business with consumers, either at our locations or at customers’ locations. We’re compliant with Google’s rules for the naming of each location, and, if appropriate, we understand how to handle listing multi-department and multi-practitioner businesses. None of our Google My Business listings is at risk for suspension due to basic guideline violations. We’ve learned how to avoid every possible local SEO pitfall.

☑ Full Google My Business engagement

We’re making maximum use of all available Google My Business features that can assist us in achieving our goals. This could include Google Posts, Questions & Answers, Reviews, Photos, Messaging, Booking, Local Service Ads, and other emerging features.

☑ Local listing development

We’re using software like Moz Local to scale creation of our local listings on the major aggregators (Infogroup, Acxiom, Localeze and Factual) as well as key directories like Superpages and Citysearch. We’re confident that our accurate, consistent data is being distributed to these most important platforms.

☑ Local listing monitoring

We know that local listings aren’t a set-and-forget asset and are taking advantage of the ongoing monitoring SaaS provides, increasing our confidence in the continued accuracy of our data. We’re aware that, if left unmanaged, local business listing data can degrade over time, due to inputs from various, non-authoritative third parties as well as normal data flow across platforms.

☑ In-store strategy

All public-facing staff are equipped with the training they need to implement our brand’s customer service policy and to answer FAQs or escalate them via a clear hierarchy, resolving complaints before they become negative online reviews. We’ve installed in-store signage and other materials that actively invite consumers to voice complaints in person, via an after-hours helpline, or by text message, ensuring we’re making maximum effort to build and defend our strong reputation.

☑ Review acquisition

We’ve developed a clear strategy for acquiring reviews on an ongoing basis on the review sites we’ve deemed to be most important to our brand. We’re compliant with the guidelines of each platform on which we’re earning reviews. We’re building website-based reviews and testimonials, too.

☑ Review monitoring & response

We’re monitoring all incoming reviews to identify both positive and negative emerging sentiment trends at specific locations and we’re conversant with Net Promoter Score. We’ve created a process for responding with gratitude to positive reviews. We’re defending our reputation and revenue by responding to negative reviews in ways that keep customers who complain instead of losing them, to avoid needless drain of new customer acquisition spend. Our responses are building a positive impression of our brand. We’ve built or acquired solutions to manage reviews at scale.

☑ Local PR

Each location of our brand has been empowered to build a local footprint in the community it serves, customizing outreach to match community culture. We’re exploring sponsorships, scholarships, workshops, conferences, news opportunities, and other forms of participation that will build our brand via online links and social mentions as well as offline WOM marketing. We’re continuously developing cohesive online/offline outreach for maximum impact on brand recognition, rankings, reputation, and revenue.

☑ Social media

We’ve identified the social platforms that are most popular with our consumer base and a best fit for our brand. We’re practicing ongoing social listening to catch and address positive and negative sentiment trends as they arise. We’ve committed to a social mindset based on sharing rather than the hard sell.

☑ Spam-ready

We’re aware that our brand, our listings, and our reviews may be subject to spam, and we know what options are available for reporting it. We’re also prepared to detect when the spammy behaviors of competitors (such as fake addresses, fake negative/positive reviews, or keyword stuffing of listings) are giving them an unfair advantage in our markets, and have a methodology for escalating reports of guideline violations.

☑ Paid media

We’re investing wisely in both online and offline paid media, and we’re carefully tracking and analyzing the outcomes of our pay-per-click, radio, TV, billboard, and phone sales strategies. We’re exploring new opportunities, such as Google Local Service Ads, as they emerge and as appropriate.

☑ Build/buy

When any new functionality (like Google Posts or Google Q&A) needs to be managed at scale, we have a process for determining whether we need to build or acquire new technology. We know we have to weigh the pros/cons of developing in-house or buying ready-made solutions.

☑ Competitive difference-maker

Once you’ve checked off all of the above elements, you’re ready to move on to identifying a USP for your brand that no one else in your market has explored. Be it a tool, widget, app, video marketing campaign, newsworthy acquisition, new partnership, or some other asset, this venture will require deep competitive and market research to discover a need your competitors have yet to fill well. If your business can serve this need, it can set your brand apart for years to come.

Free advice, specifically for local enterprises

It’s asserted that customers may forget what you say, but they’ll never forget how you make them feel.

Call me a Californian, but I continue to be amazed by automotive TV spots that show large trucks driving through beautiful creeks (thanks for tearing up precious riparian habitat during our state-wide drought) and across pristine arctic snowfields (instantly reminding me of climate change). Meanwhile, my family have become Tesla-spotters, seeing that “zero emissions” messaging on the tail of every luxury eco-vehicle that passes us by. As consumers, we know how we feel.

Technical and organizational considerations aside, this is where I see one of the greatest risks posed to the local enterprise structure. Insensitivity at a regional or hyperlocal level (the failure to research customer needs with the intention of meeting them) has been responsible for some of the most startling bad news for enterprises in recent memory. From ignored negative reviews across fast food franchises, to the downsizing of multiple apparel retailers unable to stake a clear claim in the shifting shopping environment, brands that aren’t successful at generating positive consumer “feelings” may need to reevaluate not just their local search marketing mindset, but their basic identity.

If this sounds uncomfortable or risky, consider that we are seeing a rising trend in CEOs taking stands on issues of national import in America. This is about feelings. Consumers are coming to expect this, and it feeds down to the local level.

Hyperlocalized market research

If your brand is considering opening a new branch in a new state or city, you’ll be creating profiles as part of your research. These could be based on everything from reading local news to conducting formal surveys. If I were to do something like this for my part of California, these are the factors I’d be highlighting about the region:

California realities, and the questions enterprises should be asking

We’ve been blasted by drought and wildfire. In 2017 alone, we went through 9,133 fires. On a positive note, Indigenous thought leadership is beginning to be re-implemented in some areas to solve our worst ecological problems (water scarcity, salmon loss, absence of traditional forestry practices).

Can your brand help conserve water, re-house thousands of homeless residents, fund mental health services despite budget cuts, make legal services affordable, provide solutions for increased future safety? What are your green practices? Are you helping to forward ecological recovery efforts at a tribal, city or state level?

We’re grumbling more loudly about tech gentrification. If you live in Mississippi, sit down for this. The average home price in your state is $199,028. In my part of California, it’s $825,000. In San Francisco, specifically, you’ll need $1.2 million to buy a tiny studio apartment… if you can find one. While the causes are complex, the people I talk with generally blame Silicon Valley.

Can your brand be part of this conversation? If not, you’re not really addressing what is on statewide consumers’ minds. Particularly if you’re marketing a tech-oriented company, taking the housing crisis seriously and coming up with solutions for even a modest amount of relief would certainly be positive and newsworthy.

We’ve turned to online shopping for an interesting variety of reasons. And it’s not just because we’re techie hipsters. The retail inventory in big cities (San Francisco) can be overwhelming to sort through, and in small towns (Cloverdale), the shopping options are too few to meet our basic and luxury desires.

Can your brand thrive in the gaps? If you’re located in a metro area, you may need to offer personal assistance to help consumers filter through options. If you’ve got a location somewhere near small towns, strategies like same-day delivery could help you remain competitive.

We’ve got our Hispanic/Latino identity back. Our architecture, city and street names are daily reminders that California has a lot more to do with Mexico than it ever did with the Mayflower. We may have become part of the U.S. in 1850, but pay more attention to 2014 — the year that our Hispanic/Latino community became the state’s largest ethnic group. This is one of the most vibrant happenings here. At the same time, our governor has declared us a sanctuary state for immigrants, and we’re being sued for it by the Justice Department.

Can your brand celebrate our state’s diversity? If you’re doing business in California today, you’ll need bilingual marketing, staff, and in-store amenities. Pew Research publishes ongoing data about the Hispanic/Latino segment of our population. What is your brand doing to ensure that these customers feel truly served?

We’re politically diverse. Our single state is roughly the same size as Sweden, and we truly run the political gamut from A–Z here. Are the citizens removing a man-made dam heroically restoring ecology, or getting in the way of commerce? You’ll find voices on every side.

Can your brand take the risk of publicizing its honest core values? If so, you are guaranteed to win and lose Californian customers, so do your research and be prepared to own your stance. Know that at a regional level, communities differ greatly. Those TV ads that show trucks running roughshod through fragile ecosystems may fly in some cities and be viewed with extreme distaste in others.

Money is top of mind. More than a third of Californians have zero savings, and over half have less than $1,000 put away. We invest more in welfare than the next two states combined. And while our state has the highest proportion of resident billionaires, they are vastly outnumbered by citizens who are continuously anxious about getting by. Purchasing decisions are seldom easy.

Can your brand employ a significant number of residents and pay them a living wage? Could your entry into a new market lift poverty in a town and provide better financial security? This would be newsworthy! Have ideas for lowering prices? You’ll get some attention there, too.

Obviously, I’m painting with broad strokes here, just touching on some of the key points that your enterprise would need to consider in determining to commence operations in any city or state. Why does this matter? Because the hyperlocalization of marketing is on the rise, and to engage with a community, you must first understand it.

Every month, I see businesses shutter because someone failed to apprehend true local demand. Did that bank pick a good location for a new branch? Yes: the nearest other branch is on the other side of the city. Will the new location of the taco franchise remain open? No: it’s already sitting empty while the beloved taco wagon down the street has a line spilling out of its parking lot all night long.

Summing up

“What helps people, helps business.” – Leo Burnett

The checklist in this post can help you create an enterprise-appropriate strategy for well-organized local search marketing, and it’s my hope that you’ll evaluate all SEO advice for its fitness to your model. These are the basic necessities. But where you go from there is the exciting part. The creative solutions you find to meet the specific wants and needs of individualized service communities could spell out the longevity of your brand’s success.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

How to Use Instagram Like a Beauty Brand

Posted by zeehj

Does your brand’s activity on its social accounts impact its search rankings? Maybe. Maybe not. But does it matter anyway?

I shouldn’t have to convince you that investing in social media for your company is worth it; even in light of Facebook’s recent data breach, we are so reliant upon our social profiles for real human interaction that leaving them isn’t a real option. In fact, statistics from Pew Research Center’s 2018 Social Media Use survey indicate that we’re not going to give up our social media profiles any time soon.

Humans are social creatures. It makes sense that we love being on social networking sites. We crave interaction with fellow humans. We’re also highly likely to trust the recommendations of our friends and family (Nielsen) and those recommendations often influence our purchasing decisions. We ask our loved ones for advice on where to put our dollars in myriad ways, all at different price points:

  • What coffee shop do you like to go to?
  • Which mascara is that?
  • What are you reading right now?
  • Where’d you get that tie?
  • What neighborhoods are you looking to move to?
  • What schools are you looking to send Anna to?

Yes, those same searches occur online. They also frequently occur in tandem with testimonials from the people in our lives (depending on how thorough we want or need to be).

So if you have a thing that you want to sell to a group of people and you’re still not pursuing a social strategy, I don’t understand what you’re doing. Yes, it’s 2018 and I still find myself trying to persuade clients to proactively use (the right) social networks to promote their brand.

For the sake of this piece, we’re going to focus on organic usage (read: free, not paid advertising) of Instagram. Why just Instagram? 35% of US adults say they use Instagram as of 2018, up from 28% in 2016. This was the greatest growth across top social networking sites reported by Pew Research Center. Additionally, its 35% usage puts it at the third most popular social networking platform, behind only Facebook and YouTube.

Other good news? It may be easier for brands’ posts to appear in users’ Instagram feeds than on their Facebook feeds: Facebook still wants to prioritize your family, friends and groups, while The New York Times reports that Instagram is updating its algorithm to favor newer posts rather than limit the accounts in your feed.

So should every brand have an Instagram? Maybe? But notice I’ve been primarily using the word “brand,” not “company” or “business.” That’s deliberate. Companies (only) provide customers with a service or sell a product. Brands provide customers (followers) with an identity. (If you want to dive further into this, I highly recommend this presentation by former Distiller Hannah Smith.)

The best companies are brands: they’ve got identities with which consumers align themselves. We become loyal to them. We may even use the brands we purchase from and follow as self identifiers to other people (“I’m a Joe & the Juice kind of guy, but not Starbucks,” “I never use MAC, only NARS,” “Me, shop at Banana Republic?! I only go to Everlane!”). Not every company should be on Instagram — it doesn’t make much sense for B2Bs to invest time and energy into building their company’s presence on Instagram.

Instagram is not for your consulting firm. And probably not for your SaaS company, either (but prove me wrong)!

It’s for celebrities. It’s to show off your enviable trip. It’s for fashion blogs. Sneakerheads. Memes. Art. Beauty brands. It’s really great for beauty brands. Why? Instagram is obviously great for sharing pretty photos — and if you’re a beauty company, well, it’s a no-brainer that you should have an active account. And it also has incredible built-in features to organically promote your posts, engage customers, and sell products with actual links to those products on your photos.

So, if you’re going to use Instagram, do it right. If you want to do it right, do it like a beauty brand.

First things first: Why do beauty companies’ IG posts look better?

Feed grids from Glossier, Onomie, and Milk

Let’s get the obvious out of the way: each account features beautiful models, pretty scenery, and cosmetics in clean packaging. That said, it’s not just the subject of the IG photos that matters: the photos on each of these accounts have been curated and edited together so that they look cohesive when viewed in IG’s grid format. How do they do that? Let’s look at three posts from these accounts.

Individual posts from Glossier, Onomie, and Milk

It’s hard (for me) to pick apart precisely why these photos are aesthetically pleasing — and it doesn’t help that I’m neither a photographer, nor a designer. That said, here is my rudimentary, non-designer take on why these photos look great together:

#1: Their subjects are beautiful (duh)

#2: There are limited primary focal points, and tons of negative space (though the medicine cabinet and floral arrangement photos are arguably “busy”)

#3: Their hues are complementary (pinky-pearlescent-pastels, anyone?)

There’s a lot of pink. And white. And pastels. And more pink. And then, occasionally, pops of color (think: a new violet lipstick shade).

Color schemes remain consistent across Onomie’s, Milk’s, and Glossier’s photos; these beauty brands don’t suddenly change their color palettes from one photo to the next. In fact, they are most likely applying the same Instagram filters to each photo, or at least editing the color balances so that the photos complement each other. They are deliberately catering to Instagram’s 3×3 grid photo format (or 3×4, or 3×5, depending on your screen size). While many users do see IG posts in their feeds when they open the app, users are still motivated to visit accounts’ profiles for a number of reasons: profiles are the only place where you can add hyperlinks on Instagram, and they’re also where accounts can pin stories for users to revisit.

But how on earth do they do it? They may have professional photographers, or graphic designers they can beg to normalize their color balances across photos. However, I don’t think that most companies necessarily need this mastery in-house in order to have an Instagram profile that looks good to mere mortals.

What I can assure you is that they plan, plan, plan out their posts in advance. In order to do this effectively, of course, you need the right tools. Here’s your starter pack of IG apps:

  • VSCO
    • Freemium phone app
    • Enables you to edit photos like a master — VSCO goes way beyond a small set of filters
    • Has its own community and image feed within the app, separate from IG
    • VSCO can’t post directly to IG (yet), but you can easily download any edited photo
  • Planoly
    • Freemium desktop tool and phone app
    • Can visualize your photos in a grid format with your other IG photos
    • Built-in analytics
    • Can schedule and post directly to IG, with captions and hashtags
  • Unum
    • Free
    • Offers some photo editing tools
    • Can drag and drop photos to plan out how they will appear alongside your other uploads, in grid format
    • Can post to IG, but no scheduling features

This may sound like a lot of work, and for non-designers in particular it’s pretty challenging. That said, the fruits of your labor can be used again and again. In fact, that’s precisely what these beauty brands do on IG: if they’re featuring a product (again, hello lipstick shades), they show off that product’s different colors, on different skintones. Basically, rinse and repeat with your IG photos: this repetition is great for those with sparse content calendars, and still looks great.

Okay, but they’re not popular just because of their looks, right? Why are beauty brands on IG so damn popular?

Yes, looks matter. IG is a visual platform. Sorry not sorry. And yes, we’re talking about beauty brands that have budgets to advertise their accounts and products on IG, which also contributes to their popularity. However, that’s not the whole story.

They use hashtags and photo tags.

Hashtags

Just like on Twitter (and Facebook, to a degree), hashtags are a natural way to boost exposure and get “discovered.” That’s largely because IG users can also follow hashtags, in the same manner as following a handle. And, just like on Twitter, it matters which hashtags you use. IG also allows users to add up to 30 hashtags per post — and yes, this can look spammy, but if you’re using IG like a beauty brand, you’ll separate your caption from your hashtags with periods-used-as-line-breaks or as a separate comment after you post.
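If you’re wondering what that looks like in practice, here’s a hypothetical caption layout (the product, copy, and hashtags are invented for illustration): the hashtags sit below a run of period line breaks, or go into the first comment immediately after posting.

```
Meet our new lip tint in Dawn. Available online now.
.
.
.
.
#crueltyfree #lipstick #beautycommunity #newlaunch
```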

So, where should you begin hunting for hashtags? Unfortunately, the Cambridge Analytica debacle has extended to Facebook’s other properties, including Instagram. One direct response seems to have been limiting the number of API calls third parties can make to IG. This means awesome services like websta.me can’t serve up the same amount of information around hashtags as they once did.

That said, Tagboard is one option for content and social media marketers to use. I like to use it to suss out hashtag intent (in answering whether this is the right hashtag to use for this post). *Readers: if you’ve got tools you love for finding hashtags on IG, add them in the comments below for us, please!

Otherwise, your best bet (as far as I know) is to search for hashtags directly in Instagram’s Discover area, under Tags. There, you can see how many times those hashtags have been used (what’s popular?) and then click through to see what photos have been tagged.

Photo tags

Beauty brands also take advantage of photo tagging on their posts when they can: if they are featuring a celebrity (like the magnificent Tracee Ellis Ross), they can tag her IG directly onto this post. Not only does this let Tracee (or, more likely, her social media manager) know, but depending on her settings this photo now shows up under her tagged photos on her profile — for her fans to discover.

Similarly, if you’re a business selling products and you’ve been approved for shopping on IG, you can also tag your products in your photos so that users can click through directly to their product pages. This is a no-brainer. Just do it.

They talk to their followers.

We already know that it’s best practice to engage and respond to followers on social media (within reason), and IG is no different. Onomie, Milk, and Glossier all have downright spirited conversations in their photos’ comments sections by prompting fellow ‘grammers to participate.

They add stories.

IG’s “Stories” feature is another great tool that Onomie, Milk, and Glossier all use. They’re like IG posts, but ephemeral (they only last 24 hours) and do not live in your main feed: users can access these stories from the top of their IG feeds, and from the account’s main icon. In some cases — especially brands selling products — these accounts may choose to “pin” evergreen stories to their IG profiles, so that users can access them beyond the 24-hour lifespan.

Stories are an excellent way to gather additional insights from followers (outside of comments) because you can run polls (with clickable elements) to collect simple data (“Should our next product help alleviate dry or oily skin?”). What’s more, depending on users’ notification preferences, stories can push notifications to followers’ phone screens. This means that even if a user is not using the app, they can be notified of new, temporary content.

If your brand (or your client) isn’t taking advantage of IG’s great marketing tools, it’s time to stop waiting and get ‘gramming. Especially if your target audiences are using the platform, there is no reason not to test out all the ways it allows you to engage its community.

Share your favorite IG tools, tips, and accounts below, so that other Moz readers can get inspired. And if you’re passionate about marketing, come join our team, and help me convince more awesome brands to take over Instagram. (JK. Kinda.)

Sustainable Link Building: Increasing Your Chances of Getting Links – Whiteboard Friday

Posted by Paddy_Moogan

Link building campaigns shouldn’t have a start-and-stop date — they should be ongoing, continuing to earn you links over time. In this edition of Whiteboard Friday, please warmly welcome our guest host Paddy Moogan as he shares strategies to achieve sustainable link building, the kind that makes your content efforts lucrative far beyond your initial campaigns for them.

https://fast.wistia.net/embed/iframe/hoboakgnsv?seo=false&videoFoam=true

Video Transcription

Hi, Moz fans. Welcome to Whiteboard Friday. I’m not Rand. I’m Paddy Moogan. I’m the cofounder of Aira. We’re an agency in the UK, focusing on SEO, link building, and content marketing. You may have seen me write on the Moz Blog before, usually about link building. You may have read my link building book. If you have, thank you. Today, I’m going to talk about link building again. It’s a topic I love, and I want to share some ideas around what I’m calling “sustainable link building.”

Problems

Now, there are a few problems with link building that make it quite risky, and I want to talk about some problems first before giving you some potential solutions that help make your link building less risky. So a few problems first:

I. Content-driven link building is risky.

The problem with content-driven link building is that you’re producing some content and you don’t really know if it’s going to work or not. It’s quite risky, and you don’t actually know for sure that you’re going to get links.

II. A great content idea may not be a great content idea that gets links.

There’s a massive difference between a great idea for content and a great idea that will get links. Knowing that difference is really, really important. So we’re going to talk a little bit about how we can work that out.

III. It’s a big investment of time and budget.

Producing content, particularly visual content, doing design and development takes time. It can take freelancers. It can take designers and developers. So it’s a big investment of time and budget. If you’re going to put time and budget into a marketing campaign, you want to know it’s probably going to work and not be too risky.

IV. Think of link building as campaign-led: it starts & stops.

So you do a link building campaign, and then you stop and start a new one. I want to get away from that idea. I want to talk about treating link building as an ongoing activity, rather than as a campaign that has a start date and a finish date, after which you forget about it and move on to the next one. So I’m going to talk a little bit about that as well.

Solutions

So those are some of the problems that we’ve got with content-driven link-building. I want to talk about some solutions of how to offset the risk of content-driven link building and how to increase the chances that you’re actually going to get links and your campaign isn’t going to fail and not work out for you.

I. Don’t tie content to specific dates or events

So the first one. Now, when you’re coming up with content ideas, it’s really easy to tie them into events or days of the year. If there are things going on in your client’s industry that are quite important, like current festivals and things like that, it’s a great way of hooking a piece of content into an event. Now, the problem with that is if you produce a piece of content around a certain date and then that date passes and the content hasn’t worked, you’re kind of stuck with a piece of content that is no longer relevant.

So an example here of what we’ve done at Aira: we launched a piece of content for a client around Internet of Things Day. It turns out there’s a day celebrating the Internet of Things, which is actually April 9th this year. Now, we produced a piece of content for them around the Internet of Things, its growth, and the impact it’s having on the world. But importantly, we didn’t tie it exactly to that date. The piece itself didn’t mention the date, but we launched it around that time, and the outreach talked about Internet of Things Day. So the outreach focused on the date and the event, but the content piece itself didn’t. What that meant was, after April 9th, we could still promote that piece of content because it was still relevant. It wasn’t tied to that exact date.

So it means that we’re not gambling on a specific event or a specific date. If we get to April 9th and we’ve got no links, it obviously matters, but we can keep going. We can keep pushing that piece of content. So, by all means, produce content tied into dates and events, but try not to bake the date into the content piece itself and tie yourself to it.

II. Look for datasets which give you multiple angles for outreach

Number two: lots of content ideas can come from data. You can get a dataset and produce content ideas off the back of it, creating angles and stories from the data. Now, that can be quite risky, because you don’t always know whether the data is going to give you a story or an angle until you’ve dug into it. So something we try to do at Aira when producing content around data is to work out the different angles you can take from that data.

So, for example:

  • Locations. Can you pitch a piece of content into different locations throughout the US or the UK so you can go after the local newspapers, local magazines for different areas of the country using different data points?
  • Demographics. Can you target different demographics? Can you target females, males, young people, old people? Can you slice the data in different ways to approach different demographics, which will give you multiple ways of actually outreaching that content?
  • Years. Is it updated every year? So it’s 2018 at the moment. Is there a piece of data that will be updated in 2019? If there is, and it’s a recurring annual thing where the data is updated, you can redo the content next year. So you can launch a piece of content now, and when the data gets updated next year, plug the new data into it and relaunch it. You’re not having to rebuild a piece of content every single time. You can use the old content and then update the data afterwards.

III. Build up a bank of link-worthy content

Number three, now this is something which is working really, really well for us at the moment, something I wanted to share with you. This comes back to the idea of not treating link building as a start and stop campaign. You need to build up a bank of link-worthy content on your client websites or on your own websites. Try and build up content that’s link worthy and not just have content as a one-off piece of work. What you can do with that is outreach over and over and over again.

We tend to think of the content process as something like this. You come up with your ideas. You do the design, then you do the outreach, and then you stop. In reality, what you should be doing is actually going back to the start and redoing this over and over again for the same piece of content.

What you end up with is multiple pieces of content on your client’s website that are all getting links consistently. You’re not just focusing on one, then moving past it, and then working on the next one. You can have this nice big bank of content there getting links for you all the time, rather than forgetting about it and moving on to the next one.

IV. Learn what content formats work for you

Number four. Again, this is something that’s worked really well for us recently. Because we’re an agency, we work with lots of different clients in different industries and produce lots and lots of content, so what we’ve done recently is try to work out which content formats work best for us. Which formats get the best results for our clients? The way we did this was a very, very simple chart plotting how easy or hard a piece was to produce against whether it was a fail in terms of links and coverage, or a really big win in terms of links, coverage, and traffic for the client.

Now, what you may find when you do this is certain content formats fit within this grid. So, for example, you may find that doing data viz is actually really, really hard, but it gets you lots and lots of links, whereas you might find that producing maps and visuals around that kind of data is actually really hard but isn’t very successful.

Identifying these content formats and knowing what works and what doesn’t can then feed into your future content campaigns. So when you’re working for a client, you can confidently say, “Well, actually, we know that interactives aren’t too difficult for us to build because we’ve got a good dev team, and they’re really likely to get links because we’ve done loads of them before and seen lots of success with them.” Whereas if you come up with an idea for a map that you know is actually really, really hard to do and might lead to a big fail, then that’s not going to be so good, but you can say to a client, “Look, from our experience, we can see maps don’t work very well. So let’s try and do something else.”

That’s it in terms of tips and solutions for trying to make your link building more sustainable. I’d love to hear your comments and your feedback below. So if you’ve got any questions, anything you’re not sure about, let me know. If you see it’s working for your clients or not working, I’d love to hear that as well. Thank you.

Video transcription by Speechpad.com

Want to Speak at MozCon 2018? Here’s Your Chance – Pitch to Be a Community Speaker!

Posted by Danielle_Launders

MozCon 2018 is nearing and it’s almost time to brush off that microphone. If speaking at MozCon is your dream, then we have the opportunity of a lifetime for you! Pitch us your topic and you may be selected to join us as one of our six community speakers.

What is a community speaker, you ask? MozCon sessions are by invite only, meaning we reach out to select speakers for the majority of our talks. But every year we reserve six 15-minute community speaking slots, where we invite anyone in the SEO community to pitch to present at MozCon. These sessions are both an attendee favorite and a fabulous opportunity to break into the speaking circuit.

Katie Cunningham, one of last year’s community speakers, on stage at MozCon 2017

Interested in pitching your own idea? Read on for everything you need to know:

The details

  • Fill out the community speaker submission form
  • Only one submission per person — make sure to choose the one you’re most passionate about!
  • Pitches must be related to online marketing and for a topic that can be covered in 15 minutes
  • Submissions close on Sunday, April 22nd at 5pm PDT
  • All decisions are final
  • All speakers must adhere to the MozCon Code of Conduct
  • You’ll be required to present in Seattle at MozCon

Ready to pitch your idea?

If you submit a pitch, you’ll hear back from us regardless of your acceptance status.

What you’ll get as a community speaker:

  • 15 minutes on the MozCon stage for a keynote-style presentation, followed by 5 minutes of Q&A
  • A free ticket to MozCon (we can issue a refund or transfer if you have already purchased yours)
  • Four nights of lodging covered by Moz at our partner hotel
  • Reimbursement for your travel — up to $500 for domestic and $750 for international travel
  • An additional free MozCon ticket for you to give away, plus a code for $300 off of one ticket
  • An invitation for you and your significant other to join us for the pre-event speakers dinner

The selection process:

We have an internal committee of Mozzers that reviews every pitch. In the first phase, we review only the topics to ensure they’re a good fit for our audience. After this first phase, we look at the pitch in its entirety to get a comprehensive idea of what to expect from your talk on the MozCon stage.

Want some advice for perfecting your pitch?

  • Keep your pitch focused on online marketing. The more actionable the pitch, the better.
  • Be detailed! We want to know the actual tactics our audience will be learning about. Remember, we receive a ton of pitches, so the more you can explain, the better!
  • Review the topics already being presented — we’re looking for something new to add to the stage.
  • Keep the pitch to under 1,200 characters. We’re strict with the character limit; even the best pitches will be disqualified if they don’t abide by the rules.
  • No pitches will be evaluated in advance, so please don’t ask 🙂
  • Using social media to lobby your pitch won’t help. Instead, put your time and energy into the actual pitch itself!
  • Linking to a previous example of a slide deck or presentation isn’t required, but it does help the committee a ton.

You’ve got this!

This could be you.

If your pitch is selected, the MozCon team will help you along the way. Whether this is your first time on stage or your twentieth, we want this to be your best talk to date. We’re here to answer questions that may come up and to work with you to deliver something you’re truly proud of. Here are just a handful of ways that we’re here to help:

  • Topic refinement
  • Helping with your session title and description
  • Reviewing any session outlines and drafts
  • Providing plenty of tips around best practices — specifically with the MozCon stage in mind
  • Comprehensive show guide
  • Being available to listen to you practice your talk
  • Reviewing your final deck
  • A full stage tour on Sunday to meet our A/V crew, see your presentation on the big screens, and get a feel for the show
  • An amazing 15-person A/V team

Make your pitch to speak at MozCon!

We can’t wait to see what y’all come up with. Best of luck!

The Bot Plan: Your Guide to Making Conversations Convert

Posted by purna_v

Let’s start off with a quick “True or False?” game:

“By 2020, the average person will have more conversations with their bot than with their spouse.”

True, or false? You may be surprised to learn that speaking more with bots than with our spouses is precisely what Gartner is predicting.

And when Facebook’s Mark Zuckerberg says “messaging is one of the few things that people do more than social networking,” it requires no leap of faith to see that chatbots are an integral part of marketing’s future.

But you don’t need to stock up on canned peaches and head for the hills because “the robots are coming.” The truth is, the robots aren’t coming because they’re already here, and they love us from the bottom of their little AI-powered hearts.

Bots aren’t a new thing in many parts of the world, such as China and India. As reported by Business Insider, 67% of consumers worldwide have used a chatbot for customer support in the last year.

Within the United States, an impressive 60% of millennials have used chatbots with 70% of those reporting positive experiences, according to Forbes.

There’s no putting bots back in the box.

And it’s not just that brands have to jump on board to keep up with those pesky new generations, either. Bots are great for them, too.

Bots offer companies:

  1. A revolutionary way to reach consumers. For the first time in history, brands of any size can reach consumers on a personal level. Note my emphasis on “of any size.” You can be a company of one and your bot army can give your customers a highly personal experience. Bots are democratizing business!
  2. Snackable data. This “one-to-one” communication gives you personal insights and specificity, plus a whole feast of snackable data that is actionable.
  3. Non-robot-like interaction. An intelligent bot can keep up with back-and-forth customer messages in a natural, contextual, human way.
  4. Savings. According to Juniper Research, the average time saving per chatbot inquiry compared to traditional call centers is over four minutes, which has the potential to make a truly extraordinary impact on a company’s bottom line (not to mention the immeasurable impact it has on customers’ feelings about the company).
  5. Always on. It doesn’t matter what time zone your customer is in. Bots don’t need to sleep, or take breaks. Your company can always be accessible via your friendly bot.

Here in the West, we are still in the equivalent of the Jurassic Period for bots. What they can be used for is truly limited only by our imagination.

One of my most recent favorites is an innovation from the BBC News Labs and Visual Journalism teams, who have launched a bot-builder app designed to, per Nieman Lab, “make it as easy as possible for reporters to build chatbots and insert them in their stories.”

So, in a story about President Trump from earlier this year, readers encountered one of these in-article chatbots. (Image source: BBC.com)

It’s one of my favorites not just because it’s innovative and impressive, but because it neatly illustrates how bots can add to and improve our lives… not steal our jobs.

Don’t be a dinosaur

A staggering 80% of brands will use chatbots for customer interactions by 2020, according to research. That means that if you don’t want to get left behind, you need to join the bot arms race right now.

“But where do I start?” you wonder.

I’m happy you asked that. Building a bot may seem like an endeavor that requires lots of tech savvy, but it’s surprisingly low-risk to get started.

Many websites allow you to build bots for free, and then there’s QNAMaker.ai (created by Microsoft, my employer), which does a lot of the work for you.

You simply input your company’s FAQ section, and it builds the foundation for an easy chatbot that can be taken live via almost any platform, using natural language processing to parse your FAQ and develop a list of questions your customers are likely to ask.

This is just the beginning — the potential for bots is wow-tastic.

That’s what I’m going to show you today — how you can harness bot-power to build strong, lasting relationships with your customers.

Your 3-step plan to make conversations convert

Step 1: Find the right place to start

The first step isn’t to build a bot straightaway. After all, you can build the world’s most elaborate bot and it is worth exactly nothing to you or your customer if it does not address their needs.

That’s why the first step is figuring out the ways bots can be most helpful to your customers. You need to find their pain points.

You can do this by pretending you’re one of your customers and navigating through your purchase funnel. Or better yet, find data within your CRM system and analytics tools that can help you answer key questions about how your audience interacts with your business.

Here’s a handy checklist of questions you should get answers to during this research phase:

  • How do customers get information or seek help from your company? ☑
  • How do they make a purchase? ☑
  • Do pain points differ across channels and devices? ☑
  • How can we reduce the number of steps in each interaction? ☑

Next, you’ll want to build your hypothesis. And here’s a template to help you do just that:

I believe [type of person] needs to solve [problem] which happens while [situation], which will allow them to [get value].

For example, say you’re the manager of a small spa whose biggest time-suck is people calling to ask simple questions, which leaves other customers on hold for a long time. If those customers can ask a bot these simple questions, you get three important results:

  1. The hold time for customers overall will diminish
  2. The customer-facing staff in your spa will be able to pay more attention to clients who are physically in front of them
  3. Customers with lengthier questions will be helped sooner

Everybody wins.

Finally, now that you’ve identified and prioritized the situations where conversation can help, you’ll be ready to build a bot as well as a skill.

Wait a minute — what’s a skill in this context, and how do they relate to bots? Here’s a great explanation from Chris Messina:

  • A bot is an autonomous program on a network
  • A chatbot is a bot that uses human language to communicate
  • An AI assistant is a chatbot that performs tasks or services for an individual
  • A skill is a capability that an AI assistant can learn

Each of them can help look things up, place orders, solve problems, and make things happen more easily, better, and faster.

A few handy resources to build a bot are:

  • Mobile Monkey Facebook Messenger marketing platform
  • Bot users on Slack
  • So You Want to Build a Chat Bot – Here’s How (Complete with Code!)
Step 2: Add conversation across the entire customer journey

    There are three distinct areas of the customer decision journey where bots and skills can make a big difference.

    Bot as introducer

    Bots can help your company by being present at the very first event in a purchase path.

    Adidas did this wonderfully when they designed a chatbot for their female-focused community Studio LDN, to help create an interactive booking process for the free fitness sessions offered. To drive engagement further, as soon as a booking was made the user would receive reminders and messages from influencer fitness instructors.

    The chatbot was the only way for people to book these sessions and it worked spectacularly well.

In the first two weeks, 2,000 people signed up to participate, with repeat use at 80%. Retention after week one was 60%, which the brand claims is far better than the equivalent figure for an app.

Adidas did something really clever: they advertised the bot across many of their other channels to promote it and improve its discoverability.

    You can do the same.

    There are countless examples where bots can put their best suit on and act as the first introduction to your company:

• Email marketing: According to MailChimp research, average email open rates are between 15% and 26%, with click rates at just a fraction of that, approximately 2%–5%. That’s pretty low when you compare it to Messenger messages, which can have an open rate of well over 90%. Why not make the call-to-action within your email an incentive for people to engage with your chatbot? For example, something like “message us for 10% off” could be a compelling reason for people to engage with your chatbot.
    • Social media: How about instead of running Facebook ads which direct people to websites, you run an ad connecting people to bots instead? For example, in the ad, advise people to “chat to see the latest styles” or “chat now to get 20% off” and then have your bot start a conversation. Instant engagement! Plus, it’s a more gentle call-to-action as opposed to a hard sell such as “buy now.”
    • Video: How about creating instructional YouTube videos on how to use your bot? Especially helpful since one of the barriers to using this new technology is a lack of awareness about how to use it. A short, quick video that demonstrates what your skill can do could be very impactful. Check out this great example from FitBit and Cortana:

• Search: As you’ve likely seen by now, Bing has been integrating chatbots within the SERPs themselves. You can search for bots across different platforms and add relevant bots to your preferred platform right from the search results:

Image: travel bots surfaced in the Bing search results

    • You can engage with local businesses such as restaurants via the Bing Business bot that shows up as part of the local listings:

Image: a Bing search for Monsoon Seattle showing the business chatbot in the local listing

    The key lesson here is that when your bot is acting as an introducer, give your audience plenty of ways and reasons to chat. Use conversation to tell people about new stuff, and get them to kick off that conversation.

    Bot as influencer

    To see a bot acting as an effective influencer, let’s turn to Chinese giant Alibaba. They developed a customizable chatbot store concierge that they offer free to brands and markets.

    Cutely named dian xiao mi, or “little shop bee,” the concierge is designed to be the most helpful store assistant you could wish for.

    For example, if a customer interacting with a clothing brand uploads a photograph of a t-shirt, the bot buzzes in with suggestions of pants to match. Or, if a customer provides his height and weight, the bot can offer suggested sizing. Anyone who has ever shopped online for clothing knows exactly how much pain the latter offering could eliminate.

    This helpful style is essentially changing the conversation from “BUY NOW!” to “What do you need right now?”

We should no longer ask: “How should we sell to customers?” The gazillion-dollar question instead is: “How can we connect with them?”

    An interesting thing about this change is that, when you think about it for a second, it seems like common sense. How much more trust would you have for a brand that was only trying to help you? If you bought a red dress, how much more helpful would it be if the brand showed you a pic of complementary heels and asked if you want to “complete the look”?

    For the chatbot to be truly helpful as an influencer, it needs to learn from each conversation. It needs to remember what you shared from the last conversation, and use it to shape future conversations.

So, say a chatbot from my favorite shoe store knew all about my shoe addiction (is there a cure? Would I even want to be cured of it?); then it could be far more helpful in its remarketing efforts.

    Imagine how much more effective it would be if we could have an interaction like this:

    Shoestore Chatbot: Hi Purna! We’re launching a new collection of boots. Would you like a sneak peek?

    Me: YES please!!!

    Shoestore Chatbot: Great! I’ll email pics to you. You can also save 15% off your next order with code “MozBlog”. Hurry, code expires in 24 hours.

    Me: *buys all the shoes, obvs*

    This is Bot-topia. Your brand is being helpful, not pushy. Your bot is cultivating relationships with your customers, not throwing ads at them.

The key lesson here? For your bot to be a successful influencer, you must always consider how it can be helpful and how it can add value.

    Bot as closer

    Bot: “A, B, C. Always be closing.”

Imagine you want to buy flowers for Mother’s Day, but you have very little interest in flowers. You scroll through the endless options on the website, then face a long checkout form, and you just feel overwhelmed.

    1-800-Flowers found your pain point, and acted on it by creating a bot for Facebook Messenger.

    It asks you whether you want to select a bunch from one of their curated collections, instantly eliminating the choice paralysis that could see consumers leave the website without purchasing anything.

    And once you’ve chosen, you can easily complete the checkout process using your phone’s payment system (e.g. Apple Pay) to make checkout a cinch. So easy, and so friction-free.

The result? According to Digiday, within two months of launch, 70% of the orders placed through the bot came from brand-new customers. By building a bot, 1-800-Flowers slam-dunked its way into the hearts of a whole new, young demographic.

Can you think of a better, less expensive way to unlock a big demographic? I can’t.

    To quote Mr. Zuckerberg again: “It’s pretty ironic. To order from 1-800-Flowers, you never have to call 1-800-Flowers again.”

    Think back to that handy checklist of questions from Step 1, especially this one: “How can we reduce the number of steps in each interaction?”

    Your goal is to make every step easy and empathetic.

Think of what people would want or need to know as they complete their tasks. For example, if you’re looking to transfer money from your bank account, a banking chatbot could save you from overdraft fees by warning you, before you make the transfer, that your account could be overdrawn.
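
As a minimal illustration of that kind of guardrail (the balance check, threshold, and wording here are all hypothetical):

```python
# Hypothetical guardrail for a banking bot: warn before a transfer
# would overdraw the account. Values and wording are illustrative only.

def transfer_reply(balance: float, amount: float) -> str:
    if amount > balance:
        return (f"Heads up: transferring ${amount:.2f} would overdraw your "
                f"account (current balance ${balance:.2f}) and may trigger "
                "an overdraft fee. Do you still want to proceed?")
    return f"Done! ${amount:.2f} transferred. New balance: ${balance - amount:.2f}."

print(transfer_reply(120.00, 150.00))
```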

    The key lesson here: Leverage your bots to remove any friction and make the experience super relevant and empathetic.

    Step 3: Measure the conversation with the right metrics

    One of my favorite quotes around how we view metrics versus how we should view metrics comes from Automat CEO Andy Mauro, who says:

    “Rather than tracking users with pixels and cookies, why not actually engage them, learn about them, and provide value that actually meets their needs?”

    Again, this is common sense once you’ve read it. Of course it makes sense to engage our users and provide value that meets their needs!

    We can do this because the bots and skills give us information in our customers’ own words.

    Here’s a short list of KPIs that you should look at (let’s call it “bot-alytics”):

    • Delivery and open rates: If the bot starts a conversation, did your customer open it?
    • Click rates: If your bot delivered a link in a chat, did your customer click on it?
    • Retention: How often do they come back and chat with you?
    • Top messages: What messages are resonating with your customers more than others?
    • Conversion rates: Do they buy?
    • Sentiment analysis: Do your customers express happiness and enthusiasm in their conversation with the bot, or frustration and anger?

Using bot-alytics, you can easily build up a clear picture of what is working for you and, more importantly, what is working for your customer.
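
As a rough illustration, here’s a minimal sketch of how a few of those numbers might be pulled out of your own chat logs; the log format and field names below are assumptions, since every bot platform exports something a little different:

```python
# "Bot-alytics" sketch: compute a few KPIs from chat logs.
# The log structure (one dict per bot-initiated conversation) is assumed;
# adapt the field names to whatever your platform actually exports.

conversations = [
    {"user": "anna",  "opened": True,  "clicked_link": True,  "converted": True},
    {"user": "ben",   "opened": True,  "clicked_link": False, "converted": False},
    {"user": "anna",  "opened": True,  "clicked_link": True,  "converted": False},
    {"user": "chris", "opened": False, "clicked_link": False, "converted": False},
]

total = len(conversations)
opened = sum(c["opened"] for c in conversations)
clicked = sum(c["clicked_link"] for c in conversations)
converted = sum(c["converted"] for c in conversations)
returning = sum(1 for u in {c["user"] for c in conversations}
                if sum(c2["user"] == u for c2 in conversations) > 1)

print(f"Open rate:       {opened / total:.0%}")
print(f"Click rate:      {clicked / opened:.0%}")   # clicks per opened chat
print(f"Conversion rate: {converted / total:.0%}")
print(f"Returning users: {returning}")              # rough retention signal
```

Sentiment and top messages take a little more work (you’d need the message text itself), but the principle is the same: the conversation log is the data source.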

    And don’t forget to ask: What can you learn from bot-alytics that can help other channels?

    The future’s bright, the future’s bots

What were once dumb machines are now smart enough that we can engage with them in a very human way. This presents the opportunity of a generation for businesses of all shapes and sizes.

    Our customers are beginning to trust bots and digital personal assistants for recommendations, needs, and more. They are the friendly neighborhood machines that the utopian vision of a robotic future presents. They should be available to people anywhere: from any device, in any way.

    And if that hasn’t made you pencil in a “we need to talk about bots” meeting with your company, here’s a startling prediction from Accenture. They believe that in five years, more than half of your customers will select your services based on your AI instead of your traditional brand.

    In three steps, you can start your journey toward bot-topia and having your conversations convert. What are you waiting for?


    How the Mobile-First Index Disrupts the Link Graph

    Posted by rjonesx.

It’s happened to all of us. You bring up a webpage on your mobile device, only to find that a feature you were accustomed to using on desktop simply isn’t available on mobile. While frustrating, it has always been a struggle for web developers and designers alike to simplify and condense their sites for mobile screens without stripping out features or content that would otherwise clutter a smaller viewport. The worst-case outcome of these trade-offs is that some features are reserved for desktop environments, or perhaps the user is able to opt out of the mobile view. Below is an example of how my personal blog displays its mobile version using a popular ElegantThemes plugin called HandHeld. As you can see, the page is heavily stripped down and far easier to read… but at what cost? And at what cost to the link graph?

    My personal blog drops 75 of the 87 links, and all of the external links, when the mobile version is accessed. So what happens when the mobile versions of sites become the primary way the web is accessed, at scale, by the bots which power major search engines?

    Google’s announcement to proceed with a mobile-first index raises new questions about how the link structure of the web as a whole might be influenced once these truncated web experiences become the first (and sometimes only) version of the web Googlebot encounters.

    So, what’s the big deal?

    The concern, which no doubt Google engineers have studied internally, is that mobile websites often remove content and links in order to improve user experience on a smaller screen. This abbreviated content fundamentally alters the link structure which underlies one of the most important factors in Google’s rankings. Our goal is to try and understand the impact this might have.

Before we get started, one giant unknown variable I want to be quick to point out: we don’t know what percentage of the web Google will crawl with both its desktop and mobile bots. Perhaps Google will choose to be “mobile-first” only on sites that have historically served an identical codebase to both the mobile and desktop versions of Googlebot. However, for the purposes of this study, I want to show the worst-case scenario, as if Google chose not only to go “mobile-first,” but in fact to go “mobile-only.”

    Methodology: Comparing mobile to desktop at scale

    For this brief research, I decided to grab 20,000 random websites from the Quantcast Top Million. I would then crawl two levels deep, spoofing both the Google mobile and Google desktop versions of Googlebot. With this data, we can begin to compare how different the link structure of the web might look.
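
To give a sense of the mechanics, here’s a simplified, single-page sketch of the comparison (not the actual crawler used for this study); the user-agent strings are ones Google has published for Googlebot, but check the current documentation before relying on them:

```python
# Simplified sketch: fetch one page as "desktop Googlebot" and as
# "smartphone Googlebot" and compare the links each version exposes.
# Not the study's actual crawler; user-agent strings may change over time.

import requests
from bs4 import BeautifulSoup

DESKTOP_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
             "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")

def links_seen(url: str, user_agent: str) -> set:
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return {a["href"] for a in soup.find_all("a", href=True)}

url = "https://example.com/"          # substitute a real site
desktop = links_seen(url, DESKTOP_UA)
mobile = links_seen(url, MOBILE_UA)

print("Desktop-only links:", len(desktop - mobile))
print("Mobile-only links: ", len(mobile - desktop))
print("Shared links:      ", len(desktop & mobile))
```

Scale that up to 20,000 homepages, follow the discovered links another level deep, and you have the shape of the study below.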

    Homepage metrics

    Let’s start with some descriptive statistics of the home pages of these 20,000 randomly selected sites. Of the sites analyzed, 87.42% had the same number of links on their homepage regardless of whether the bot was mobile- or desktop-oriented. Of the remaining 12.58%, 9% had fewer links and 3.58% had more. This doesn’t seem too disparate at first glance.

    Perhaps more importantly, only 79.87% had identical links on the homepage when visited by desktop and mobile bots. Just because the same number of links were found didn’t mean they were actually the same links. This is important to take into consideration because links are the pathways which bots use to find content on the web. Different paths mean a different index.
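
Because the distinction between “same number of links” and “identical links” is easy to miss, here’s a rough sketch of how those two percentages could be tallied once you have a set of homepage links per bot; the data structure and the sample sites are made up:

```python
# Rough tally of "same link count" vs. "identical links" across sites,
# given one set of homepage links per bot. The sample data is made up.

crawl = {
    "site-a.com": {"mobile": {"/about", "/shop"}, "desktop": {"/about", "/shop"}},
    "site-b.com": {"mobile": {"/about", "/blog"}, "desktop": {"/about", "/shop"}},
    "site-c.com": {"mobile": {"/about"},          "desktop": {"/about", "/shop", "/blog"}},
}

same_count = sum(len(c["mobile"]) == len(c["desktop"]) for c in crawl.values())
identical = sum(c["mobile"] == c["desktop"] for c in crawl.values())

print(f"Same number of links: {same_count / len(crawl):.1%}")
print(f"Identical links:      {identical / len(crawl):.1%}")
```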

    Among the homepage links, we found a 7.4% drop in external links. This could mean a radical shift in some of the most important links on the web, given that homepage links often carry a great deal of link equity. Interestingly, the biggest “losers” as a percentage tended to be social sites. In retrospect, it seems reasonable that one of the common types of links a website might remove from their mobile version would be social share buttons because they’re often incorporated into the “chrome” of a page rather than the content, and the “chrome” often changes to accommodate a mobile version.

    The biggest losers as a percentage in order were:

    1. linkedin.com
    2. instagram.com
    3. twitter.com
    4. facebook.com

So what’s the big deal about 5–15% differences in links when crawling the web? Well, it turns out these numbers are biased toward sites with lots of links that don’t have a mobile version. Most of those links are main navigation links, so when you crawl deeper, you just find the same links again. But the sites that do deviate end up having radically different second-level crawls.

    Second-level metrics

    Now this is where the data gets interesting. As we continue to crawl out on the web using crawl sets that are influenced by the links discovered by a mobile bot versus a desktop bot, we’ll continue to get more and more divergent results. But how far will they diverge? Let’s start with size. While we crawled an identical number of home pages, the second-tier results diverged based on the number of links found on those original home pages. Thus, the mobile crawlset was 977,840 unique URLs, while the desktop crawlset was 1,053,785. Already we can see a different index taking shape — the desktop index would be much larger. Let’s dig deeper.

    I want you to take a moment and really focus on this graph. Notice there are three categories:

    • Mobile Unique: Blue bars represent unique items found by the mobile bot
    • Desktop Unique: Orange bars represent unique items found by the desktop bot
    • Shared: Gray bars represent items found by both

Notice also that there are four tests:

    • Number of URLs discovered
    • Number of Domains discovered
    • Number of Links discovered
    • Number of Root Linking Domains discovered

    Now here is the key point, and it’s really big. There are more URLs, Domains, Links, and Root Linking Domains unique to the desktop crawl result than there are shared between the desktop and mobile crawler. The orange bar is always taller than the gray. This means that by just the second level of the crawl, the majority of link relationships, pages, and domains are different in the indexes. This is huge. This is a fundamental shift in the link graph as we have come to know it.

    And now for the big question, what we all care about the most — external links.

    A whopping 63% of external links are unique to the desktop crawler. In a mobile-only crawling world, the total number of external links was halved.

    What is happening at the micro level?

    So, what’s really causing this huge disparity in the crawl? Well, we know it has something to do with a few common shortcuts to making a site “mobile-friendly,” which include:

    1. Subdomain versions of the content that have fewer links or features
    2. The removal of links and features by user-agent detecting plugins

Of course, these changes might make the experience better for your users, but they create a different experience for bots. Let’s take a closer look at one site to see how this plays out.

    This site has ~10,000 pages according to Google and has a Domain Authority of 72 and 22,670 referring domains according to the new Moz Link Explorer. However, the site uses a popular WordPress plugin that abbreviates the content down to just the articles and pages on the site, removing links from descriptions in the articles on the category pages and removing most if not all extraneous links from the sidebar and footer. This particular plugin is used on over 200,000 websites. So, what happens when we fire up a six-level-deep crawl with Screaming Frog? (It’s great for this kind of analysis because we can easily change the user-agent and restrict settings to just crawl HTML content.)

    The difference is shocking. First, notice that in the mobile crawl on the left, there is clearly a low number of links per page and that number of links is very steady as you crawl deeper through the site. This is what produces such a steady, exponential growth curve. Second, notice that the crawl abruptly ended at level four. The site just didn’t have any more pages to offer the mobile crawler! Only ~3,000 of the ~10,000 pages Google reports were found.

    Now, compare this to the desktop crawler. It explodes in pages at level two, collecting nearly double the total pages of the mobile crawl at this level alone. Now, recall the graph before showing that there were more unique desktop pages than there were shared pages when we crawled 20,000 sites. Here is confirmation of exactly how it happens. Ultimately, 6x the content was made available to the desktop crawler in the same level of crawl depth.

    But what impact did this have on external links?

Wow. 75% of the external, outbound links were culled in the mobile version: 4,905 external links were found in the desktop version, while only 1,162 were found in the mobile. Remember, this is a DA 72 site with over twenty thousand referring domains. Imagine losing a link from a site like this simply because the mobile index no longer finds the backlink. What should we do? Is the sky falling?

    Take a deep breath

    Mobile-first isn’t mobile-only

    The first important caveat to all this research is that Google isn’t giving up on the desktop — they’re simply prioritizing the mobile crawl. This makes sense, as the majority of search traffic is now mobile. If Google wants to make sure quality mobile content is served, they need to shift their crawl priorities. But they also have a competing desire to find content, and doing so requires using a desktop crawler so long as webmasters continue to abbreviate the mobile versions of their sites.

    This reality isn’t lost on Google. In the Original Official Google Mobile First Announcement, they write…

    If you are building a mobile version of your site, keep in mind that a functional desktop-oriented site can be better than a broken or incomplete mobile version of the site.

    Google took the time to state that a desktop version can be better than an “incomplete mobile version.” I don’t intend to read too much into this statement other than to say that Google wants a full mobile version, not just a postcard.

    Good link placements will prevail

    One anecdotal outcome of my research was that the external links which tended to survive the cull of a mobile version were often placed directly in the content. External links in sidebars like blog-rolls were essentially annihilated from the index, but in-content links survived. This may be a signal Google picks up on. External links that are both in mobile and desktop tend to be the kinds of links people might click on.

    So, while there may be fewer links powering the link graph (or at least there might be a subset that is specially identified), if your links are good, content-based links, then you have a chance to see improved performance.

    I was able to confirm this by looking at a subset of known good links. Using Fresh Web Explorer, I looked up fresh links to toysrus.com which is currently gaining a great deal of attention due to stores closing. We can feel confident that most of these links will be in-content because the articles themselves are about the relevant, breaking news regarding Toys R Us. Sure enough, after testing 300+ mentions, we found the links to be identical in the mobile and desktop crawls. These were good, in-content links and, subsequently, they showed up in both versions of the crawl.

    Selection bias and convergence

    It is probably the case that popular sites are more likely to have a mobile version than non-popular sites. Now, they might be responsive — at which point they would yield no real differences in the crawl — but at least some percentage would likely be m.* domains or utilize plugins like those mentioned above which truncate the content. At the lower rungs of the web, older, less professional content is likely to have only one version which is shown to mobile and desktop devices alike. If this is the case, we can expect that over time the differences in the index might begin to converge rather than diverge, as my study looked only at sites that were in the top million and only crawled two levels deep.

Moreover (and this is a bit speculative), I think that over time there will be convergence between the mobile and desktop indexes. I don’t think the link graphs will grow exponentially different, as the linked web is only so big. Rather, the paths by which certain pages are reached, and the frequency with which they are reached, will change quite a bit. So, while the link graph will differ, the set of URLs making up the link graph will largely be the same. Of course, some percentage of the mobile web will remain wholly disparate. The large number of sites that use dedicated mobile subdomains or plugins that remove substantial sections of content will remain like mobile islands in the linked web.

    Impact on SERPs

    It’s difficult at this point to say what the impact on search results will be. It will certainly not leave the SERPs unchanged. What would be the point of Google making and announcing a change to its indexing methods if it didn’t improve the SERPs?

    That being said, this study wouldn’t be complete without some form of impact assessment. Hat tip to JR Oakes for giving me this critique, otherwise I would have forgotten to take a look.

    First, there are a couple of things which could mitigate dramatic shifts in the SERPs already, regardless of the veracity of this study:

    • A slow rollout means that shifts in SERPs will be lost to the natural ranking fluctuations we already see.
    • Google can seed URLs found by mobile or by desktop into their respective crawlers, thereby limiting index divergence. (This is a big one!)
    • Google could choose to consider, for link purposes, the aggregate of both mobile and desktop crawls, not counting one to the exclusion of the other.

    Second, the relationships between domains may be less affected than other index metrics. What is the likelihood that the relationship between Domain X and Domain Y (more or less links) is the same for both the mobile- and desktop-based indexes? If the relationships tend to remain the same, then the impact on SERPs will be limited. We will call this relationship being “directionally consistent.”

    To accomplish this part of the study, I took a sample of domain pairs from the mobile index and compared their relationship (more or less links) to their performance in the desktop index. Did the first have more links than the second in both the mobile and desktop? Or did they perform differently?

It turns out that the indexes were fairly close in terms of directional consistency. That is to say, while the link graphs as a whole were quite different, when you compared one domain to another at random, they tended to be directionally consistent in both data sets. Approximately 88% of the domain pairs compared maintained directional consistency across the indexes. This test was only run comparing the mobile index domains to the desktop index domains; future research might explore the reverse relationship.
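
If you’d like to run a similar sanity check on your own data, the comparison itself is simple. Here’s a sketch with made-up link counts standing in for the two indexes:

```python
# Directional consistency sketch: for random domain pairs, check whether
# "X has more links than Y" holds in both indexes. Counts are made up.

import random

mobile_links = {"a.com": 120, "b.com": 45, "c.com": 300, "d.com": 80}
desktop_links = {"a.com": 200, "b.com": 90, "c.com": 310, "d.com": 85}

domains = list(mobile_links)
pairs = [random.sample(domains, 2) for _ in range(1000)]

consistent = sum(
    (mobile_links[x] > mobile_links[y]) == (desktop_links[x] > desktop_links[y])
    for x, y in pairs
)
print(f"Directionally consistent pairs: {consistent / len(pairs):.1%}")
```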

    So what’s next?: Moz and the mobile-first index

Our goal for the Moz link index has always been to be as much like Google as possible. It is with that in mind that our team is experimenting with a mobile-first index as well. Our new link index and Link Explorer (in beta) seek to be more than simply one of the largest link indexes on the web; we want them to be the most relevant and useful, and we believe part of that means shaping our index with methods similar to Google’s. We will keep you updated!


    Google Confirms Chrome Usage Data Used to Measure Site Speed

    Posted by Tom-Anthony

    During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It has gotten a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

    The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.

    Previously, it has not been clear how Google evaluates site speed, and it was generally believed to be measured by Googlebot during its visits — a belief enhanced by the presence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google is doing — they obviously want the most realistic data possible, but it’s a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

    In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.

    Google Search Console

Firstly, we should clarify our understanding of what the “time spent downloading a page” metric in Google Search Console is telling us. Most of us will recognize graphs like this one:

    Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):

    John clarified what this graph is showing:

    It’s technically not “downloading the page” but rather “receiving data in response to requesting a URL” – it’s not based on rendering the page, it includes all requests made.

    And that it is:

    this is the average over all requests for that day

    Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.

    For that reason, John points out that:

    Focusing blindly on that number doesn’t make sense.

    With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).

    Okay, so now we understand that graph and what it represents, let’s look at the next option: the Google WRS.

    Googlebot & the Web Rendering Service

    Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like “Fetch as Googlebot” in Search Console, and is increasingly what Googlebot is using when it crawls pages.

    However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that WRS is not responsible for evaluating site speed:

Twitter conversation with Gary Illyes

    At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.

    Chrome User Experience Report

    Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”

    Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.

    In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

    However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).

    At SMX John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report:

    The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you’re into that sort of thing!
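
If you’re curious, that public dataset can be queried from Python via the BigQuery client. The query below follows the pattern in the public CrUX documentation, but treat the exact table name and schema as things to verify against the current docs rather than gospel:

```python
# Sketch: query the public Chrome User Experience Report dataset in BigQuery
# for the share of page loads with first contentful paint under one second.
# Table name and schema follow the public CrUX examples; verify against the
# current documentation, and substitute an origin you care about.

from google.cloud import bigquery

client = bigquery.Client()  # requires a GCP project with BigQuery enabled

query = """
    SELECT SUM(bin.density) AS fast_fcp_share
    FROM `chrome-ux-report.all.201710`,
         UNNEST(first_contentful_paint.histogram.bin) AS bin
    WHERE bin.start < 1000
      AND origin = 'https://www.example.com'
"""

for row in client.query(query).result():
    print("Share of loads with FCP < 1s:", row.fast_fcp_share)
```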

    We can’t be sure what all the other factors Google is using are, but we now know they are certainly using this data. As I mentioned above, I also imagine they are using data on more sites than are perhaps provided in the public dataset, but this is not confirmed.

    Pay attention to users

    Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.

    The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.

    Essentially, this means that there’s no longer a reason to worry about pagespeed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

    If you are unsure where to look for site speed advice, then you should look at:

    That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!
