Author Archives: S. Ernest Paul

About S. Ernest Paul

S. Ernest Paul is a marketer, entrepreneur, and innovator with expertise in martech, brand, social media, digital marketing, and emerging tech. He was named one of the Top Ten Most Successful CMOs in 2021 by C-Level Focus and to the Top 100 in Finance by Finance Magazine, and is a staff writer for Medium's 'Data Driven Investor'.

Personalization and CDP

The quest to deliver Customer Personalization – Marketing Technology

S. Ernest Paul | Shutterstock

Brands have to identify and understand each and every customer 

Ever been to a restaurant where they know you by name? The waiter knows your preferred drink and, like magic, brings it to your table while you are being seated. The ‘Cheers’ kind of experience, for those of us who recall the sitcom. These one-on-one personable interactions make experiences special.

We live in a digital world now and demand similar experiences from brands. We want to associate with a brand that knows us, and we gravitate towards these brands, rare as they might be. Such a rendezvous brings familiarity and comfort; trust soon follows. I recently called my banking institution: a few taps on the phone, and a friendly voice greeted me personally, instantly. Voila! All done with voice print. We expect similar experiences everywhere: minimalist, natural, efficient, and transferable across devices.

"Patience in today's hyper connected world is on a diet due to clumsy marketing tactics"

What do customers want from brands?

While dining out, I have often noticed everyone seated at a nearby table simultaneously glued to their mobile phones. Digital and mobile may have taken us all by storm long ago, but we still seem to prefer company, even if it is of the silent variety. This mobile solitude may be amusing to some, but not to marketers. To marketers, it means real-time location data and related contextual information, including weather, possibly used to trigger a reminder about a nearby physical store.

Customers want relevant, meaningful, and tailored information and offers from brands that meet their specific needs. Irrelevant offers and emails induce customer paralysis and prove counterproductive. In this era of instant gratification and attention-span deficit, there is little room for off-target customer communications. With a thorough understanding of customer needs, both parties are likely to benefit.

Customers have long declared their expectations.

Delivering on customer expectations enhances customer engagement and builds loyalty, resulting in reduced acquisition costs, a revenue upside, and increased retention rates.

The journey begins with data

Brands seeking customer-centric nirvana have to become data-centric first. None of this degree of hyper-personalization at scale can be accomplished without customer data.

"Data is the new oil. Refining or anticipating customer needs is the beginning of personalization"

This crude oil is rich behavioral customer data: customer interactions, social media activity, demographics, customer life-cycle stage, and transactional data. It can be segmented and further sub-segmented. Personalization dividends can be harvested today without a full-blown implementation of technology such as a Customer Data Platform (CDP). Existing data can be used for cross-sell initiatives and for enabling and activating a few consumer use cases based on past purchases and behavior. Sometimes using less data is more effective before incorporating externally purchased data sets.

Incorporating second- and third-party external data adds dimensional layers to the refinement process and further enriches customer data. The multiple digital identities many consumers use online can be merged into a single record to eliminate redundancy. Iterative cycles of customer behavior and interactions captured via analytics continually refine customer data. Behavioral consistency within customer data sets feeds pattern discovery and recognition. As patterns are confirmed and proven accurate, machine learning kicks the data sets up a notch. With continuous cycles of deep learning and artificial intelligence, predictive analytics begin to unlock future customer behavior.
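
For a flavor of what that identity merge can look like, here is a minimal Python sketch (the records, field names, and matching keys are all hypothetical) that unifies records sharing an email or phone number:

# Minimal identity-resolution sketch: merge records that share an
# email or phone number into a single customer profile.
# Records, field names, and matching rules are illustrative assumptions.
from collections import defaultdict

records = [
    {"id": 1, "email": "ana@example.com", "phone": None, "channel": "web"},
    {"id": 2, "email": "ana@example.com", "phone": "555-0101", "channel": "mobile"},
    {"id": 3, "email": None, "phone": "555-0101", "channel": "store"},
]

def merge_identities(records):
    parent = {}  # union-find forest over record ids

    def find(x):
        while parent.setdefault(x, x) != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    seen = {}  # (field, value) -> first record id carrying that value
    for r in records:
        for field in ("email", "phone"):
            if r[field] is None:
                continue
            if (field, r[field]) in seen:
                union(r["id"], seen[(field, r[field])])
            else:
                seen[(field, r[field])] = r["id"]

    profiles = defaultdict(list)
    for r in records:
        profiles[find(r["id"])].append(r)
    return list(profiles.values())

for profile in merge_identities(records):
    print([r["id"] for r in profile])  # -> [1, 2, 3]: one unified customer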

How far along is your organization in channeling this new oil: piping it, refining it, triaging it, harnessing it to extract measurable value?

Agile, cross-functional teams – marketing, tech experts & operations – must work together

Working in silos is kryptonite for personalization. A cross-functional team in test-and-learn mode, sharing insights within a ‘not afraid to fail’ culture, preferably in a war-room-like setting, is the best environment for productive delivery.

The personalization DNA resides in behavioral data. Its application rests on a thorough mapping and intersection of customer journeys, triggers, devices, events, marketing campaigns, and collateral, aligned to customer segments with matching behavior. A cross-functional team serves this proposition well.

Marketing & Operations realignment

Organizations must look at the personalization ecosystem in its entirety, from data management to advanced analytics to customer engagement, through to measurement and optimization. Marketing resources are generally organized by specific skill, for example analytics or campaign management, or even by channel, such as social or search. There is an unintentional siloed-ecosystem risk to be wary of. Although optimizing the different elements is important, the whole is greater than the sum of its parts. An understanding of how the different parts interact and how to integrate them to support personalization is what differentiates high-performing marketing organizations from poorly performing ones.

Building the customer journey

Grouping customers with matching needs and behavior is a good place to begin. Armed with a handful of these groupings, or segments, align each segment with its own customer journey and map the series of interactions with the company brand: visits to a broker, agent, or company website; calls to a call center; social media posts; even prospect visits to a competitor.

Building customer segments

Hundreds of mini-segments may emerge from combining journeys and customer segments. Each mini-segment may be nuanced, and some are more valuable than others; each should be considered and prioritized by its relative value. For example, a leading insurer may find it more valuable to engage customers within their ‘renewal window’ by reminding them that their policy is nearing expiration. Rather than pushing them towards a cross-sell product and risking losing the customer to a competitor, the insurer sends a limited-time policy renewal loyalty offer.
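
As a rough illustration of that prioritization, each mini-segment can be ranked by a simple expected-value score. The segment names and numbers below are hypothetical:

# Rank mini-segments by expected value: size x conversion rate x margin.
# All segment names and figures are invented for illustration.
segments = [
    {"name": "renewal window, high tenure", "size": 12_000, "conv_rate": 0.18, "margin": 240},
    {"name": "cross-sell, new customers", "size": 30_000, "conv_rate": 0.02, "margin": 180},
    {"name": "lapsed, single policy", "size": 8_000, "conv_rate": 0.05, "margin": 150},
]

for s in segments:
    s["expected_value"] = s["size"] * s["conv_rate"] * s["margin"]

for s in sorted(segments, key=lambda s: s["expected_value"], reverse=True):
    print(f'{s["name"]}: ${s["expected_value"]:,.0f}')
# The renewal-window segment tops the list, echoing the insurer example.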

Harnessing customer signals

Customers provide signals of their intent through their online and offline interactions. Mature predictive analytics catch these signals and push them into a workflow for an appropriate follow-up. Each signal nevertheless deserves a response, a timely and relevant trigger message, to close the loop.

Signals & Triggers at work

With advance and meticulous planning, a library of signals and matching triggers has to be maintained and kept up to date. Each trigger, with matching collateral, can then be executed dynamically, and each combination is continually refined and optimized by analytics. Once validated, each becomes a business rule. For example, a leading health insurer learns that a dependent will soon be in the ‘dependent turning 26’ window. The customer and/or dependent promptly receives a triggered message with a limited-time offer for a new personal health insurance policy for the dependent.
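
A minimal sketch of such a signal-and-trigger library in Python (the windows, field names, and offer copy are illustrative assumptions, not a production rules engine):

# A tiny signal -> trigger library: each business rule pairs a predicate
# over the customer record with a matching trigger message.
# Field names, windows, and offer copy are invented for illustration.
TRIGGER_LIBRARY = [
    # 'dependent turning 26' window: final six months of dependent coverage
    (lambda c: c.get("dependent_age", 0) >= 25.5 and c["dependent_age"] < 26,
     "Limited-time offer: a personal health policy for your dependent."),
    # renewal window: policy expires within 30 days
    (lambda c: c.get("days_to_renewal", 999) <= 30,
     "Your policy is nearing expiration. Renew now with a loyalty offer."),
]

def fire_triggers(customer):
    """Return every trigger message whose signal matches this customer."""
    return [msg for rule, msg in TRIGGER_LIBRARY if rule(customer)]

print(fire_triggers({"dependent_age": 25.7, "days_to_renewal": 21}))
# -> both trigger messages fire for this customer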

Pharma & Life sciences regulatory caution is giving way to customer personalization

Regulated industries such as pharma, life sciences, and healthcare have been increasing personalization-related spend. Even with an innate regulation-driven caution, the portfolio trajectory is shifting from clinical-trials data spend to customer/patient experience personalization spend.

Customer data and personalization together aid in the identification of high-value patients, who can then be channeled towards physicians and therapists, realizing a tailored and personalized formulary experience.

Insurers have personalization high on their portfolio spend agenda

Insurers are attaining underwriting efficiencies, realizing the not-so-far-off dream of instant insurance for health, life, and other lines. They now provide population-health efficiencies and genomics-focused underwriting, all a result of data personalization. The race is on and competition is rife. Personalized internal and external customer data, including IoT customer data, can now reside within a CDP or a similar configuration.

Stitching the Personalization DNA within your marketing technology stack or with an addition of a CDP

Achieving growth at scale, across all channels and geographies, requires immense preparation, alignment, governance and, most of all, leadership and talented human capital.

Rerouting existing operational and organizational alignment, restitching operational processes and RPA-driven workflows, and scaling across the company incrementally and globally require a well-articulated blueprint with an equally agile playbook.

New marketing technology configurations equipped with a ‘smart brain’, ready to direct traffic with rules and algorithms, can finally foster a new breed of one-to-one marketing reality.

Avoiding pitfalls and traps

A series of steps may be necessary to arm an incumbent MarTech stack to minimize ROI leakage from new technology investments.

It is easy to rack up a bill in the millions on data integrations that pull all this information together into a data lake. Prioritize the most valuable types of data, the kind that drives the high-value use cases. Identifying the required latency for each data element is critical at this juncture: most use cases require real-time information for only a limited set of data elements, so the rest of the real-time capability can thankfully be deferred.

Discover business use cases which the technology would drive, not vice versa

Buttressed by an agile development process, formulate business use cases supported by KPIs, business drivers, and forecasts. Existing digital analytics will aid in benchmarking, revealing delta variances, and future forecasting. The marketing process is iterative, optimized and improved with each cycle.

Collaborate closely with the Marketing teams to uncover functional requirements

In close collaboration with the marketing operations team, functional requirements must be well articulated and documented. Juxtaposed against the numerous technology solutions and providers that make up the MarTech landscape, this step is a cornerstone effort. Multiple use cases, storyboards, or contexts help identify a missing technology component. Such a good catch may lead you to a different vendor choice when augmenting your technology to fit your personalization goals.

Align organizational needs and goals with a technology approach

Major integrated Marketing Cloud suite vendors, including Oracle, Adobe, and Salesforce, do not support the personalization ecosystem end-to-end. Brands may already have one of these suites in place; in that case, a reasonable compromise is to extend the existing suite with best-of-breed components for each functionality element missing from the MarTech stack. The downside is a less-than-tight integration with the element supporting that specific functionality or service.

A key consideration for the analytics engine: ‘the brain’ should be customizable, including the algorithms, data features, and business rules specific to your requirements. The solution set should give you control over the inputs to the analytics engine.

Build incrementally and via pilots 

A prudent approach is to set your sights on an iterative, incremental value-delivery model supported by well-defined pilots. With each new element added for a new use case, costs rise.

However, the incremental value attained will likely align with the incremental investment, justifying the spend. A phased, stage-gated approach smooths the CFO's decision to fund it.

Design thinking inspired Marketing & IT with Organizational design unlocks ROI

Houston! We have a problem. A note from the Chief Digital Officer

“We have invested heavily in digital marketing technology to enable and fast-track our digital marketing efforts, but the expected ROI is under par, blowing up our forecasts.”

The rise of the millennials and digital disruption continue to refocus attention on customer centricity, customer experience, design thinking, customer journeys, Agile, Lean, and DevOps, bringing efficiency and rapid innovation with the customer at the center.

Well, it is easier said than done. 

Marketers have evolved from the ‘Don Draper’ of ‘Mad Men’ fame, relying on behavior, psychology, and creativity alone. The not-so-new digital marketing world is technology-fueled and data-driven, with a laser-focused eye on campaign optimization and an infusion of nudge marketing. It is daunting for most marketers: a smart marketing-spend strategy, with lists, personas, and segments cleansed and optimized, ready for demand generation, direct response, lead generation, D2C, and omni-channel retail; then filtered by digital identity, delivering personalized experiences, all measured by clicks and tags with KPI-driven dashboards built on the latest BI tools.

Martech & Data contribute to Machine learning, then AI kicks into auto-pilot

The marketing technology works as advertised and is getting better at delivering personalized experiences for prospects and customers which are simple, pointed and perceptive. 

The key enablers to maximize the return lie in plain sight. 

Human/Digital marketing employee – place the Digital marketer in the center

The road from humble, nascent origins to an omni-experiential state of global digital marketing maturity lies more in key human factors that constrain digital marketing technology maturity than in any other. It is the human digital marketing employee psyche that needs assurances and support. The way business units, segments, departments, and cost centers are traditionally set up can also present barriers to personal collaboration.

Fig 1: The mind of a digital marketer when Martech is continually introduced

A Digital Marketer lens POV powered by Design thinking – Abstract

While all the attention is on the ‘customer’, there is a risk of alienating internal customers. These internal customers and partners are very tightly aligned in a mature state of digital marketing. This is low-hanging fruit with a big payoff: establishing close relationships pays off.

Add a 30%+ delta on campaigns through optimization

Technology enablement

Organizational structures for Cost centers, Departments, Segments & BUs

How business units, segments, departments, and cost centers are traditionally set up can at times present barriers to personal collaboration.

Coalesced Workplaces translate into high performing digital marketing teams

Imagine a redesigned open workplace: co-located digital marketing team members encouraging conversations, building and nurturing crucial cross-functional relationships with digital marketing technology, analytics, tagging, data science, optimization, and reporting team members. For high-performing digital marketing teams to prosper, organizational and workplace alignment, with technology and data human capital residing in close vicinity, lends itself to successful digital marketing programs.

Retail is much further along in digital marketing technology maturity

Top digital marketers can visualize complete customer journeys blindfolded. They are drawn to data-driven outcomes, shepherd data integrity with tagging governance, are quick to optimize ongoing campaigns, and work in close quarters with technology partners, product managers, reporting, and data science members.

Legacy Workspaces at odds with Digital Marketing Technology maturity

We could dispense with the cube-driven workspace culture designed for operational efficiency, which limits the cross-functional interaction needed to see the entire marketing campaign journey in 3D and 360º, and eject the rectangular office furniture that projects postured face-offs rather than the desired bi-directional discussion. The legacy workspace is at odds with the vision and desired outcome for success. An ideal co-located workplace lends itself to organic innovation, relationship building, and collaboration while leveraging technology and data together, with common values and goals as denominators.

Leverage common values to enrich relationships with technology enablers

“A ‘pull here in case of an emergency’ approach is best suited for successful digital marketing technology programs”

Where does the Marketing Technology Budget reside?

Embedding a Tech savvy Marketing technologist within the Marketing Cost Center is key

The new CMO is tech-savvy, with a left-brained and right-brained persona. The shift has occurred gradually with smart-spend budgets, mostly digital; now, with ad-tech, OTT, and programmatic spend delivering on KPIs, how media spend is distributed between owned, earned, and paid is even more critical. Organizations use the POEM (paid, owned, and earned media) model to drive brand awareness, distribute content, engage prospects, and guide them through the sales funnel.

Continuous learning is critical for marketers as technology overpowers marketing metrics reliant on personalization, segmentation, identity, and delivering experiences to targeted personas.

Author: S. Ernest Paul – Say Hello – 336.287.1085

Design thinking inspired IIOT Business focused Use Cases propel Marketing success

Digital transformation is built upon business use cases. Selecting and prioritizing them to drive real business impact, deliver a solution for all stakeholders, and begin an era of lasting legacy is a critical first step.

From the outset, it’s essential for business KPIs to lead IIoT transformations, and for each potential use case to be tested against the business value that it is trying to create. In some cases, the best solution may not be a technology play at all.

Design-thinking-inspired IIoT digital transformation does not sideline the excitement of advances in quantum computing or of digital twins running in real time versus legacy simulation, both huge technology wow factors. These advances would deliver far greater value if we included real customers and marketing and sales stakeholders in the product innovation and audience discovery sessions, amplified by design thinking.

It is easy to get excited about the role a digital twin can play and about being at the advent of an impending 5G rollout. But manufacturers should steer away from starting with a technology in mind and then trying to build a business case around it; that sets them up to fail. For marketing and sales to deliver the messaging and articulate each use case to a potential customer, business KPIs must lead the IIoT transformation, with each potential use case tested against the business value it is trying to create.

Selecting Design thinking driven Use Cases with KPI driven business impact are critical before the handoff to Marketing and Sales

The following image, one of my personal favorite creations from when I switch to my marketing outfit, serves as a constant reminder to never forget the customer. After all, for the audience segments that IIoT serves, the beneficiaries and recipients of the advances brought about by digital transformation, we owe responsible stewardship of the message delivered by the marketing and sales team to put a shine on the north star.

Which Social Media Suite Vendors are best of breed?

Marketing Cloud versus pure play social media suites

There is a new breed of social media vendors: pure-play social media suites such as Khoros and Sprinklr. They are not part of the ecosystem of mega Marketing Cloud platforms like Salesforce, Adobe, and Oracle, which tend to serve as enterprise one-stop-shop Marketing Cloud platforms, integrate with other marketing offerings, unify capabilities, and deliver a unified marketing reporting and dashboard experience.

Salesforce started this binge with the acquisition of Radian6, followed by Buddy Media, ExactTarget, and others, to form a ‘Marketing Cloud’ offering with add-ons from third-party vendors served through an app marketplace.

I selected Radian6 for Cigna as an enterprise offering for North America and 14 other countries and rolled it out for various segments of the enterprise, with reporting specific to each segment's needs, such as customer experience, product, security, and so on. I then followed that up with a social media listening Center of Excellence.

Oracle and Adobe soon followed with similar acquisitions, and the Marketing Cloud war was on.

Then niche players emerged, like Crimson Hexagon and Seismos, whose focus was to solve very specific use cases.

Today there are slim and trim versions of social suites, unlike Social Studio (part of the Salesforce Marketing Cloud). Social media is these players' primary focus, and most integrate well with other marketing offerings to deliver unified reporting through tools such as Tableau and Qlik.

Social Suites Must Unify, Consolidate, and Elevate Listening beyond Nascent Core Social Capabilities

Brands prefer a leaner marketing technology stack and continue to pursue social tech consolidation to stitch together the disjointed ecosystem of point solutions. To address these needs, social suites centralize social media capabilities across listening, organic publishing, advertising, customer response, and other secondary social capabilities. These nimble social suite vendors have shored up core capabilities with M&A and improved lackluster areas such as listening. Going forward, these vendors must pursue one of two paths: continue to address critical challenges within social media, or move beyond it to tackle other channels and experiences like customer service or commerce. Regardless of their chosen focus, social suites vendors must deliver on brand customers' desire to unify social media execution and analytics.

As a result of these trends, social suites customers should look for providers that:

  • Aggregate ads (paid), organic (owned), and/or listening (earned) data in one dashboard. Marketers can easily see the content impact of social through number of likes and shares but struggle to measure its marketing or business impact. Social suites provide a more holistic view of social media performance by visualizing paid, owned, and earned data together. Some go a step further in tracking social media activity against the customer lifecycle or sales funnel or against brand health and brand satisfaction. Buyers shouldn’t be satisfied with dashboards displaying standard profile and engagement metrics; instead, they should seek vendors that assess social media programs against business objectives like brand health and sales.
  • Use consumer insights from listening and response to inform marketing decisions. Social listening provides consumer insights to help brands activate, measure, and recalibrate marketing and business programs. Brands also gather critical feedback from social customer service interactions. Consumer insights and customer service feedback provide brands with a rich understanding of the consumers they’re trying to reach. Vendors tightly link their social listening and customer service modules to the rest of the social suite, allowing brands to develop marketing initiatives based on emerging trends and customer feedback. This yields stronger social media programs born out of consumers’ desires and not internal brand motivations.
  • Offer a fully interoperable platform. Social suites have prioritized uniting acquired or organically built capabilities into a single streamlined user interface. But to act on that data, brand customers also need the ability to pull data from one module into another. For example, some vendors use unified social user profiles and integrations with third-party CRM suites to facilitate cohesive customer service and then leverage that profile data for identifying influencers. Several social suites repurpose content and its associated parameters (e.g., audiences, assets, or campaign dates) across the suite in organic publishing, advertising, and user-generated content via a universal tagging or labeling system.

Social Strategy Leaders

Market Presence

Market presence does not necessarily mean top of mind or best of breed; it merely shows market share.

Vendor Profiles

Forrester's analysis uncovered the following strengths and weaknesses of individual vendors.

Leaders

  • Sprinklr competes by offering a formidable and intensely customizable unified platform. This heavyweight vendor, based in New York, presented one of the first broader visions beyond social media: to become a “customer experience management” platform and solve the chaos of using multiple point solutions across digital channels. The vendor’s execution roadmap focuses on solidifying core areas of its platform by adding to its extensive list of channels, use cases, and third-party integrations (though not with other social technology). Armed with new FedRAMP-ready status, Sprinklr is expanding its target market to include public sector organizations. While other vendors go deep in only a few social media modules, Sprinklr delivers across all social media needs at equal depth — listening, customer service, organic publishing, and advertising — plus some secondary social products like influencer management and employee advocacy. AI “smart” features persist across the platform, from autodetecting themes to ensuring compliance against guidelines to recommending customer service responses. Customer references appreciate the new Hyperspace user interface, noting that it’s more intuitive and easier to use than previous iterations. However, the vendor’s pricing model remains a vast array of add-ons and opaque pricing. Sprinklr is ideal for enterprises that have cross-functional needs and can implement a rigorous setup to make the Sprinklr engine run effectively.
  • Khoros differentiates on customer care and sets its sights beyond social media. Khoros continues to make steady progress uniting its legacy Lithium and Spredfast platforms into a single suite — and now has more work ahead to integrate new acquisitions Topbox (now called Khoros CX Insights) for customer experience analytics and Flow.ai (now Khoros Flow) for conversational AI. This vendor, headquartered in Austin, Texas, caters to enterprise buyers with a lofty vision to provide a single unified platform to “connect and improve digital customer experiences.” Khoros’ emphasis on “people first” and building relationships applies to both customers and employees and is notable for its focus on diverse and inclusive hiring efforts, employee resource groups, and corporate social initiatives. Khoros’ deep capabilities in customer care and brand communities set it apart from other vendors in this space. The vendor’s Intelligence module offers light social listening, augmented by an integration with Talkwalker for deeper listening. The Marketing module offers solid organic publishing but with limited advertising options — though it has unified paid and organic reporting. While the Care and Marketing products remain somewhat gated, users are now able to carry customer data and insights across multiple areas of the platform. A unique Vault product controls and locks down platform user access, and Khoros’ personally identifiable information (PII) redaction feature adds to its strengthened consumer privacy practices. Customer references praised the vendor’s “fantastic” and “phenomenal” account management, emphasizing its strength as a partner. For enterprises with diverse needs across departments that don’t necessarily have shared goals, Khoros is a good fit.

Strong Performers

  • Sprout Social’s culture-first mindset drives its strong service and unified social suite. Sprout Social’s emphatic “culture as a business model” go-to-market approach translates into strong account management, employee retention, and overall customer satisfaction. This Chicago vendor has a singular focus on solving social media problems and perfecting its craft there. It has taken a deliberate — and slower — approach to organically build a truly unified social suite that sits on a single code base. Sprout Social opts to hone its core capabilities rather than build or acquire an abundance of new feature and functionality. It also offers a simple and clear pricing model to serve companies ranging from small firms to large enterprises. Newcomers to Sprout Social will find the platform easy to set up and use from the start. Foundational user experience, user management, and collaboration and workflow capabilities are Sprout Social’s strengths, culminating in unified dashboards and reporting. Expansion into social commerce is evident with a soon-to-be generally available Shopify integration in the Inbox module and an Instagram bio link-to-shop feature in the Publishing module. But listening is lighter weight, and advertising is limited compared with other vendors. Bambu, the lone product that sits outside the Sprout Social platform, fulfills employee advocacy programs. Customer references gave unanimous high marks for how Sprout Social manages and services their accounts. For social marketers seeking a unified platform that checks the core boxes and is easy for all to access, Sprout Social delivers.
  • Hootsuite tackles social measurement with a patchwork of new and old acquisitions. Hootsuite’s newest acquisition of Sparkcentral for customer service joins past acquisitions AdEspresso for advertising and LiftMetrix for analytics, plus the already-integrated Brandwatch for listening, in a one-stop shop with separate URLs. Hootsuite hyperfocuses on helping marketers advance their social media maturity and solve the social measurement conundrum with assessment tools, attribution models, and a macro view of social media activity across paid, owned, and earned. Originating in Vancouver, this vendor is garnering positive feedback with its account management and professional services. Hootsuite recently received a new design makeover and continues to refine its legacy publishing and Streams products. It’s working to reconcile dashboards (Analytics vs. Impact), customer service (Inbox vs. Assignments vs. Sparkcentral), and advertising (Ads vs. Publisher) and would benefit from also reconciling listening (Streams vs. Insights) in the platform. Hootsuite also leans on a variety of third-party integrations for ratings and reviews, content discovery, and regulatory needs. The result is a social suite packed with functionality, but disparate and duplicative elements abound. Customer references confirmed that the user experience has improved but still feels disjointed. Hootsuite also offers an Amplify employee advocacy product for companies arming employees with social media content, especially regulated industries or the public sector, with its new FedRAMP certification. Hootsuite’s à la carte menu of social media capabilities is good for social media managers with an array of needs, large and small.
  • Socialbakers, now Emplifi, delivers primarily social marketing and analytics in a clean UI. Astute Solutions acquired Socialbakers, based in Prague, Czech Republic, and rebranded to Emplifi in July 2021 (after the time of this evaluation), setting in motion a three-pronged vision of social marketing, care, and commerce within a customer experience cloud. This vendor’s roadmap tightly aligns to development in those three focus areas. However, during this evaluation, legacy Socialbakers and Astute Solutions have remained separate platforms with different pricing models. For the time being, the former continues to deliver its core listening, marketing, and analytics offerings under a new product name: Emplifi Social Marketing Cloud. Socialbakers impresses with a streamlined user interface and unified social suite that uses labels to carry data across modules for listening, personas, and content. Customer references confirmed that the UI was intuitive and a reason for buying Socialbakers. Unlike other social suites, this vendor focuses on content discovery rather than creation. Social listening is spread across Content, Audiences, Influencers, and Analytics modules and is designed to discover content, audiences, and influencers — though data is disparate, and users can’t view it all in one place. But dashboards do prolifically compare data with previous periods; customer references appreciate this but stated that they lack flexibility and deeper metrics. Also noteworthy: Its benchmarking product uses aggregated Socialbakers brand customer data to index against competitors. Socialbakers is interesting for social marketers who are seeking an array of data visualizations and are eager for Emplifi to integrate care and commerce in the future.

Contenders

  • Falcon.io fortifies listening by adding Brandwatch to its everyday social suite. Two years ago, Cision acquired Falcon.io out of Copenhagen to shore up social media offerings for its PR and communications buyers. This year, Cision acquired Brandwatch, a social listening platform, and quickly fused it with Falcon.io, an original social media management solution, to yield a stronger social suite and target more enterprise prospects. However, the assembled collection — spanning Falcon.io, previously acquired Unmetric for competitive benchmarking, and now Brandwatch — yields a more disjointed social suite than its previous unified platform. Independently, Falcon.io’s vision remains firmly rooted in social media use cases that benefit midmarket customers. Falcon.io’s Listen module is augmented by Brandwatch for deeper listening needs, though users must use a separate URL because the integration isn’t yet accessible from the main navigation. Falcon.io’s Engage module for customer service provides chatbot capabilities and other on-par features. The Publish module offers limited organic publishing functionality, with no ability to post simultaneously on multiple social media platforms. While the Advertise module provides users with some advertising capabilities, it’s restricted to Facebook and Instagram. Falcon.io notably offers its own CRM in its Audiences module and leans on its Unmetric integration for competitive benchmarking. Customer references appreciate the persistent labeling system that enables data interoperability throughout the suite but wish some modules — such as Advertise and Benchmark — were better integrated. Falcon.io is a good fit for midmarket companies or enterprises that need standard social media execution.
  • Meltwater is strong in listening but lacks functional depth in other social media needs. Meltwater aims to connect the worlds of media relations and marketing with a vision of helping brand customers understand, influence, and engage with consumers. The vendor’s niche but targeted approach is a nod to the combined legacies of Meltwater and Sysomos, which Meltwater acquired in 2018. Since then, the company, founded in Oslo, Norway, but based in San Francisco, has continued its active M&A streak. The 2021 additions of Linkfluence, a social listening platform, and Klear, an influencer marketing solution, are helping Meltwater expand to new influencer marketing and consumer insights use cases while reducing its reliance on partnerships and integrations. Meltwater is strongest in its Explore module for listening, with rich data sources that include news and broadcast media, as well as a unique podcast integration that monitors media coverage on audio content. The vendor also offers embedded, white-labeled features, such as audience analysis from Audiense, UGC management from TINT, and data visualizations from Tickr. The Engage module offers organic publishing, customer response, and advertising but lacks some functionality that other vendors provide, such as comprehensive organic and paid post creation and robust customer response. Analyze dashboards aim to bridge Explore and Engage modules by housing owned and earned sentiment, paid data, and competitive benchmarking in a single place. Meltwater is a good fit for PR, corporate communications, and marketing buyers from a wide range of business sizes. Meltwater declined to participate in the full Forrester Wave evaluation process.
  • Facelift keeps consumer privacy top of mind but lags in feature development. Under DuMont Media Group ownership, Facelift’s mission is to reduce complexity; provide rapid time-to-value; and offer a reliable, scalable, and secure platform for its predominantly European customers. Operating out of Hamburg, Germany, the vendor works to enable all departments to use centralized campaign templates at the regional or local level. Facelift has historically stayed in a social-media-only swim lane but is now pushing into broader digital marketing planning orchestration and more aggressive investment in growth. Unlike some other vendors, Facelift’s commercial model is transparent and easy to follow. Facelift Cloud is strongest in organic publishing: The planner tool provides a bird’s-eye view across campaigns, whereas the publisher tool executes the content. The vendor also offers lightweight listening with proprietary Trendwatch and a more sophisticated option via an integration with Talkwalker skinned inside Facelift Cloud. The Moderation module for customer service and the Advertising module (which copies the Facebook Ads Manager interface) both offer basic functionality. Facelift Cloud’s user experience, user management, and collaboration and workflow capabilities strive for uniform efficiency but don’t allow for much customization. Customer references praised Facelift Cloud’s simple and easy-to-use features but expressed a desire for more-flexible dashboards and reports offering more metrics (both in number and relevancy). For European customers that have straightforward hub-and-spoke social marketing needs and seek strong security and data privacy, Facelift is a solid option.

Challengers

  • Reputation manages multilocation brands’ reviews, but social capabilities are nascent. Compared with other vendors in this evaluation, this vendor, based in Redwood City, California, has a unique heritage in ratings and reviews management and serving multilocation companies with decentralized needs. With its recent acquisition of Nuvi for social listening, Reputation aims to present a holistic “reputation experience management” platform enabling continuity from central headquarters to local regions. However, its roadmap focuses on integrating Nuvi into the core Reputation platform and enhancing social media functionality that other social suites may already offer. Reputation’s social suite exists as a single module within the larger reputation management platform, with Nuvi sitting outside the platform for now. Nuvi’s social listening platform offers an endless menu of visualizations with notable emotion and attribute analysis, an improvement over Reputation’s existing listening product. Reputation also offers organic social publishing but leans on its bread and butter: ratings and reviews management and the aptly named proprietary Reputation Score. The vendor’s scant customer service options are split between the social suite module and Nuvi, and it doesn’t offer advertising within the platform. Customer references noted some account management challenges as the company scaled. Reputation is best suited for multilocation brands’ niche reputation experience, reviews management, and social marketing needs.

Evaluation Overview

We evaluated vendors against 36 criteria, which we grouped into three high-level categories:

  • Current offering. Each vendor’s position on the vertical axis of the Forrester Wave graphic indicates the strength of its current offering. Key criteria for these solutions include social listening, social customer response, and social organic publishing.
  • Strategy. Placement on the horizontal axis indicates the strength of the vendors’ strategies. We evaluated product vision, execution roadmap, onboarding and account management, supporting services, performance, and commercial model.
  • Market presence. Represented by the size of the markers on the graphic, our market presence scores reflect each vendor’s revenue and customers.

Vendor Inclusion Criteria

Forrester included nine vendors in the assessment: Facelift, Falcon.io, Hootsuite, Khoros, Meltwater, Reputation, Socialbakers, Sprinklr, and Sprout Social. Each of these vendors has:

  • Annual social suites revenue above $40 million. Each vendor had a social suites revenue of more than $40 million in 2020.
  • Social suites that combine multiple social tech capabilities into a single unified platform. This includes social listening, organic publishing, and customer response, plus at least one other social technology capability (e.g., influencer marketing, social community, or employee advocacy).

We are Digitalbrine, a full-service digital agency and part of the Beyondiris Consulting family.

Feel free to reach out for consulting or social media services.

IIOT – Manufacturing reimagined for Industry 4.0

The COVID-19 pandemic has highlighted how the Industrial IoT (IIoT), or Industry 4.0, can enhance organizational resilience in a state of crisis. Digital management tools and connectivity, for example, have enabled organizations to react to market changes faster and more efficiently.

Industrial IoT, or the Industrial Internet of Things (IIoT), is a vital element. IIoT harnesses the power of smart machines and real-time analysis to make better use of the data that industrial machines have been churning out for years. The principal driver of IIoT is smart machines, for two reasons. First, smart machines capture and analyze data in real time, which humans cannot. Second, smart machines communicate their findings simply and quickly, enabling faster and more accurate business decisions.

Specifically, the market has seen the convergence of information technology (IT) and operational technology (OT) due to advances and synergies between the respective areas. This has resulted in the Industrial Internet of Things (IIoT), which is a solution that collects and centralizes mass amounts of machine data gathered from industrial environments. Applications built on these IoT platforms collect, analyze, and enable you to quickly act on the data to fundamentally boost operational efficiency and production.

Thanks to continuous streams of real-time data, it’s now possible to create a digital twin of virtually any product or process, enabling manufacturers to detect physical issues sooner, predict outcomes more accurately, and build better products.

While its output is a physical object, manufacturing inevitably begins with data during the design phase. That data is communicated to machines that execute designs—the point of transition between the digital and physical worlds. Increasingly, additional data is captured during manufacturing and eventual use of the final product. This data, in turn, can be extremely valuable for informing future designs and modifications, creating a virtuous cycle of innovation and improvement.

Put all those pieces together, and it’s clear that a digital “thread” of data now flows continuously. Aggregated and integrated in real time, it can be used to stitch together the physical and digital worlds, creating a virtual replica of a product or process that can reveal significant new insight. This digital thread can enable the digital twin by providing the data it needs to function.

The digital twin of a complex product such as a jet engine or large mining truck, for example, can monitor and evaluate wear and tear as the equipment is used in the field, potentially leading to design changes over time and informing predictive maintenance. The digital twin of a process can replicate what is happening on the factory floor (Figure 1). Sensors distributed throughout can capture data along a wide array of dimensions, from behavioral characteristics of the production machinery to characteristics of works in progress (thickness, color qualities, hardness, torque, and so on) and environmental conditions within the factory itself. Analyzed over time, these incoming data streams can uncover performance trends, potentially triggering changes to some aspect of the manufacturing process in the physical world.

Technologies enabling digital twins include sensors that measure critical inputs from the physical process or product and its surroundings. Signals from these sensors may be augmented with process-based information from systems such as manufacturing execution systems, ERP systems, CAD models, and supply chain systems. Those data streams are then securely delivered for aggregation and ingestion into a modern data repository, followed by processing and preparation for analytics. Artificial intelligence and other techniques can be used for analysis; the resulting insights can then be fed back to the physical world through decoders and actuators for implementation via additive manufacturing, robotics, or other tools.
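
A highly simplified sketch of that sensor-to-insight feedback loop, with invented torque readings and thresholds, might look like this in Python:

# Skeleton of a digital-twin feedback loop: ingest sensor readings,
# track drift against the as-designed spec, and flag when the physical
# process needs adjusting. All values are invented for illustration.
from statistics import mean

class DigitalTwin:
    def __init__(self, spec_torque_nm, tolerance=0.05):
        self.spec = spec_torque_nm    # as-designed target
        self.tolerance = tolerance    # allowed relative drift
        self.readings = []            # as-manufactured history

    def ingest(self, torque_nm):
        self.readings.append(torque_nm)

    def drift_alert(self, window=5):
        if len(self.readings) < window:
            return None
        avg = mean(self.readings[-window:])
        drift = abs(avg - self.spec) / self.spec
        if drift > self.tolerance:
            return f"Drift of {drift:.1%} vs. spec: adjust the line"
        return None

twin = DigitalTwin(spec_torque_nm=50.0)
for reading in [52.9, 53.4, 53.1, 53.6, 53.2]:  # stream from sensors
    twin.ingest(reading)
print(twin.drift_alert())  # ~6.5% drift -> alert fed back to the floor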

A Real-World Example

An industrial manufacturer was facing numerous quality issues in the field, resulting in costly maintenance and high warranty liability. To address these problems, its engineering and supply network organizations pursued a digital twin approach. First, they combined the as-designed bill of materials (BOM) with all the analogous information produced during manufacturing (also known as the as-manufactured BOM), including procured parts details and assembly details. That step allowed them to run analytics and glean insights into production variations affecting quality. As a result, the team was able to improve the assembly process, reducing rework by 15 to 20 percent.

The manufacturer’s after-sales department is now planning to apply the digital twin process to information from products in the field (the as-maintained BOM) as well to better understand how process variation in field maintenance affects performance and to identify further potential improvements. All in all, capturing a variety of live measurements from the as-designed, as-manufactured, and as-maintained BOMs amounts to a cradle-to-grave digital journey, creating opportunities for better asset availability management, spare parts inventory optimization, predictive maintenance, and services.

“As a result, the team was able to improve the assembly process, reducing rework by 15 to 20 percent.”

The IIoT has already gained traction within countless industries, including manufacturing, food and beverage, oil and gas, healthcare, automotive, and more. For machine builders, it is quickly becoming a business imperative. According to an IDG and Siemens IoT survey, 53 percent of companies have started an IoT initiative. To keep pace with leaders in the industry, you need to start acting now.

How to Conduct a Technical SEO Audit

The technical elements of your website’s SEO are crucial to search performance. Understand and maintain them and your website can rank prominently, drive traffic, and help boost sales. Neglect them, and you run the risk of pages not showing up in SERPs.

In this article, you’ll learn how to conduct a technical SEO audit to find and fix issues in your website’s structure. We’ll look at key ranking factors including content, speed, structure, and mobile-friendliness to ensure your site can be crawled and indexed. 

We’ll also show you the tools you need to boost on-page and off-page SEO efforts and performance, and how to use them.

Using a technical SEO audit to improve your SEO performance 

Think of a technical SEO audit as a website health check. Much like you periodically review your digital marketing campaigns to get the most from them, a technical SEO audit identifies areas for improvement. 

These areas fall into three categories:

1. Technical errors

Identifying red flags in the back end and front end of your website that negatively impact performance and, thus, your SEO. Technical errors include crawling issues, broken links, slow site speed, and duplicate content. We'll look at each of these in this article.

2. UX errors

User experience (UX) tends to be thought of as a design issue rather than an SEO one. However, how your website is structured does impact SEO performance.

To better understand what pages are important and which are lower priority, Google uses an algorithm called Page Importance. 

Page Importance is determined by type of page, internal and external links, update frequency, and your sitemap. More importantly from a UX perspective, however, it’s determined by page position. In other words, where the page sits on your site.

This makes website architecture an important technical SEO factor. The harder it is for a user to find a page, the longer it will take Google to find it. Ideally, a user should be able to get to where they want to be in as few clicks as possible.

A technical SEO audit addresses the issues with site structure and accessibility that prevent users from doing this.

3. Ranking opportunities

Technical SEO is as important as on-page SEO. As well as prioritizing key pages in your site architecture, an audit helps convince Google of a page's importance by:

  • Identifying and merging content targeting the same or similar keywords;
  • Removing duplicate content that dilutes importance; and
  • Improving metadata so that users see what they're looking for in search engine results pages (SERPs).

It’s all about helping Google understand your website better so that pages show up for the right searches.

As with any kind of health check, a technical SEO audit shouldn’t be a one-and-done thing. It should be conducted when your website is built or redesigned, after any changes in structure, and periodically.

The general rule of thumb is to carry out a mini-audit every month and a more in-depth audit every quarter. Sticking to this routine will help you monitor and understand how changes to your website affect SEO performance.

6 tools to help you perform a technical SEO audit 

Here are the SEO tools we’ll be using to perform a technical audit: 

These tools are free, with the exception of Screaming Frog, which limits free-plan users to 500 pages.

If you run a large website with more than 500 pages, Screaming Frog’s paid version offers unrestricted crawling for $149 per year. 

Alternatively, you can use Semrush’s Site Audit Tool (free for up to 100 pages) or Ahrefs Site Audit tool. Both perform a similar job, with the added benefits of error and warning flagging, and instructions on how to fix technical issues.  

1. Find your robots.txt file and run a crawl report to identify errors

The pages on your website can only be indexed if search engines can crawl them. Therefore, before running a crawl report, look at your robots.txt file. You can find it by adding “robots.txt” to the end of your root domain:

https://yourdomain.com/robots.txt 

The robots.txt file is the first file a bot finds when it lands on your site. The information in it tells bots what they should and shouldn't crawl via ‘allow’ and ‘disallow’ directives.

Search crawlers don't crawl certain parts of a site: back-end folders that don't need to be indexed for SEO purposes.

By disallowing them, the website is able to save on bandwidth and crawl budget—the number of pages Googlebot crawls and indexes on a website within a given timeframe.

If you run a large site with thousands of pages, like an ecommerce store, using robots.txt to disallow pages that don’t need indexing will give Googlebot more time to get to the pages that matter. 

Robots.txt also points bots at your sitemap. This is good practice, as your sitemap provides details of every page you want Google and Bing to discover (more on this in the next section).
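
For illustration, a minimal robots.txt following this pattern might look like the snippet below (the disallowed paths are placeholders):

User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: https://yourdomain.com/sitemap.xml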

Look at your robots.txt to make sure crawlers aren’t crawling private folders and pages. Likewise, check that you aren’t disallowing pages that should be indexed.

If you need to make changes to your robots.txt, you'll find it in the root directory of your web server (if you're not accustomed to these files, it's worth getting help from a web developer). If you use WordPress, the file can be edited using the free Yoast SEO plugin. Other CMS platforms like Wix let you make changes via built-in SEO tools.

Run a crawl report to check that your website is indexable

Now that you know bots are being given the correct instructions, you can run a crawl report to check that pages you want to be indexed aren’t being hampered.

Enter your URL into Screaming Frog, or go to Index > Coverage in Google Search Console.

Each of these tools will display metrics in a different way. 

Screaming Frog looks at each URL individually, splitting indexing results into two columns:

1. Indexability: This shows whether a URL is indexable or non-indexable

2. Indexability status: The reason why a URL is non-indexable 

The Google Search Console Index Coverage report displays the status of every page of your website. 

Google Search Console

The report shows:

  • Errors: Redirect errors, broken links, and 404s
  • Valid with warnings: Pages that are indexed but with issues that may or may not be intentional
  • Valid: Successfully indexed pages
  • Excluded: Pages excluded from indexing due to reasons such as being blocked by the robots.txt or redirected

Flag and fix redirect errors to improve crawling and indexing

All pages on your website are assigned an HTTP status code. Each code relates to a different function.

Screaming Frog displays these in the Status Code column: 

Screaming Frog Indexability Report

All being well, most of the pages on your website will return a 200 status code, which means the page is OK. Pages with errors will display a 3xx, 4xx, or 5xx status code. 

Here’s an overview of codes you might see in your audit and how to fix the ones that matter:

3xx status codes

  • 301: Permanent redirect. Content has been moved to a new URL and SEO value from the old page is being passed on.

301s are fine, as long as there isn't a redirect chain or loop causing multiple redirects. For example, if redirect A goes through B and C to reach D, it makes for a poor user experience and slow page speed, which can increase bounce rate and hurt conversions. To fix the issue, delete redirects B and C so that A points directly to D (the script after this status-code overview shows one way to surface such chains).

By going to Reports > Redirect Chains in Screaming Frog, you can download the crawl path of your redirects and identify which 301s need removing. 

  • 302: Temporary redirect. Content has been moved to a URL temporarily.

302s are useful for purposes such as A/B testing, where you want to trial a new template or layout. However, if a 302 has been in place for longer than three months, it's worth making it a 301.

  • 307: Temporary redirect due to change in protocol from the source to the destination. 

This redirect should be used if you’re sure the move is temporary and you’ll still need the original URL. 

4xx status codes

  • 403: Access forbidden. This tends to display when content is hidden behind a login.
  • 404: Page doesn’t exist due to a broken link or when a page or post has been deleted but the URL hasn’t been redirected.

Like redirect chains, 404s don't make for a great user experience. Remove any internal links pointing at 404 pages and update them with the correct, redirected internal link.

  • 410: Page permanently deleted.

Check any page showing a 410 error to ensure it is permanently gone and that none of its content warrants a 301 redirect.

  • 429: Too many server requests in a short space of time.

5xx status codes 

All 5xx status codes are server-related. They indicate that the server couldn’t fulfill a request. While these do need attention, the fix usually lies with your hosting provider or web developer, not your website’s content.

Set up canonical tags to point search engines at important pages

Canonical meta tags appear in the <head> section in a page’s code.

<link rel="canonical" href="https://www.yourdomain.com/page-abc/" />

They exist to let search engine bots know which page to index and display in SERPs when you have pages with identical or similar content.

For example, say an ecommerce site was selling a blue toy police car and that was listed under “toys > cars > blue toy police car” and “toys > police cars > blue toy car”. 

It’s the same blue toy police car on both pages. The only difference is the breadcrumb links that take you to the page. 

By adding a canonical tag that points to the “master page” (toys > cars), you signal to search engines that this is the original product. The product listed at “toys > police cars > blue toy car” is a copy.

Another example of when you’d want to add canonical tags is where pages have added URL parameters. 

For instance, “https://www.yourdomain.com/toys” would show similar content to “https://www.yourdomain.com/toys?page=2” or “https://www.yourdomain.com/toys?price=descending”, URLs that have been used to filter results.

Without a canonical tag, search engines would treat each page as unique. Not only does this mean having multiple pages indexed, diluting the SEO value of your master page, but it also wastes your crawl budget.
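To consolidate them, each parameter URL would carry a canonical tag in its <head> pointing back at the main page, along these lines:

<link rel="canonical" href="https://www.yourdomain.com/toys" />

Placed on “https://www.yourdomain.com/toys?page=2” and “https://www.yourdomain.com/toys?price=descending”, this tells search engines to credit the main /toys page.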

Canonical tags can be added directly to the <head> section in the page code of the additional pages (not the main page) or, if you’re using a CMS such as WordPress or Magento, via plugins like Yoast SEO that make the process simple.

2. Review your site architecture and sitemap to make content accessible

Running a site crawl helps to address most of the technical errors on your website. Now we need to look at UX errors. 

Content Writing for SEO – S. Ernest Paul

As we mentioned at the top, a user should be able to get to where they want to be on your site in a few clicks. An easier human experience is synonymous with an easier search bot experience (which, again, saves on crawl budget).

As such, your site structure needs to be logical and consistent. This is achieved by flattening your website architecture.

Here are examples of complicated site architecture and simple (flat) site architecture from Backlinko’s guide on the topic: 

You can see how much easier it is in the second image to get from the homepage to any other page on the site.

The closer a page is to your homepage, the more important it is. Therefore, you should look to regroup pages based on keywords to bring those most relevant to your audience closer to the top of the site.

A flattened website architecture should be mirrored by its URL structure. 

To create a consistent SEO strategy and organize the relationship between pieces of content, use the hub-and-spoke method.

S. Ernest Paul describes this method as “an internal linking strategy that involves linking several pages of related content (sometimes referred to as “spoke” pages) back to a central hub page.”

Courtesy: Portent

The “Conversion rate optimization” guide is the hub; “21 Key Intake Fields for Content Writers” by S. Ernest Paul is a spoke.

Depending on the size of your website, you may need help from a web developer to flatten the architecture and overhaul navigation. However, you can improve user experience easily by adding internal links to relevant pages.

Organize your sitemap to reflect your website structure

The URLs that feature on your site should match those in your XML sitemap. This is the file that you should point bots to in your robots.txt as a guide to crawl your website.

As with robots.txt, you can find your XML sitemap by adding “/sitemap.xml” to the end of your root domain:

https://yourdomain.com/sitemap.xml 
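If you’ve never looked inside one, a bare-bones sitemap follows the sitemaps.org protocol and looks something like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/toys</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>

Each <url> entry lists one page you want indexed; optional fields like <lastmod> help bots prioritize recently updated content.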

If you’re updating your site architecture, your sitemap will also need updating. On WordPress, plugins like Yoast SEO or Google XML Sitemaps can generate and automatically update a sitemap whenever new content is created. Other platforms, like Wix and Squarespace, have built-in features that do the same.

If you need to do it manually, a tool like XML-sitemaps.com will automatically generate an XML sitemap that you can upload to your website’s root (/) folder. However, you should only do this if you’re confident handling these files. If not, get help from a web developer.

Once you have your updated sitemap, submit it at Index > Sitemaps in the Google Search Console. 

From here, Google will flag any crawlability and indexing issues.

Working sitemaps will show a status of “Success”. If the status shows “Has errors” or “Couldn’t fetch”, there are likely problems with the sitemap’s content.

As with your robots.txt file, your sitemap should not include any pages that you don’t want to feature in SERPs. But it should include every page you do want indexed, exactly as it appears on your site.

For example, if you want Google to index “https://yourdomain.com/toys”, your sitemap should list that URL exactly, including the HTTPS protocol. Entries like “http://yourdomain.com/toys” or “/toys” won’t match, and those pages may not be crawled.

3. Test and improve site speed and mobile responsiveness

Site speed has long been a factor in search engine rankings. Google first confirmed as much in 2010. In 2018, they upped the stakes by rolling out mobile page speed as a ranking factor in mobile search results. 

When ranking a website based on speed, Google looks at two data points:

1. Page speed: How long it takes for a page to load

2. Site speed: The average time it takes for a sample of pageviews to load

When auditing your site, you only need to focus on page speed. Improve page load time and you’ll improve site speed. Google helps you do this with its PageSpeed Insights analyzer. 

Enter a URL and PageSpeed Insights will grade it from 0 to 100. The score is based on real-world field data gathered from Google Chrome browser users and lab data. It will also suggest opportunities to improve.

Poorly optimized images, JavaScript and CSS files, and browser caching practices tend to be the culprits behind slow-loading pages. Fortunately, these are easy to improve:

  • Reduce the size of images without impacting on quality with Optimizilla or Squoosh. If you’re using WordPress, optimization plugins like Imagify Image Optimizer and TinyPNG do the same job. 
  • Reduce JavaScript and CSS files by pasting your code into Minify to remove whitespace and comments (see the short example after this list)
  • If you’re using WordPress, use W3 Total Cache or WP Super Cache to create and serve a static version of your pages to searchers, rather than having the page dynamically generated every time a person clicks on it. If you’re not using WordPress, caching can be enabled manually in your site code.
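To show what minification actually does, here’s a before-and-after sketch of a single CSS rule (exact output varies by tool):

/* Before: readable, with a comment and whitespace */
.btn {
  color: #ffffff;
  background-color: #0055ff;
}

/* After: what a minifier might output */
.btn{color:#fff;background-color:#05f}

The styling is identical; the minified version simply ships fewer bytes to the browser.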

Start by prioritizing your most important pages. Go to Behavior > Site Speed in Google Analytics to see how specific pages perform across different browsers and countries:

Check this against your most viewed pages and work through your site from the top down.

How to find out if your website is mobile-friendly

In March 2021, Google completed its switch to mobile-first indexing for all websites. This means that the pages Google indexes are based on the mobile version of your site. Therefore, the performance of your site on smaller screens will have the biggest impact on where your site appears in SERPs.

Google’s Mobile-Friendly Test tool is an easy way to check if your site is optimized for mobile devices:  

If you use a responsive or mobile-first design, you should have nothing to worry about. Both are developed to render on smaller screens and any changes you make as a result of your technical SEO audit will improve site and search performance across all devices.   

You can test your site on real devices using BrowserStack’s responsive tool.

Standalone mobile sites should pass the Google test, too. Note that separate sites for mobile and desktop will require you to audit both versions.

Another option for improving site speed on mobile is Accelerated Mobile Pages (AMP). AMP is a Google-backed project designed to serve users stripped-down versions of web pages so that they load faster than standard HTML pages.

Google has tutorials and guidelines for creating AMP pages using code or a CMS plugin. However, it’s important to be aware of how these will affect your site. 

Every AMP page you create is a new page that exists alongside the original. Therefore, you’ll need to consider how they fit into your URL scheme. Google recommends using the following URL structure:

http://www.example.com/myarticle/amp 

http://www.example.com/myarticle.amp.html

You’ll also need to ensure that canonical tags are used to identify the master page. This can be the AMP page, but the original page is preferred. This is because AMP pages serve a basic version of your webpage that doesn’t allow you to earn ad revenue or access the same deep level of analytics.
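In practice, the pairing is declared with two link tags, sketched here using the example URLs above: the original page advertises its AMP version, and the AMP page points its canonical back at the original:

<!-- On the original page -->
<link rel="amphtml" href="http://www.example.com/myarticle/amp">

<!-- On the AMP page -->
<link rel="canonical" href="http://www.example.com/myarticle">

This way search engines know the two URLs carry the same content and attribute ranking signals to the page you designate.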

AMP pages will need to be audited in the same way as HTML pages. If you’re a paid subscriber, Screaming Frog has features to help you find and fix AMP issues. You can do this in the free version, but you’ll need to upload your list of pages. 

4. Find and fix duplicate content and keyword cannibalization issues to fine-tune SEO

By this stage, your content audit has already begun. Adding canonical tags ensures master pages are being given SEO value over similar pages. Flattened site architecture makes your most important content easy to access. What we’re looking to do now is fine-tune. 

Review your site for duplicate content

Pages that contain identical information aren’t always bad. The toy police car pages from our earlier example, for instance, are necessary for serving users relevant results.

They become an issue when you have an identical page to the one you’re trying to rank for. In such cases, you’re making pages compete against each other for rankings and clicks, thus diluting their potential.

As well as product pages, duplicate content issues can occur for several reasons:

  • Reusing headers, page titles, and meta descriptions, making pages appear identical even if the body content isn’t
  • Not deleting or redirecting identical pages used for historical or testing purposes
  • Not adding canonical tags to a single page with multiple URLs

A site crawl will help identify duplicate pages. Check content for duplication of:

  • Titles
  • Header tags
  • Meta descriptions
  • Body content

You can then either remove these pages or rewrite the duplicated elements to make them unique. 

Merge content that cannibalizes similar keywords

Keyword cannibalization is like duplicate content in that it forces search engines to choose between similar content. 

It occurs when you have various content on your site that ranks for the same query. Either because the topic is similar or you’ve targeted the same keywords.  

For example, say you wrote two posts. One on “How to write a resume” optimized for the phrase “how to write a resume” and the other on “Resume writing tips” optimized for “resume writing.”

The posts are similar enough for search engines to have a hard time figuring out which is most important.    

Googling “site:yourdomain.com” plus your target keyword will help you easily find out if keyword cannibalization is a problem.

If your posts are ranking #1 and #2, it’s not a problem. But if your content is ranking further down the SERPs, or an older post is ranking above an updated one, it’s probably worth merging them:

  1. Go to the Performance section of your Google Search Console. 
  2. From the filters click New > Query and enter the cannibalized keyword. 
  3. Under the Pages tab, you’ll be able to see which page is receiving the most traffic for the keyword. This is the page that all others can be merged into.

For example, “How to write a resume” could be expanded to include resume writing tips and become a definitive guide to resume writing.

It won’t work for every page. In some instances, you may want to consider deleting content that is no longer relevant. But where keywords are similar, combining content will help to strengthen your search ranking.

Improve title tags and meta descriptions to increase your click-through rate (CTR) in SERPs

While meta descriptions aren’t a ranking factor, there’s no denying that titles and descriptions make a difference to your curb appeal. They’re essentially a way to advertise your content.

Performing a technical SEO audit is the ideal time to optimize old titles and descriptions, and fill in any gaps to improve CTR in SERPs.

Titles and descriptions should be natural, relevant, concise, and employ your target keywords. Here’s an example from the search result for Copyhackers’ guide to copywriting formulas:

The meta description tells readers they’ll learn why copywriting formulas are useful and how they can be applied in the real world.

While the title is also strong from an SEO perspective, it’s been truncated. This is likely because it exceeds Google’s 600-pixel limit. Keep this limit in mind when writing titles. 

Include keywords close to the start of titles and try to keep characters to around 60. Moz research suggests you can expect ~90% of your titles to display properly if they are below this limit.

Similarly, meta descriptions should be approximately 155-160 characters to avoid truncation.
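Putting both limits together, a hypothetical title and description pair might look like this (the wording is illustrative only):

<title>Technical SEO Audit: A Step-by-Step Guide</title>

<meta name="description" content="Learn how to run a technical SEO audit: crawl your site, fix redirects, speed up pages, and improve click-through rates in the SERPs.">

The title comes in well under 60 characters with the keyword up front, and the description sits below the ~155-character truncation threshold.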

It’s worth noting that Google won’t always use your meta description. Depending on the search query, they may pull a description from your site to use as a snippet. That’s out of your control. But if your target keywords are present in your meta tags, you’ll give yourself an edge over other results that go after similar terms. 

Conclusion

Performing a technical SEO audit will help you analyze technical elements of your website and improve areas that are hampering search performance and user experience. 

But following the steps in this article will only resolve any problems you have now. As your business grows and your website evolves, and as Google’s algorithms change, new issues with links, site speed, and content will arise. 

Therefore, technical audits should be part of an ongoing strategy, alongside on-page and off-page SEO efforts. Audit your website periodically, or whenever you make structural or design updates to your website.

Courtesy: CXL

The emotions generated by colors

The emotional associations of colors, which brands now use to influence and nudge, all originated with flowers and dyes.

The ‘Temperamenten-Rose’ compiled by Goethe and Schiller in 1798/9.


The diagram matches twelve colors to human occupations or their character traits, grouped in the four temperaments:

  • choleric (red / orange / yellow): tyrants, heroes, adventurers
  • sanguine (yellow / green / cyan): hedonists, lovers, poets
  • phlegmatic (cyan / blue / violet): public speakers, historians, teachers
  • melancholic (violet / magenta / red): philosophers, pedants, rulers

Google delays cookie removal until 2023

Yesterday, Google announced an updated timeline for its Privacy Sandbox milestones; in its blog post are two major announcements that marketers should focus on:

  1. Google says it’s planning to develop a more rigorous process to test and deploy Privacy Sandbox proposals across various use cases, like ad measurement, targeting, and fraud detection. The goal is to deploy these by late 2022, help scale adoption, and only then start to deprecate third-party cookies. Under this plan, 3P cookies will be phased out over a three-month period in 2023.
  2. Google is concluding the first (current) trial for Federated Learning of Cohorts (FLoC). It received feedback on the first implementation of FLoC and intends to incorporate that into future testing. (The FLoC test has faced some challenges, from its use by advertising technology vendors to build persistent profiles to its inability to be used by marketers in regulated industries.)

Google also indicated that these changes will allow for “…public discussion on the right solutions, continued engagement with regulators, and for publishers and the advertising industry to migrate their services.”

Marketers should not take this announcement as a signal to ease up on their preparations for a future without third-party cookies. Google continues to evolve its plans, and this likely won’t be the last time the company adjusts its timeline.

So, don’t let this distract you from the larger context of the moment: As an industry, we are transitioning away from opaque consumer data collection and usage and toward a choice-driven, transparent, and privacy-friendly future. Marketers must:

  • Continue to future-proof current targeting, digital media buying, and measurement strategies. Keep testing contextual advertising, first-party-based targeting, and cleanly sourced second-party audience segments using Forrester’s “The Future Of Audience Targeting” research as your guide.
  • Talk to your technology and services partners about how they are preparing for a data-deprecated future. Why do they believe their proposed approaches are sustainable, and what steps do they suggest you take on your own data deprecation journey?
  • Keep investing in your first- and zero-party data assets. Identify moments in the customer journey where you can collect behaviors, preferences, context, and intentions. Then, ensure that you’re optimizing opportunities to use that data with an eye toward delivering a valuable consumer experience.

Google said the delay would give it more time to get publishers, advertisers and regulators comfortable with the new technologies it is developing to enable targeted ads after cookies are phased out.

“While there’s considerable progress with this initiative, it’s become clear that more time is needed across the ecosystem to get this right,” Google said.

Google’s decision reflects the challenges tech giants face as they try to address demands for stronger user-privacy protections without rattling the $455 billion online-ad ecosystem or inviting complaints that they are giving themselves special advantages. Apple Inc. has rolled out several major privacy updates for its devices this year, including a requirement that all apps get users’ permission to track them. Google and Apple have each faced complaints from the ad industry that the changes they’re making will strengthen their own ad businesses.

Earlier this week, the European Union said it is investigating Google’s plan to remove cookies as part of a wide-ranging inquiry into allegations that Google has abused its prominent role in advertising technology.

Google has separately pledged to give the U.K.’s competition watchdog at least 60 days’ notice before removing cookies to review and potentially impose changes to its plan, as part of an offer to settle a similar investigation. That probe stemmed from complaints that Chrome’s removal of cookies would give an advantage to ads on Google’s own products, like YouTube or Search, where Google will still be able to do individual-level targeting.

In the U.S., Google’s cookie-replacement plan was raised in a December antitrust lawsuit against the company brought by Texas and nine other U.S. states.

Google plays a central role in the online advertising ecosystem. It owns the dominant tools used to broker the sale of ads across the web. Cookies, small bits of code stored in web browsers to track users across the web, are widely used in the industry, including in Google’s Chrome browser, which has 65% of the market globally, according to Statcounter.

Google’s delay was met with relief by advertisers and publishers, who will have more time to test and adapt to the technology that replaces cookies. Ellie Bamford, global head of media at RGA, a digital ad firm owned by Interpublic Group of Cos., said Google “underestimated the fear that marketers had about what this would mean and the level of preparedness marketers need to have.”

Paul Bannister, chief strategy officer at blog network CafeMedia, said that since the vast majority of digital advertising is powered by cookies, “it’s critical that the replacement technologies get things right. It’s also critical to make sure that even more money doesn’t go to the tech giants in the process.”

Google has been testing several new tools to replace various functions of third-party cookies, as part of what it calls a privacy sandbox. The first such replacement technology, dubbed federated learning of cohorts, or Floc, is intended to allow advertisers to target cohorts of users with similar interests, rather than individuals, in order to protect their privacy.

But early technical testing of Floc, which began in April, has been slow. Initially, Google indicated it would allow advertisers to purchase ads for Floc in the second quarter as part of Google’s tests. Google later shifted that time frame to the third quarter, ad executives said.

Ad-industry players have also expressed skepticism about Google’s claims that targeting ads with Floc is at least 95% as effective as cookie-based targeting. Google has “struggled to build confidence in Floc,” said Jayna Kothary, global chief technology officer at MRM, a marketing agency that is part of Interpublic Group of Cos. Most advertisers don’t believe Floc is 95% as effective as cookies and “the early experiments haven’t proven this yet,” she said.

Google engineer Michael Kleber said at a developer conference in mid-May that the company is working out answers to how Floc should eventually work.

“We don’t have that ready yet because we don’t know what the answers are,” Mr. Kleber said. He added that everything “about how Floc works is very much subject to change.” The acronym Floc was chosen to reflect a flock of birds, Mr. Kleber said.

Google said Thursday it has “received substantial feedback from the web community” during the initial testing of Floc.

The company also said it plans to complete testing of all of its new cookie-replacement technologies, and integrate them into Chrome before late 2022. Then digital publishers and the digital advertising industry will have a nine-month period to migrate their services, during which time Google will monitor adoption and feedback.

The final phaseout of cookies will happen over three months in late 2023, the company said, adding that it will publish a more detailed timeline.

Two rival web browsers that promote privacy, Mozilla’s Firefox and Brave, have said they aren’t supporting Floc. Some prominent websites have debated whether to opt out of using the system. And the Electronic Frontier Foundation, a digital rights group, says Floc could be misused to help with device fingerprinting, a technique to identify specific web browsers without relying on cookies. That could potentially reveal sensitive information gleaned from web browsing, despite safeguards Google says it’s building, the rights group says.

On Thursday, Google said it is making progress in its work on technologies to hinder device fingerprinting via Chrome, including by reducing how much technical information a Chrome browser provides to websites it visits.

A Google spokesman declined to comment further.

Brian Lesser, chief executive of InfoSum Ltd., a data services company, and former chief executive of AT&T Inc.’s digital-ad company Xandr, said Google’s “intentions are noble in the sense that they want to protect consumer data. Floc is one idea and I think it needs to exist within a range of different alternatives to cookies.”

Medallia leads the pack in Customer Experience martech

Medallia Experience Cloud is a customer feedback management (CFM) solution that consolidates real-time data into a single platform that can be customized and scaled for each unique business unit. To better understand the benefits, costs, and risks associated with Medallia Experience Cloud, Forrester Consulting conducted a Total Economic Impact™ (TEI) study based on interviews with six customers using Medallia Experience Cloud as their customer feedback management platform. This summary is based on a full TEI study, which can be downloaded here.

Based on the TEI analysis, a representative organization deploying Medallia Experience Cloud has experienced a quick payback period with the following three-year financial impact: $35.6 million in benefits and $5.1 million in costs, a net present value (NPV) of $30.4 million, and an ROI of 591%. Readers can use this representative organization to understand the economic impact of deploying Medallia Experience Cloud and apply or adapt it to their own situation and experience.

Quantified Benefits 

The following risk-adjusted quantified benefits are representative of those experienced by the companies interviewed:

Improved customer experience leading to an increase in net income of $20.1 million. Organizations were able to meet the needs of today’s customers by making product and channel improvements informed by real-time customer feedback data from the Medallia Experience Cloud. These improvements drove growth through an increase in both customer retention and average basket size.

Operational efficiencies resulting in savings of $13.8 million. Organizations were able to improve organizational operations by aligning business and strategic initiatives with the insights gained from the Medallia Experience Cloud. Additionally, call volume to service desks was significantly reduced because the platform enabled organizations to systematically identify and reduce customer pain points.

Cost avoidance of the previous solution. Organizations avoided the cost of running and maintaining legacy solutions by moving to Medallia’s cloud-based platform.

Unquantified Benefits 

Examples of additional benefits that the interviewed organizations mentioned as significant but were not quantified for this study:

Faster closed loop cycle. Organizations were able to more quickly close the loop with customers through preferred channels and recognized a positive impact on both customer churn and employee morale.

A shift in overall organizational culture towards CX. Employee access to real-time customer feedback information brings the customer experience to life and helps keep the entire organization focused on meeting customer needs and expectations.

Key Investment Drivers And Results 

S. Ernest Paul

Organizations shared the following challenges prior to Medallia Experience Cloud: 

Inability to provide meaningful CX insights. Surveys and other solutions provide data points but do not facilitate the insights or analytics necessary to understand and meet the needs of today’s customers.

Inability to effectively handle organizational scale. Legacy vendors and solutions could not reliably process the necessary volume of data quickly or effectively, preventing organizations from moving quickly to reduce pain points and incorporating customer feedback into strategic initiatives.

Organizations achieved key results with Medallia Experience Cloud: 

S. Ernest Paul

Develop actionable insights to improve CX and drive product improvements. A VP of customer insights in the telecommunications industry stated: “We were able to reduce the number of calls to our service desk, because we’ve addressed and eliminated many of the reasons, the root causes, of why people were calling us: technical reliability, product functionality, billing issues. This has categorically changed the game for us.”

Effective scaling and flexibility. Medallia’s experience with global deployments and large enterprises made the implementation and scaling an easy, collaborative process.

Drive a shift in organizational culture. The ability for a single platform to collect all CFM data, its accessibility to anyone in the organization, and the direct connection to actions for closed loop feedback drives a shift in organizational culture towards meeting and exceeding customer expectations.

Composite Organization 

Based on the interviews, Forrester constructed a TEI framework, a composite company, and an ROI analysis that illustrates the areas financially affected, covered in greater detail in the full study. The composite organization has the following characteristics:

Description of composite. The composite is a global conglomerate with $9 billion in annual revenue, growing at a rate of 2% year-over-year (YOY) prior to its investment in Medallia. Its main revenue streams include membership fees and 750 retail stores, along with other B2B and B2C lines of business (LOBs). It has 2 million total customers and 18,000 employees, including 750 key accounts with 150 account managers, along with 4,500 contact center agents.

Deployment characteristics. The global conglomerate composite organization has deployed the Medallia Experience Cloud for transactional and relationship surveys, both internally facing (employees) and externally facing (customers). The composite organization initially rolled out Medallia’s Best Practices Package for Retail to its 750 stores. For Years 2 and 3, the program expanded to cover the full enterprise, including the 4,500 contact center agents and 150 key accounts.

Economic Impact

Increased income due to improved customer experience. Using customer feedback obtained through the Medallia platform, organizations were able to make product improvements leading to additional sales, and website improvements to remove pain points and barriers to purchase. Organizations were able to improve overall NPS and realized an increase in customer retention rate as well as increased average basket size per customer, leading to an increase in income of $20.1 million.

Operational efficiencies represent $13.8 million in savings. Unification of data across the organization and throughout the customer lifecycle allowed organizations to better define strategic initiatives, better focus resources on those initiatives, and reduce overall service desk tickets by analyzing customer feedback to reduce pain points.

Cost avoidance of the previous solution. Organizations noted a total of $1.7 million in cost savings related to licensing and management of the previous CFM solution.

Unquantified benefits. These are some of the benefits not quantified in the financial analysis but mentioned as significant.

Faster closed loop cycle. A senior director of customer insights in the retail industry told Forrester: “Medallia’s closed loop feedback mechanism has proven to be effective, along with the simplicity of the dashboards.”

A shift in organizational culture towards CX. Exposing employees to real-time customer feedback promotes a deeper understanding of customer needs and expectations, and enables organizations to introduce solutions that have a meaningful impact on customer experience.

Medallia does not come cheap

S. Ernest Paul

Costs 

The composite organization experienced two cost categories associated with the Medallia Experience Cloud investment. Over three years, the composite organization expects risk-adjusted total costs to have a PV of $5.1 million.

Medallia costs of $2.6 million. The organization paid Medallia for an annual software subscription, ongoing managed services, and implementation services for initial and Year 1 costs.

Internal costs of $2.6 million. These include day-to-day management, implementation costs, and training associated with the new platform.

The costs could go up if the add-ons are utilized.

The value of flexibility is clearly unique to each customer, and the measure of its value varies from organization to organization. There are multiple scenarios in which a customer might choose to implement Medallia Experience Cloud and later realize additional uses and business opportunities, for example:

Ability to adapt the CFM solution to fit evolving needs. Medallia is willing to work with clients to find solutions or benchmarks for unique business challenges.

A/B testing. The Medallia Experience Cloud allows customers to quickly A/B test customer experience solutions or pilot business process changes before rollout.

Text analytics search. Changes in customer sentiment can occur quickly, and the text analytics feature allows organizations to identify and adapt to unexpected developments.

Financial Summary 

Flexibility, as defined by TEI, represents an investment in additional capacity or capability that could be turned into business benefit for a future additional investment. This provides an organization with the “right” or the ability to engage in future initiatives, but not the obligation to do so.

Summary

The financial results calculated in the Benefits and Costs sections can be used to determine the ROI, NPV, and payback period for the composite organization’s investment in Medallia Experience Cloud. Forrester assumes a yearly discount rate of 10% for this analysis.
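As a back-of-the-envelope check on those figures (using the rounded numbers reported above, not Forrester’s full model):

ROI = (PV of benefits − PV of costs) / PV of costs
    = ($35.6M − $5.1M) / $5.1M
    ≈ 598%

The small gap versus the reported 591% comes from rounding in the published inputs.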

COURTESY: FORRESTER

The Big Healthcare Fix

In health care, the days of business as usual are over. Around the world, every health care system is struggling with rising costs and uneven quality despite the hard work of well-intentioned, well-trained clinicians. Health care leaders and policy makers have tried countless incremental fixes—attacking fraud, reducing errors, enforcing practice guidelines, making patients better “consumers,” implementing electronic medical records—but none have had much impact.

It’s time for a fundamentally new strategy.

S. Ernest Paul

At its core is maximizing value for patients: that is, achieving the best outcomes at the lowest cost. We must move away from a supply-driven health care system organized around what physicians do and toward a patient-centered system organized around what patients need. We must shift the focus from the volume and profitability of services provided—physician visits, hospitalizations, procedures, and tests—to the patient outcomes achieved. And we must replace today’s fragmented system, in which every local provider offers a full range of services, with a system in which services for particular medical conditions are concentrated in health-delivery organizations and in the right locations to deliver high-value care.

Making this transformation is not a single step but an overarching strategy. We call it the “value agenda.” It will require restructuring how health care delivery is organized, measured, and reimbursed. In 2006, Michael Porter and Elizabeth Teisberg introduced the value agenda in their book Redefining Health Care. Since then, through our research and the work of thousands of health care leaders and academic researchers around the world, the tools to implement the agenda have been developed, and their deployment by providers and other organizations is rapidly spreading.

The transformation to value-based health care is well under way. Some organizations are still at the stage of pilots and initiatives in individual practice areas. Other organizations, such as the Cleveland Clinic and Germany’s Schön Klinik, have undertaken large-scale changes involving multiple components of the value agenda. The result has been striking improvements in outcomes and efficiency, and growth in market share.

There is no longer any doubt about how to increase the value of care. The question is, which organizations will lead the way and how quickly can others follow? The challenge of becoming a value-based organization should not be underestimated, given the entrenched interests and practices of many decades. This transformation must come from within. Only physicians and provider organizations can put in place the set of interdependent steps needed to improve value, because ultimately value is determined by how medicine is practiced. Yet every other stakeholder in the health care system has a role to play. Patients, health plans, employers, and suppliers can hasten the transformation—and all will benefit greatly from doing so.

Defining the Goal

The first step in solving any problem is to define the proper goal. Efforts to reform health care have been hobbled by lack of clarity about the goal, or even by the pursuit of the wrong goal. Narrow goals such as improving access to care, containing costs, and boosting profits have been a distraction. Access to poor care is not the objective, nor is reducing cost at the expense of quality. Increasing profits is today misaligned with the interests of patients, because profits depend on increasing the volume of services, not delivering good results.

In health care, the overarching goal for providers, as well as for every other stakeholder, must be improving value for patients, where value is defined as the health outcomes achieved that matter to patients relative to the cost of achieving those outcomes. Improving value requires either improving one or more outcomes without raising costs or lowering costs without compromising outcomes, or both. Failure to improve value means, well, failure.

Embracing the goal of value at the senior management and board levels is essential, because the value agenda requires a fundamental departure from the past. While health care organizations have never been against improving outcomes, their central focus has been on growing volumes and maintaining margins. Despite noble mission statements, the real work of improving value is left undone. Legacy delivery approaches and payment structures, which have remained largely unchanged for decades, have reinforced the problem and produced a system with erratic quality and unsustainable costs.

All this is now changing. Facing severe pressure to contain costs, payors are aggressively reducing reimbursements and finally moving away from fee-for-service and toward performance-based reimbursement. In the U.S., an increasing percentage of patients are being covered by Medicare and Medicaid, which reimburse at a fraction of private-plan levels. These pressures are leading more independent hospitals to join health systems and more physicians to move out of private practice and become salaried employees of hospitals. (For more, see the sidebar “Why Change Now?”) The transition will be neither linear nor swift, and we are entering a prolonged period during which providers will work under multiple payment models with varying exposure to risk.

Why Change Now?

Most hospitals and physician groups still have positive margins, but the pressure to consider a new strategic …

In this environment, providers need a strategy that transcends traditional cost reduction and responds to new payment models. If providers can improve patient outcomes, they can sustain or grow their market share. If they can improve the efficiency of providing excellent care, they will enter any contracting discussion from a position of strength. Those providers that increase value will be the most competitive. Organizations that fail to improve value, no matter how prestigious and powerful they seem today, are likely to encounter growing pressure. Similarly, health insurers that are slow to embrace and support the value agenda—by failing, for example, to favor high-value providers—will lose subscribers to those that do.

The Strategy for Value Transformation

The strategic agenda for moving to a high-value health care delivery system has six components. They are interdependent and mutually reinforcing; as we will see, progress will be easiest and fastest if they are advanced together. (See the exhibit “The Value Agenda.”)

The Value Agenda

The strategic agenda for moving to a high-value health care delivery system has six components. They are …

The current structure of health care delivery has been sustained for decades because it has rested on its own set of mutually reinforcing elements: organization by specialty with independent private-practice physicians; measurement of “quality” defined as process compliance; cost accounting driven not by costs but by charges; fee-for-service payments by specialty with rampant cross-subsidies; delivery systems with duplicative service lines and little integration; fragmentation of patient populations such that most providers do not have critical masses of patients with a given medical condition; siloed IT systems around medical specialties; and others. This interlocking structure explains why the current system has been so resistant to change, why incremental steps have had little impact (see the sidebar “No Magic Bullets”), and why simultaneous progress on multiple components of the strategic agenda is so beneficial.

No Magic Bullets

The history of health care reform has featured a succession of narrow “solutions,” many imposed on …

The components of the strategic agenda are not theoretical or radical. All are already being implemented to varying degrees in organizations ranging from leading academic medical centers to community safety-net hospitals. No organization, however, has yet put in place the full value agenda across its entire practice. Every organization has room for improvement in value for patients—and always will.

1: Organize into Integrated Practice Units (IPUs)

At the core of the value transformation is changing the way clinicians are organized to deliver care. The first principle in structuring any organization or business is to organize around the customer and the need. In health care, that requires a shift from today’s siloed organization by specialty department and discrete service to organizing around the patient’s medical condition. We call such a structure an integrated practice unit. In an IPU, a dedicated team made up of both clinical and nonclinical personnel provides the full care cycle for the patient’s condition.

IPUs treat not only a disease but also the related conditions, complications, and circumstances that commonly occur along with it—such as kidney and eye disorders for patients with diabetes, or palliative care for those with metastatic cancer. IPUs not only provide treatment but also assume responsibility for engaging patients and their families in care—for instance, by providing education and counseling, encouraging adherence to treatment and prevention protocols, and supporting needed behavioral changes such as smoking cessation or weight loss.

In an IPU, personnel work together regularly as a team toward a common goal: maximizing the patient’s overall outcomes as efficiently as possible. They are expert in the condition, know and trust one another, and coordinate easily to minimize wasted time and resources. They meet frequently, formally and informally, and review data on their own performance. Armed with those data, they work to improve care—by establishing new protocols and devising better or more efficient ways to engage patients, including group visits and virtual interactions. Ideally, IPU members are co-located, to facilitate communication, collaboration, and efficiency for patients, but they work as a team even if they’re based at different locations. (See the sidebar “What Is an Integrated Practice Unit?”)

What Is an Integrated Practice Unit?

1) An IPU is organized around a medical condition or a set of closely related conditions (or around defined patient …

Take, for example, care for patients with low back pain—one of the most common and expensive causes of disability. In the prevailing approach, patients receive portions of their care from a variety of types of clinicians, usually in several different locations, who function more like a spontaneously assembled “pickup team” than an integrated unit. One patient might begin care with a primary care physician, while others might start with an orthopedist, a neurologist, or a rheumatologist. What happens next is unpredictable. Patients might be referred to yet another physician or to a physical therapist. They might undergo radiology testing (this could happen at any point—even before seeing a physician). Each encounter is separate from the others, and no one coordinates the care. Duplication of effort, delays, and inefficiency are almost inevitable. Since no one measures patient outcomes, how long the process takes, or how much the care costs, the value of care never improves.


Contrast that with the approach taken by the IPU at Virginia Mason Medical Center, in Seattle. Patients with low back pain call one central phone number (206-41-SPINE), and most can be seen the same day. The “spine team” pairs a physical therapist with a physician who is board-certified in physical medicine and rehabilitation, and patients usually see both on their first visit. Those with serious causes of back pain (such as a malignancy or an infection) are quickly identified and enter a process designed to address the specific diagnosis. Other patients will require surgery and will enter a process for that. For most patients, however, physical therapy is the most effective next intervention, and their treatment often begins the same day.

Virginia Mason did not address the problem of chaotic care by hiring coordinators to help patients navigate the existing system—a “solution” that does not work. Rather, it eliminated the chaos by creating a new system in which caregivers work together in an integrated way. The impact on value has been striking. Compared with regional averages, patients at Virginia Mason’s Spine Clinic miss fewer days of work (4.3 versus 9 per episode) and need fewer physical therapy visits (4.4 versus 8.8). In addition, the use of MRI scans to evaluate low back pain has decreased by 23% since the clinic’s launch, in 2005, even as outcomes have improved. Better care has actually lowered costs, a point we will return to later. Virginia Mason has also increased revenue through increased productivity, rather than depending on more fee-for-service visits to drive revenue from unneeded or duplicative tests and care. The clinic sees about 2,300 new patients per year compared with 1,404 under the old system, and it does so in the same space and with the same number of staff members.

Wherever IPUs exist, we find similar results—faster treatment, better outcomes, lower costs, and, usually, improving market share in the condition. But those results can be achieved only through a restructuring of work. Simply co-locating staff in the same building, or putting up a sign announcing a Center of Excellence or an Institute, will have little impact.

IPUs emerged initially in the care for particular medical conditions, such as breast cancer and joint replacement. Today, condition-based IPUs are proliferating rapidly across many areas of acute and chronic care, from organ transplantation to shoulder care to mental health conditions such as eating disorders.

Recently, we have applied the IPU model to primary care (see Michael E. Porter, Erika A. Pabo, and Thomas H. Lee, “Redesigning Primary Care,” Health Affairs, March 2013). By its very nature, primary care is holistic, concerned with all the health circumstances and needs of a patient. Today’s primary care practice applies a common organizational structure to the management of a very wide range of patients, from healthy adults to the frail elderly. The complexity of meeting their heterogeneous needs has made value improvement very difficult in primary care—for example, heterogeneous needs make outcomes measurement next to impossible.

In primary care, IPUs are multidisciplinary teams organized to serve groups of patients with similar primary and preventive care needs—for example, patients with complex chronic conditions such as diabetes, or disabled elderly patients. Different patient groups require different teams, different types of services, and even different locations of care. They also require services to address head-on the crucial role of lifestyle change and preventive care in outcomes and costs, and those services must be tailored to patients’ overall circumstances. Within each patient group, the appropriate clinical team, preventive services, and education can be put in place to improve value, and results become measurable.

This approach is already starting to be applied to high-risk, high-cost patients through so-called Patient-Centered Medical Homes. But the opportunity to substantially enhance value in primary care is far broader. At Geisinger Health System, in Pennsylvania, for example, the care for patients with chronic conditions such as diabetes and heart disease involves not only physicians and other clinicians but also pharmacists, who have major responsibility for following and adjusting medications. The inclusion of pharmacists on teams has resulted in fewer strokes, amputations, emergency department visits, and hospitalizations, and in better performance on other outcomes that matter to patients.

2: Measure Outcomes and Costs for Every Patient

Rapid improvement in any field requires measuring results—a familiar principle in management. Teams improve and excel by tracking progress over time and comparing their performance to that of peers inside and outside their organization. Indeed, rigorous measurement of value (outcomes and costs) is perhaps the single most important step in improving health care. Wherever we see systematic measurement of results in health care—no matter what the country—we see those results improve.

Yet the reality is that the great majority of health care providers (and insurers) fail to track either outcomes or costs by medical condition for individual patients. For example, although many institutions have “back pain centers,” few can tell you about their patients’ outcomes (such as their time to return to work) or the actual resources used in treating those patients over the full care cycle. That surprising truth goes a long way toward explaining why decades of health care reform have not changed the trajectory of value in the system.

When outcomes measurement is done, it rarely goes beyond tracking a few areas, such as mortality and safety. Instead, “quality measurement” has gravitated to the most easily measured and least controversial indicators. Most “quality” metrics do not gauge quality; rather, they are process measures that capture compliance with practice guidelines. HEDIS (the Healthcare Effectiveness Data and Information Set) scores consist entirely of process measures as well as easy-to-measure clinical indicators that fall well short of actual outcomes. For diabetes, for example, providers measure the reliability of their LDL cholesterol checks and hemoglobin A1c levels, even though what really matters to patients is whether they are likely to lose their vision, need dialysis, have a heart attack or stroke, or undergo an amputation. Few health care organizations yet measure how their diabetic patients fare on all the outcomes that matter.

It is not surprising that the public remains indifferent to quality measures that may gauge a provider’s reliability and reputation but say little about how its patients actually do. The only true measures of quality are the outcomes that matter to patients. And when those outcomes are collected and reported publicly, providers face tremendous pressure—and strong incentives—to improve and to adopt best practices, with resulting improvements in outcomes. Take, for example, the Fertility Clinic Success Rate and Certification Act of 1992, which mandated that all clinics performing assisted reproductive technology procedures, notably in vitro fertilization, provide their live birth rates and other metrics to the Centers for Disease Control. After the CDC began publicly reporting those data, in 1997, improvements in the field were rapidly adopted, and success rates for all clinics, large and small, have steadily improved. (See the exhibit “Outcomes Measurement and Reporting Drive Improvement.”)

Outcomes Measurement and Reporting Drive Improvement

Since public reporting of clinic performance began, in 1997, in vitro fertilization success rates have climbed steadily across all …

Measuring outcomes that matter to patients.

Outcomes should be measured by medical condition (such as diabetes), not by specialty (podiatry) or intervention (eye examination). Outcomes should cover the full cycle of care for the condition, and track the patient’s health status after care is completed. The outcomes that matter to patients for a particular medical condition fall into three tiers. (For more, see Michael Porter’s article “Measuring Health Outcomes: The Outcome Hierarchy,” New England Journal of Medicine, December 2010.) Tier 1 involves the health status achieved. Patients care about mortality rates, of course, but they’re also concerned about their functional status. In the case of prostate cancer treatment, for example, five-year survival rates are typically 90% or higher, so patients are more interested in their providers’ performance on crucial functional outcomes, such as incontinence and sexual function, where variability among providers is much greater.

Outcomes That Matter to Patients: A Hierarchy

In measuring quality of care, providers tend to focus only on what they directly control or on easily measured clinical …

Tier 2 outcomes relate to the nature of the care cycle and recovery. For example, high readmission rates and frequent emergency-department “bounce backs” may not actually worsen long-term survival, but they are expensive and frustrating for both providers and patients. The level of discomfort during care and how long it takes to return to normal activities also matter greatly to patients. Significant delays before seeing a specialist for a potentially ominous complaint can cause unnecessary anxiety, while delays in commencing treatment prolong the return to normal life. Even when functional outcomes are equivalent, patients whose care process is timely and free of chaos, confusion, and unnecessary setbacks experience much better care than those who encounter delays and problems along the way.

Tier 3 outcomes relate to the sustainability of health. A hip replacement that lasts two years is inferior to one that lasts 15 years, from both the patient’s perspective and the provider’s.

Measuring the full set of outcomes that matter is indispensable to better meeting patients’ needs. It is also one of the most powerful vehicles for lowering health care costs. If Tier 1 functional outcomes improve, costs invariably go down. If any Tier 2 or 3 outcomes improve, costs invariably go down. A 2011 German study, for example, found that one-year follow-up costs after total hip replacement were 15% lower in hospitals with above-average outcomes than in hospitals with below-average outcomes, and 24% lower than in very-low-volume hospitals, where providers have relatively little experience with hip replacements. By failing to consistently measure the outcomes that matter, we lose perhaps our most powerful lever for cost reduction.

Over the past half dozen years, a growing array of providers have begun to embrace true outcome measurement. Many of the leaders have seen their reputations—and market share—improve as a result. A welcome competition is emerging to be the most comprehensive and transparent provider in measuring outcomes.

The Cleveland Clinic is one such pioneer, first publishing its mortality data on cardiac surgery and subsequently mandating outcomes measurement across the entire organization. Today, the Clinic publishes 14 different “outcomes books” reporting performance in managing a growing number of conditions (cancer, neurological conditions, and cardiac diseases, for example). The range of outcomes measured remains limited, but the Clinic is expanding its efforts, and other organizations are following suit. At the individual IPU level, numerous providers are beginning efforts. At Dartmouth-Hitchcock’s Spine Center, for instance, patient scores for pain, physical function, and disability for surgical and nonsurgical treatment at three, six, 12, and 24 months are now published for each type of low back disorder.

Providers are improving their understanding of what outcomes to measure and how to collect, analyze, and report outcomes data. For example, some of our colleagues at Partners HealthCare in Boston are testing innovative technologies such as tablet computers, web portals, and telephonic interactive systems for collecting outcomes data from patients after cardiac surgery or as they live with chronic conditions such as diabetes. Outcomes are also starting to be incorporated in real time into the process of care, allowing providers to track progress as they interact with patients.

To accelerate comprehensive and standardized outcome measurement on a global basis, we recently cofounded the International Consortium for Health Outcomes Measurement. ICHOM develops minimum outcome sets by medical condition, drawing on international registries and provider best practices. It brings together clinical leaders from around the world to develop standard outcome sets, while also gathering and disseminating best practices in outcomes data collection, verification, and reporting. Just as railroads converged on standard track widths and the telecommunications industry on standards to allow data exchange, health care providers globally should consistently measure outcomes by condition to enable universal comparison and stimulate rapid improvement.

Measuring the cost of care.

For a field in which high cost is an overarching problem, the absence of accurate cost information in health care is nothing short of astounding. Few clinicians have any knowledge of what each component of care costs, much less how costs relate to the outcomes achieved. In most health care organizations there is virtually no accurate information on the cost of the full cycle of care for a patient with a particular medical condition. Instead, most hospital cost-accounting systems are department-based, not patient-based, and designed for billing of transactions reimbursed under fee-for-service contracts. In a world where fees just keep going up, that makes sense. Existing systems are also fine for overall department budgeting, but they provide only crude and misleading estimates of actual costs of service for individual patients and conditions. For example, cost allocations are often based on charges, not actual costs. As health care providers come under increasing pressure to lower costs and report outcomes, the existing systems are wholly inadequate.

To determine value, providers must measure costs at the medical condition level, tracking the expenses involved in treating the condition over the full cycle of care. This requires understanding the resources used in a patient’s care, including personnel, equipment, and facilities; the capacity cost of supplying each resource; and the support costs associated with care, such as IT and administration. Then the cost of caring for a condition can be compared with the outcomes achieved.

The best method for understanding these costs is time-driven activity-based costing (TDABC). While rarely used in health care to date, it is beginning to spread. Where TDABC is being applied, it is helping providers find numerous ways to substantially reduce costs without negatively affecting outcomes (and sometimes even improving them). Providers are achieving savings of 25% or more by tapping opportunities such as better capacity utilization, more-standardized processes, better matching of personnel skills to tasks, locating care in the most cost-effective type of facility, and many others.

For example, Virginia Mason found that it costs $4 per minute for an orthopedic surgeon or other procedural specialist to perform a service, $2 for a general internist, and $1 or less for a nurse practitioner or physical therapist. In light of those cost differences, focusing the time of the most expensive staff members on work that utilizes their full skill set is hugely important. (For more, see Robert Kaplan and Michael Porter’s article “How to Solve the Cost Crisis in Health Care,” HBR September 2011.)
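
The arithmetic behind TDABC is straightforward to sketch: the cost of a care cycle is the sum, over its process steps, of each resource’s capacity cost rate multiplied by the minutes of that resource’s time the step consumes. In the minimal sketch below, the rates loosely echo the Virginia Mason per-minute figures; the steps and times are hypothetical, for illustration only.

```python
# Minimal TDABC sketch. Rates loosely echo the per-minute figures above;
# every other number here is a hypothetical illustration.
CAPACITY_COST_PER_MINUTE = {
    "general_internist": 2.00,
    "orthopedic_surgeon": 4.00,
    "nurse_practitioner": 1.00,
}

# Each step in the care cycle: (resource used, minutes of its time consumed)
care_cycle = [
    ("general_internist", 20),   # pre-op evaluation
    ("orthopedic_surgeon", 90),  # procedure
    ("nurse_practitioner", 45),  # post-op follow-up
]

total = sum(CAPACITY_COST_PER_MINUTE[resource] * minutes
            for resource, minutes in care_cycle)
print(f"TDABC cost for this care cycle: ${total:,.2f}")  # $445.00
```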

Without understanding the true costs of care for patient conditions, much less how costs are related to outcomes, health care organizations are flying blind in deciding how to improve processes and redesign care. Clinicians and administrators battle over arbitrary cuts, rather than working together to improve the value of care. Because proper cost data are so critical to overcoming the many barriers associated with legacy processes and systems, we often tell skeptical clinical leaders: “Cost accounting is your friend.” Understanding true costs will finally allow clinicians to work with administrators to improve the value of care—the fundamental goal of health care organizations.

3: Move to Bundled Payments for Care Cycles

Neither of the dominant payment models in health care—global capitation and fee-for-service—directly rewards improving the value of care. Global capitation, a single payment to cover all of a patient’s needs, rewards providers for spending less but not specifically for improving outcomes or value. It also decouples payment from what providers can directly control. Fee-for-service couples payment to something providers can control—how many of their services, such as MRI scans, they provide—but not to the overall cost or the outcomes. Providers are rewarded for increasing volume, but that does not necessarily increase value.

The payment approach best aligned with value is a bundled payment that covers the full care cycle for acute medical conditions, the overall care for chronic conditions for a defined period (usually a year), or primary and preventive care for a defined patient population (healthy children, for instance). Well-designed bundled payments directly encourage teamwork and high-value care. Payment is tied to overall care for a patient with a particular medical condition, aligning payment with what the team can control. Providers benefit from improving efficiency while maintaining or improving outcomes.

Sound bundled payment models should include: severity adjustments or eligibility only for qualifying patients; care guarantees that hold the provider responsible for avoidable complications, such as infections after surgery; stop-loss provisions that mitigate the risk of unusually high-cost events; and mandatory outcomes reporting.
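
To see how these provisions interact, consider a toy adjudication function. This is a sketch under stated assumptions, not a description of any actual contract: the base price, severity multiplier, stop-loss threshold, and the choice to net out avoidable-complication costs before applying the stop-loss are all hypothetical.

```python
# Hedged sketch of bundled-payment adjudication with the four provisions
# named above. All figures, field names, and formulas are hypothetical.
from dataclasses import dataclass

@dataclass
class Episode:
    severity_multiplier: float          # severity adjustment (1.0 = typical)
    actual_cost: float                  # provider cost over the full cycle
    avoidable_complication_cost: float  # borne by provider (care guarantee)
    outcomes_reported: bool             # mandatory outcomes reporting

BASE_BUNDLE = 20_000.0          # hypothetical negotiated episode price
STOP_LOSS_THRESHOLD = 60_000.0  # payer shares costs above this level

def adjudicate(ep: Episode) -> float:
    if not ep.outcomes_reported:
        raise ValueError("outcomes reporting is a condition of payment")
    payment = BASE_BUNDLE * ep.severity_multiplier
    # Care guarantee: avoidable complications never increase the payment.
    # Stop-loss: unusually high residual costs are shared beyond a threshold.
    residual = ep.actual_cost - ep.avoidable_complication_cost
    payment += max(0.0, residual - STOP_LOSS_THRESHOLD)
    return payment

ep = Episode(1.2, 70_000.0, 5_000.0, True)
print(adjudicate(ep))  # 20,000 * 1.2 + (65,000 - 60,000) = 29000.0
```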

Governments, insurers, and health systems in multiple countries are moving to adopt bundled payment approaches. For example, the Stockholm County Council initiated such a program in 2009 for all total hip and knee replacements for relatively healthy patients. The result was lower costs, higher patient satisfaction, and improvement in some outcomes. In Germany, bundled payments for hospital inpatient care—combining all physician fees and other costs, unlike payment models in the U.S.—have helped keep the average payment for a hospitalization below $5,000 (compared with more than $19,000 in the U.S., even though hospital stays are, on average, 50% longer in Germany). Among the features of the German system are care guarantees under which the hospital bears responsibility for the cost of rehospitalization related to the original care.

In the U.S., bundled payments have become the norm for organ transplant care. Here, mandatory outcomes reporting has combined with bundles to reinforce team care, speed diffusion of innovation, and rapidly improve outcomes. Providers that adopted bundled approaches early benefited. UCLA’s kidney transplant program, for example, has grown dramatically since pioneering a bundled price arrangement with Kaiser Permanente, in 1986, and offering the payment approach to all its payors shortly thereafter. Its outcomes are among the best nationally, and UCLA’s market share in organ transplantation has expanded substantially.

Employers are also embracing bundled payments. This year, Walmart introduced a program in which it encourages employees who need cardiac, spine, and selected other surgery to obtain care at one of just six providers nationally, all of which have high volume and track records of excellent outcomes: the Cleveland Clinic, Geisinger, the Mayo Clinic, Mercy Hospital (in Springfield, Missouri), Scott & White, and Virginia Mason. The hospitals are reimbursed for the care with a single bundled payment that includes all physician and hospital costs associated with both inpatient and outpatient pre- and post-operative care. Employees bear no out-of-pocket costs for their care—travel, lodging, and meals for the patient and a caregiver are provided—as long as the surgery is performed at one of the centers of excellence. The program is in its infancy, but expectations are that Walmart and other large employers will expand such programs to improve value for their employees, and will step up the incentives for employees to use them. Sophisticated employers have learned that they must move beyond cost containment and health promotion measures, such as co-pays and on-site health and wellness facilities, and become a greater force in rewarding high-value providers with more patients.

As bundled payment models proliferate, the way in which care is delivered will be transformed. Consider how providers participating in Walmart’s program are changing the way they provide care. As clinical leaders map the processes involved in caring for patients who live outside their immediate area, they are learning how to better coordinate care with all of patients’ local physicians. They’re also questioning existing practices. For example, many hospitals routinely have patients return to see the cardiac surgeon six to eight weeks after surgery, but out-of-town visits seem difficult to justify for patients with no obvious complications. In deciding to drop those visits, clinicians realized that maybe local patients do not need routine postoperative visits either.

Providers remain nervous about bundled payments, citing concerns that patient heterogeneity might not be fully reflected in reimbursements, and that the lack of accurate cost data at the condition level could create financial exposure. Those concerns are legitimate, but they are present in any reimbursement model. We believe that concerns will fall away over time, as sophistication grows and the evidence mounts that embracing payments aligned with delivering value is in providers’ economic interest. Providers will adopt bundles as a tool to grow volume and improve value.

4: Integrate Care Delivery Systems

A large and growing proportion of health care is provided by multisite health care delivery organizations. In 2011, 60% of all U.S. hospitals were part of such systems, up from 51% in 1999. Multisite health organizations accounted for 69% of total admissions in 2011. Those proportions are even higher today. Unfortunately, most multisite organizations are not true delivery systems, at least thus far, but loose confederations of largely stand-alone units that often duplicate services. There are huge opportunities for improving value as providers integrate systems to eliminate the fragmentation and duplication of care and to optimize the types of care delivered in each location.

To achieve true system integration, organizations must grapple with four related sets of choices: defining the scope of services, concentrating volume in fewer locations, choosing the right location for each service line, and integrating care for patients across locations. The politics of redistributing care remain daunting, given most providers’ instinct to preserve the status quo and protect their turf. Some acid-test questions to gauge board members’ and health system leaders’ appetite for transformation include: Are you ready to give up service lines to improve the value of care for patients? Is relocating service lines on the table?

Define the scope of services.

A starting point for system integration is determining the overall scope of services a provider can effectively deliver—and reducing or eliminating service lines where they cannot realistically achieve high value. For community providers, this may mean exiting or establishing partnerships in complex service lines, such as cardiac surgery or care for rare cancers. For academic medical centers, which have more heavily resourced facilities and staff, this may mean minimizing routine service lines and creating partnerships or affiliations with lower-cost community providers in those fields. Although limiting the range of service lines offered has traditionally been an unnatural act in health care—where organizations strive to do everything for everyone—the move to a value-based delivery system will require those kinds of choices.

Concentrate volume in fewer locations.

Second, providers should concentrate the care for each of the conditions they do treat in fewer locations. The stated promise of consumer-oriented health care—“We do everything you need close to your home or workplace”—has been a good marketing pitch but a poor strategy for creating value. Concentrating volume is essential if integrated practice units are to form and measurement is to improve.

Numerous studies confirm that volume in a particular medical condition matters for value. Providers with significant experience in treating a given condition have better outcomes, and costs improve as well. A recent study of the relationship between hospital volume and operative mortality for high-risk types of cancer surgery, for example, found that as hospital volumes rose, the chances of a patient’s dying as a result of the surgery fell by as much as 67%. Patients, then, are often much better off traveling longer distances to obtain care at locations where there are teams with deep experience in their condition. That often means driving past the closest hospitals.

Concentrating volume is among the most difficult steps for many organizations, because it can threaten both prestige and physician turf. Yet the benefits of concentration can be game-changing. In 2009, the city of London set out to improve survival and prospects for stroke patients by ensuring that patients were cared for by true IPUs—dedicated, state-of-the-art teams and facilities including neurologists who were expert in the care of stroke. These were called hyper-acute stroke units, or HASUs. At the time, there were too many hospitals providing acute stroke care in London (32 of them) to allow any to amass a high volume. UCL Partners, a delivery system comprising six well-known teaching hospitals that serve North Central London, had two hospitals providing stroke care—University College London Hospital and the Royal Free Hospital—located less than three miles apart. University College was selected to house the new stroke unit. Neurologists at Royal Free began practicing at University College, and a Royal Free neurologist was appointed as the overall leader of the stroke program. UCL Partners later moved all emergency vascular surgery and complex aortic surgery to Royal Free.

These steps sent a strong message that UCL Partners was ready to concentrate volume to improve value. The number of stroke cases treated at University College climbed from about 200 in 2008 to more than 1,400 in 2011. All stroke patients can now undergo rapid evaluation by highly experienced neurologists and begin their recovery under the care of nurses who are expert in preventing stroke-related complications. Since the shift, mortality associated with strokes at University College has fallen by about 25% and costs per patient have dropped by 6%.

Choose the right location for each service.

The third component of system integration is delivering particular services at the locations at which value is highest. Less complex conditions and routine services should be moved out of teaching hospitals into lower-cost facilities, with charges set accordingly. There are huge value improvement opportunities in matching the complexity and skills needed with the resource intensity of the location, which will not only optimize cost but also increase staff utilization and productivity. Children’s Hospital of Philadelphia, for instance, decided to stop performing routine tympanostomies (placing tubes into children’s eardrums to reduce fluid collection and risk of infection) at its main facility and shifted those services to suburban ambulatory surgery facilities. More recently, the hospital applied the same approach to simple hypospadias repairs, a urological procedure. Relocating such services cut costs and freed up operating rooms and staff at the teaching hospital for more-complex procedures. Management estimated the total cost reduction resulting from the shift at 30% to 40%.

In many cases, current reimbursement schemes still reward providers for performing services in a hospital setting, offering even higher payments if the hospital is an academic medical center—another example of how existing reimbursement models have worked against value. But the days of charging higher fees for routine services in high-cost settings are quickly coming to an end. (See again the sidebar “Why Change Now?”)

Integrate care across locations.

The final component of health system integration is to integrate care for individual patients across locations. As providers distribute services in the care cycle across locations, they must learn to tie together the patient’s care across these sites. Care should be directed by IPUs, but recurring services need not take place in a single location. For example, patients with low back pain may receive an initial evaluation, and surgery if needed, from a centrally located spine IPU team but may continue physical therapy closer to home. Wherever the services are performed, however, the IPU manages the full care cycle. Integrating mechanisms, such as assigning a single physician team captain for each patient and adopting common scheduling and other protocols, help ensure that well-coordinated, multidisciplinary care is delivered in a cost-effective and convenient way.

5: Expand Geographic Reach

Health care delivery remains heavily local, and even academic medical centers primarily serve their immediate geographic areas. If value is to be substantially increased on a large scale, however, superior providers for particular medical conditions need to serve far more patients and extend their reach through the strategic expansion of excellent IPUs. Buying full-service hospitals or practices in new geographic areas is rarely the answer. Geographic expansion should focus on improving value, not just increasing volume.

Targeted geographic expansion by leading providers is rapidly increasing, with dozens of organizations such as Vanderbilt, Texas Children’s, Children’s Hospital of Philadelphia, MD Anderson Cancer Center, and many others taking bold steps to serve patients over a wide geographic area.

Geographic expansion takes two principal forms. The first is a hub-and-spoke model. For each IPU, satellite facilities are established and staffed at least partly by clinicians and other personnel employed by the parent organization. In the most effective models, some clinicians rotate among locations, which helps staff members across all facilities feel they are part of the team. As expansion moves to an entirely new region, a new IPU hub is built or acquired.

Patients often get their initial evaluation and development of a treatment plan at the hub, but some or much care takes place at more-convenient (and cost-effective) locations. Satellites deliver less complicated care, with complex cases referred to the hub. If complications occur whose effective management is beyond the ability of the satellite facility, the patient’s care is transferred to the hub. The net result is a substantial increase in the number of patients an excellent IPU can serve.

This model is becoming more common among leading cancer centers. MD Anderson, for example, has four satellite sites in the greater Houston region where patients receive chemotherapy, radiation therapy, and, more recently, low-complexity surgery, under the supervision of a hub IPU. The cost of care at the regional facilities is estimated to be about one-third less than comparable care at the main facility. By 2012, 22% of radiation treatment and 15% of all chemotherapy treatment were performed at regional sites, along with about 5% of surgery. Senior management estimates that 50% of comparable care currently still performed at the hub could move to satellite sites—a significant untapped value opportunity.

The second emerging geographic expansion model is clinical affiliation, in which an IPU partners with community providers or other local organizations, using their facilities rather than adding capacity. The IPU provides management oversight for clinical care, and some clinical staff members working at the affiliate may be employed by the parent IPU. MD Anderson uses this approach in its partnership with Banner Phoenix. Hybrid models include the approach taken by MD Anderson in its regional satellite program, which leases outpatient facilities located on community hospital campuses and utilizes those hospitals’ operating rooms and other inpatient and ancillary services as needed.

Local affiliates benefit from the expertise, experience, and reputation of the parent IPU—benefits that often improve their market share locally. The IPU broadens its regional reach and brand, and benefits from management fees, shared revenue or joint venture income, and referrals of complex cases.

The Cleveland Clinic’s Heart and Vascular Institute, a pioneering IPU in cardiac and vascular care, has 19 hospital affiliates spanning the Eastern seaboard. Successful clinical affiliations such as these are robust—not simply storefronts with new signage and marketing campaigns—and involve close oversight by physician and nurse leaders from the parent organization as well as strict adherence to its practice models and measurement systems. Over time, outcomes for standard cases at the Clinic’s affiliates have risen to approach its own outcomes.

Vanderbilt’s rapidly expanding affiliate network illustrates the numerous opportunities that arise from affiliations that recognize each partner’s areas of strength. For example, Vanderbilt has encouraged affiliates to grow noncomplex obstetrics services that once might have taken place at the academic medical center, while affiliates have joint ventured with Vanderbilt in providing care for some complex conditions in their territories.

6: Build an Enabling Information Technology Platform

The preceding five components of the value agenda are powerfully enabled by a sixth: a supporting information technology platform. Historically, health care IT systems have been siloed by department, location, type of service, and type of data (for instance, images). Often IT systems complicate rather than support integrated, multidisciplinary care. That’s because IT is just a tool; automating broken service-delivery processes only gets you more-efficient broken processes. But the right kind of IT system can help the parts of an IPU work with one another, enable measurement and new reimbursement approaches, and tie the parts of a well-structured delivery system together.

A value-enhancing IT platform has six essential elements:

It is centered on patients.

The system follows patients across services, sites, and time for the full cycle of care, including hospitalization, outpatient visits, testing, physical therapy, and other interventions. Data are aggregated around patients, not departments, units, or locations.

It uses common data definitions.

Terminology and data fields related to diagnoses, lab values, treatments, and other aspects of care are standardized so that everyone is speaking the same language, enabling data to be understood, exchanged, and queried across the whole system.

It encompasses all types of patient data.

Physician notes, images, chemotherapy orders, lab tests, and other data are stored in a single place so that everyone participating in a patient’s care has a comprehensive view.

The medical record is accessible to all parties involved in care.

That includes referring physicians and patients themselves. A simple “stress test” question to gauge the accessibility of the data in an IT system is: Can visiting nurses see physicians’ notes, and vice versa? The answer today at almost all delivery systems is “no.” As different types of clinicians become true team members—working together in IPUs, for example—sharing information needs to become routine. The right kind of medical record also should mean that patients have to provide only one set of patient information, and that they have a centralized way to schedule appointments, refill prescriptions, and communicate with clinicians. And it should make it easy to survey patients about certain types of information relevant to their care, such as their functional status and their pain levels.

The system includes templates and expert systems for each medical condition.

Templates make it easier and more efficient for the IPU teams to enter and find data, execute procedures, use standard order sets, and measure outcomes and costs. Expert systems help clinicians identify needed steps (for example, follow-up for an abnormal test) and possible risks (drug interactions that may be overlooked if data are simply recorded in free text, for example).

The system architecture makes it easy to extract information.

In value-enhancing systems, the data needed to measure outcomes, track patient-centered costs, and control for patient risk factors can be readily extracted using natural language processing. Such systems also give patients the ability to report outcomes on their care, not only after their care is completed but also during care, to enable better clinical decisions. Even in today’s most advanced systems, the critical capability to create and extract such data remains poorly developed. As a result, the cost of measuring outcomes and costs is unnecessarily increased.
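
A minimal sketch can make several of these elements concrete: data aggregated around the patient rather than the department, standardized definitions, all event types in one structure, and easy extraction over a full care cycle. Every type and field name below is hypothetical, chosen only to illustrate the shape of such a record.

```python
# Hedged sketch of a patient-centered record. All names are hypothetical.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CareEvent:
    when: date
    site: str        # any location in the delivery system
    kind: str        # standardized vocabulary: "note", "lab", "image", ...
    condition: str   # standardized condition code, e.g., "low_back_pain"
    payload: dict    # the note text, lab value, image reference, etc.

@dataclass
class PatientRecord:
    patient_id: str
    events: list[CareEvent] = field(default_factory=list)

    def full_cycle(self, condition: str) -> list[CareEvent]:
        """Every event for one condition, across sites and time."""
        return sorted((e for e in self.events if e.condition == condition),
                      key=lambda e: e.when)

record = PatientRecord("p-001")
record.events.append(CareEvent(date(2013, 3, 1), "spine_center_hub", "note",
                               "low_back_pain", {"text": "initial evaluation"}))
record.events.append(CareEvent(date(2013, 4, 2), "community_pt_clinic", "note",
                               "low_back_pain", {"text": "physical therapy"}))
print(len(record.full_cycle("low_back_pain")))  # 2
```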

The Cleveland Clinic is a provider that has made its electronic record an important enabler of its strategy to put “Patients First” by pursuing virtually all these aims. It is now moving toward giving patients full access to clinician notes—another way to improve care for patients.

Getting Started

The six components of the value agenda are distinct but mutually reinforcing. Organizing into IPUs makes proper measurement of outcomes and costs easier. Better measurement of outcomes and costs makes bundled payments easier to set and agree upon. A common IT platform enables effective collaboration and coordination within IPU teams, while also making the extraction, comparison, and reporting of outcomes and cost data easier. With bundled prices in place, IPUs have stronger incentives to work as teams and to improve the value of care. And so on.

Implementing the value agenda is not a one-shot effort; it is an open-ended commitment. It is a journey that providers embark on, starting with the adoption of the goal of value, a culture of patients first, and the expectation of constant, measurable improvement. The journey requires strong leadership as well as a commitment to roll out all six value agenda components. For most providers, creating IPUs and measuring outcomes and costs should take the lead.

As should by now be clear, organizations that progress rapidly in adopting the value agenda will reap huge benefits, even if regulatory change is slow. As IPUs’ outcomes improve, so will their reputations and, therefore, their patient volumes. With the tools to manage and reduce costs, providers will be able to maintain economic viability even as reimbursements plateau and eventually decline. Providers that concentrate volume will drive a virtuous cycle, in which teams with more experience and better data improve value more rapidly—attracting still more volume. Superior IPUs will be sought out as partners of choice, enabling them to expand across their local regions and beyond.

Maintaining market share will be difficult for providers with nonemployed physicians if their inability to work together impedes progress in improving value. Hospitals with private-practice physicians will have to learn to function as a team to remain viable. Measuring outcomes is likely to be the first step in focusing everyone’s attention on what matters most.

All stakeholders in health care have essential roles to play. (See the sidebar “Next Steps: Other Stakeholder Roles.”) Yet providers must take center stage. Their boards and senior leadership teams must have the vision and the courage to commit to the value agenda, and the discipline to progress through the inevitable resistance and disruptions that will result. Clinicians must prioritize patients’ needs and patient value over the desire to maintain their traditional autonomy and practice patterns.

Next Steps: Other Stakeholder Roles

The transformation to a high-value health care delivery system must come from within, with physicians and provider ...

Providers that cling to today’s broken system will become dinosaurs. Reputations that are based on perception, not actual outcomes, will fade. Maintaining current cost structures and prices in the face of greater transparency and falling reimbursement levels will be untenable. Those organizations—large and small, community and academic—that can master the value agenda will be rewarded with financial viability and the only kind of reputation that should matter in health care—excellence in outcomes and pride in the value they deliver.

Courtesy HBR.org