AI for Product Attributes

Leveraging AI at ItemMaster

By Srinivasan M

ItemMaster, a cloud-based enhanced product-attribute platform, helps consumer packaged goods (CPG) companies bring their products to market for today’s digital shoppers. Content is the single biggest factor driving positive customer experiences and sales, and we deliver great content.

Our combination of domain expertise and artificial intelligence (AI) helps consumers find an item rather than search for it, while driving a 10%+ lift in the sales of products with complete ItemMaster product content.

The Shift From Search to Find

The Food & Beverage category is flooded with choices: a lot of variety is necessary for all consumers to find acceptable matches. But the key question is, amid all these choices, how does the shopper find the products that meet his or her specific lifestyle or dietary requirements or desires? In today’s omni-channel digital setting, products that do not mesh with consumer preferences simply will not sell.

Content Enrichment

Rich product attributes bridge supply on one side and demand on the other. Our coverage also extends to the last-mile gap: the core attributes created for a product provide the foundation for good, clean content, but CPGs must close the gap in reaching the end consumer (which manifests as consumer experience) with enriched content. Enriching content with lifestyle attributes, events, and personas drives a fulfilling consumer experience.
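As a rough illustration of that layering (the field names here are hypothetical, not ItemMaster’s actual schema), a product record can be thought of as core attributes plus an enrichment layer of lifestyle attributes, events, and personas:

```python
from dataclasses import dataclass, field


@dataclass
class ProductRecord:
    # Core attributes: the clean, foundational content for the item.
    gtin: str
    brand: str
    name: str
    size: str
    ingredients: list[str]

    # Enrichment layer: lifestyle attributes, events, and personas
    # that help close the last-mile gap with the consumer.
    lifestyle_attributes: list[str] = field(default_factory=list)  # e.g. "zero sugar"
    events: list[str] = field(default_factory=list)                # e.g. "summer picnic"
    personas: list[str] = field(default_factory=list)              # e.g. "health-conscious shopper"


record = ProductRecord(
    gtin="00012345678905",
    brand="Acme",
    name="Sparkling Lemon Water",
    size="12 fl oz",
    ingredients=["carbonated water", "natural lemon flavor"],
    lifestyle_attributes=["zero sugar", "vegan"],
    events=["summer picnic"],
    personas=["health-conscious shopper"],
)
```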

We are also witnessing the convergence of consumer needs with technological solutions. In particular, advances in AI are transforming shopping. We can leverage basic machine learning to perform Bayesian classification, deploy advanced natural language processing (NLP) to compare text semantics, and apply deep learning to analyze images or mine information from them – resulting in deeper insights and greater transparency (a sketch of the Bayesian piece follows the definitions below).

  • Machine learning is a field of computer science that uses statistical techniques to give computer systems the ability to “learn” (e.g., progressively improve performance on a specific task) with data, without being explicitly programmed. (Wikipedia)
  • Natural language processing (NLP) is an area of computer science and artificial intelligence concerned with the interactions between computers and human (natural) languages, in particular how to program computers to process and analyze large amounts of natural language data. (Wikipedia)
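As a minimal sketch of the “basic machine learning” piece, a Bayesian text classifier (here scikit-learn’s Multinomial Naive Bayes, trained on a handful of made-up examples) can map short product descriptions to a category attribute:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training data: product descriptions labeled with a category attribute.
descriptions = [
    "organic whole wheat sandwich bread",
    "gluten free almond flour crackers",
    "sparkling water with natural lime flavor",
    "cold brew coffee concentrate unsweetened",
    "gentle foaming facial cleanser fragrance free",
    "hydrating shea butter hand cream",
]
categories = ["bakery", "snacks", "beverage", "beverage", "beauty", "beauty"]

# Bayesian classification: TF-IDF features feeding a Multinomial Naive Bayes model.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(descriptions, categories)

print(model.predict(["unsweetened sparkling lime water"]))       # expected: ['beverage'] on this toy data
print(model.predict(["fragrance free hydrating hand cream"]))    # expected: ['beauty'] on this toy data
```

In production, the same pattern scales up with far larger training sets and richer features, but the core idea – probabilistic classification of product text into attributes – is the same.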

AI is also essential for cutting-edge speed to market. Retailers who introduce products to market at twice the average speed sell 63% more product over the first two years than retailers with below-average speed to market. Quickly capturing market share, and maintaining it, is key to both retailers and CPG companies. Fast launches translate to higher sales, both initial and long-term, which is important considering that many product launches fail.

Launches fail when companies overestimate the quality of their data and underestimate the impact that errors and inconsistencies can have on their bottom line. According to W. Edwards Deming’s 1-10-100 rule, if the cost of verifying bad-quality data is X, then the cost of fixing it is 10X and the cost of not fixing it will amount to 100X. Yet many organizations do not even know whether their data is good or bad, what is missing, or what is needed to remediate it. Compounding this further, retailers each have their own schemas by which they want to receive product data from CPGs.

https://totalqualitymanagement.wordpress.com/2009/02/25/what-is-1-10-100-rule/
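A quick back-of-the-envelope reading of the rule, with purely illustrative unit costs and defect counts, shows how the economics scale:

```python
# Back-of-the-envelope illustration of the 1-10-100 rule.
# The $1 / $10 / $100 unit costs and the defect count are hypothetical.
VERIFY_COST = 1     # X: catch a bad attribute value at verification time
FIX_COST = 10       # 10X: correct it after it has flowed downstream
FAILURE_COST = 100  # 100X: never correct it (lost sales, returns, delistings)

defects = 500  # suppose a catalog ships with 500 bad attribute values

print("Caught at verification:", defects * VERIFY_COST)    # 500
print("Fixed after the fact:  ", defects * FIX_COST)       # 5,000
print("Never fixed:           ", defects * FAILURE_COST)   # 50,000
```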

We help CPG companies on this front by leveraging AI to automate otherwise manual tasks and significantly shorten an item’s time to market. We do this by harmonizing different sources of information about the item, often from across the client’s organization, into a single source of truth. In the online world, everything about the product is data, so unsurprisingly, we start the process with an objective assessment of the quality of the data available for a product. A set of algorithms assesses conformance with the data requirements of our clients’ retailers. Optionally, other algorithm sets assess the data requirements of strategic and tactical initiatives, such as health & wellbeing and planogram compliance. This gives our clients an upfront advantage: a comprehensive view of their data readiness and of where specifically the data needs improvement, all before launch and for different use cases.
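To make the idea concrete, here is a highly simplified sketch of such a readiness check; the retailer schema and validation rules are hypothetical stand-ins, not any retailer’s actual requirements:

```python
# Hypothetical retailer schema: required attributes and a simple validation rule each.
RETAILER_SCHEMA = {
    "gtin":        lambda v: isinstance(v, str) and len(v) == 14,
    "brand":       lambda v: bool(v),
    "description": lambda v: isinstance(v, str) and len(v) >= 50,
    "image_url":   lambda v: isinstance(v, str) and v.startswith("https://"),
    "net_content": lambda v: bool(v),
    "ingredients": lambda v: isinstance(v, list) and len(v) > 0,
}


def assess_readiness(product: dict) -> dict:
    """Report which required attributes are missing or non-conformant, plus a score."""
    gaps = []
    for attr, is_valid in RETAILER_SCHEMA.items():
        value = product.get(attr)
        if value is None:
            gaps.append((attr, "missing"))
        elif not is_valid(value):
            gaps.append((attr, "non-conformant"))
    score = 1 - len(gaps) / len(RETAILER_SCHEMA)
    return {"readiness": round(score, 2), "gaps": gaps}


product = {
    "gtin": "00012345678905",
    "brand": "Acme",
    "description": "Sparkling lemon water.",        # too short for this retailer
    "image_url": "http://cdn.example.com/img.jpg",  # not https
    "ingredients": ["carbonated water", "natural lemon flavor"],
}
print(assess_readiness(product))
# {'readiness': 0.5, 'gaps': [('description', 'non-conformant'),
#                             ('image_url', 'non-conformant'),
#                             ('net_content', 'missing')]}
```

In practice, a check like this would be run per retailer schema and per initiative, producing the data-readiness view described above before the launch.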

This in turn helps orchestrate a controlled workflow to identify sources of enhancement and enrichment, including descriptive text, images, and other rich content that fills out the product data. Understanding each retailer’s specific requirements and applying machine learning allows us to deliver consistent, high-quality enhanced content that retailers use to drive a high level of customer experience.

Making Better Consumer Experiences

To take this further, we enrich content in different ways depending on the category of the item. For example, food & beverage product information is enhanced by bringing in nutritional science and food regulations, employing a variety of AI techniques to populate the right attributes for various lifestyle stages, popular diets, and FDA-compliant claims. These are then aggregated to help shoppers find the exact product they desire, completing the sale.
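For illustration only, a toy version of ingredient-driven diet tagging might look like the following; the exclusion lists are simplified placeholders, not FDA definitions or our production rules:

```python
# Simplified, hypothetical ingredient screens for a few popular diet attributes.
# Real rules would come from nutritional science and FDA labeling regulations.
DIET_EXCLUSIONS = {
    "vegan":       {"milk", "egg", "honey", "gelatin", "whey"},
    "vegetarian":  {"gelatin", "chicken broth", "beef extract"},
    "gluten_free": {"wheat flour", "barley malt", "rye"},
}


def tag_diet_attributes(ingredients: list[str]) -> list[str]:
    """Tag diets whose excluded ingredients do not appear in the ingredient list."""
    found = {i.lower() for i in ingredients}
    return [diet for diet, excluded in DIET_EXCLUSIONS.items()
            if not (found & excluded)]


print(tag_diet_attributes(["oat flour", "honey", "sea salt"]))
# ['vegetarian', 'gluten_free'] -- honey rules out 'vegan' under this toy rule set
```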

Likewise, for beauty products such as soaps and cosmetics, our algorithms identify the right allergen attributes and configure the right kinds of filters for the consumer. The end result is a better consumer experience, and better sales.
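In the same spirit, here is a hedged sketch of allergen tagging feeding a consumer-facing “free-from” filter for beauty items; the allergen terms are illustrative, not a regulatory list:

```python
# Illustrative allergen map for beauty and personal-care items; not a regulatory list.
ALLERGEN_TERMS = {
    "fragrance": {"parfum", "fragrance"},
    "nuts":      {"almond oil", "shea butter"},  # shea is nut-derived; shown for illustration
    "lanolin":   {"lanolin"},
}

catalog = [
    {"name": "Gentle Cleanser", "ingredients": ["water", "glycerin", "parfum"]},
    {"name": "Hand Cream",      "ingredients": ["water", "shea butter", "lanolin"]},
    {"name": "Micellar Water",  "ingredients": ["water", "glycerin"]},
]


def allergen_attributes(ingredients: list[str]) -> set[str]:
    """Return the allergen attribute tags detected in an ingredient list."""
    found = {i.lower() for i in ingredients}
    return {allergen for allergen, terms in ALLERGEN_TERMS.items() if found & terms}


def filter_free_of(products: list[dict], allergen: str) -> list[str]:
    """Consumer-facing filter: products whose ingredients carry no trace of the allergen tag."""
    return [p["name"] for p in products
            if allergen not in allergen_attributes(p["ingredients"])]


print(filter_free_of(catalog, "fragrance"))  # ['Hand Cream', 'Micellar Water']
print(filter_free_of(catalog, "nuts"))       # ['Gentle Cleanser', 'Micellar Water']
```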