Bringing the promise of ML to your MDM: Part III
Thomas Varghese, Analytics/Product Consultant

In the last article, we studied the broad feature set of AI Data Cleanser and saw how it addresses key challenges to help enterprises manage their data better.

If you’ve been following along so far, congrats!

This could be an interesting read for you – as we look at some of the use cases our customers have challenged us with.

We will cover three implementations addressing very different industries and customer needs – the situation that motivated the problem, the gap that prevented a straightforward resolution, the AIDC implementation, and the end result/impact.

As mentioned earlier, we generally deploy a subset of the different components of the product, as per the requirement.

With that said, let’s dive in!

Use case 1: Helping a large industrial firm validate and cleanse 500,000 customer addresses across 25 countries, thereby improving last-mile delivery to customers.

Situation:
  • The client leveraged an annually licensed product to validate its customer and vendor addresses.
  • The product was unable to validate addresses in key growing markets, which led to operational impact for sales teams.

Gap:
  1. The product functioned in a black-box manner, without a clear definition of validation sources or criteria for the validity of a record.
  2. No conclusion was drawn on the business name located at an address.
  3. User feedback on address validity/categories could not be provided.

AIDC Implementation:
  1. We built a custom framework to help the customer leverage industry-standard address verification sources by country.
  2. Incorporation of confidence metrics and validation categories helped end users understand the output better.
  3. Coupled with the customer mastering solution, a validated set of records was created.

Use case 2: Deploying a contact cleansing and enrichment solution on Salesforce for a technology firm, thereby helping them map validated contacts to internal sales teams.

Situation:
  • The client procured leads for marketing initiatives from multiple internal and syndicated sources.
  • The challenge lay in tying these siloed sources together, then validating, cleansing and mapping leads to teams in order to plan, measure and track campaigns.

Gap:
  1. Different sources presented different data formats, naming conventions and standards.
  2. Due to manual inputs in the process, there were multiple duplicate and redundant records.
  3. A large number of leads were unverified and had insufficient attributes to execute a personalised campaign.

AIDC Implementation:
  1. We built a data discovery layer to ingest, unify and standardize data from over 12 sources.
  2. AIDC's contact cleansing module identified and mastered a large number of duplicates, errors and exceptions in the dataset (see the sketch below).
  3. The cleansed dataset was then mapped against D&B to verify and enrich contact attributes.

Impact:
  1. The client was now able to leverage the solution to access a centralised, mastered list of contacts.
  2. Campaign tracking and reporting was made available to business leadership.
  3. Subsequent pilot campaigns saw an average lift of 5% in click-through rates, measured through multiple A/B tests.
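To make the contact cleansing step a little more concrete, here is a minimal, hypothetical Python sketch of the kind of logic involved – normalizing records from different sources and flagging probable duplicates with simple string similarity. The records, field names and threshold are invented, and the actual AIDC module relies on richer ML-based matching rather than these heuristics.

```python
from difflib import SequenceMatcher

# Hypothetical contact records pulled from different sources; values are illustrative only.
contacts = [
    {"name": "Jane A. Doe", "email": "jane.doe@acme.com"},
    {"name": "Doe, Jane",   "email": "JANE.DOE@ACME.COM"},
    {"name": "John Smith",  "email": "j.smith@globex.io"},
]

def normalize(record):
    """Standardize casing, punctuation and token order so records are comparable."""
    name = " ".join(sorted(record["name"].lower().replace(",", " ").replace(".", " ").split()))
    return {"name": name, "email": record["email"].strip().lower()}

def similarity(a, b):
    """Blend an exact email match with fuzzy name similarity into one score."""
    email_match = 1.0 if a["email"] == b["email"] else 0.0
    name_sim = SequenceMatcher(None, a["name"], b["name"]).ratio()
    return 0.6 * email_match + 0.4 * name_sim

norm = [normalize(c) for c in contacts]
for i in range(len(norm)):
    for j in range(i + 1, len(norm)):
        score = similarity(norm[i], norm[j])
        if score > 0.8:  # a threshold that would be tuned with user feedback in practice
            print(f"Probable duplicate: {contacts[i]['name']} ~ {contacts[j]['name']} ({score:.2f})")
```

In a real run, the flagged pairs would then be grouped and a representative golden record chosen for each group before the D&B enrichment step.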

Use case 3: An automated product cleansing and hierarchy solution for a large retailer, which transitioned the customer from an internal system to the GS1 standard.

 

Situation:
  • The client is a large retailer based in the EU that wished to shift its product hierarchy master data to the GS1 standard.
  • This would help the retailer improve data accuracy and integrity, speed up supply chain responsiveness and simplify reporting across product categories.

Gap:
  1. Over ~28K unique SKUs had to be mapped from an existing hierarchy to an equivalent GS1 hierarchy.
  2. In many cases, product attributes were missing, inaccurate or insufficient.
  3. There was inherent variation between the client's internal hierarchy and the GS1 standard.

AIDC Implementation:
  1. We identified alternate sources to verify and enrich the existing product data (product catalogs, e-commerce portals, etc.).
  2. We deployed a custom ML-based mapping algorithm that matched products to GS1 at three levels – targeted match, synonym-based match and augmented match (see the sketch below).
  3. User feedback on sample outputs was incorporated to re-train the model.
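The three-level matching idea can be illustrated with a small, purely hypothetical sketch. The category list, synonym map and string heuristics below are stand-ins for the real GS1 hierarchy and for the ML models used in the actual engagement.

```python
from difflib import get_close_matches

# Hypothetical GS1-style categories and a synonym map; neither reflects the real GS1 standard.
gs1_categories = ["carbonated soft drinks", "still water", "toilet tissue", "laundry detergents"]
synonyms = {"soda": "carbonated soft drinks", "loo roll": "toilet tissue", "washing powder": "laundry detergents"}

def map_to_gs1(product_name, enriched_attributes=""):
    """Three passes: targeted match, synonym-based match, then augmented (enriched) fuzzy match."""
    name = product_name.lower().strip()

    # Level 1 – targeted match: the internal description already names a target category.
    for category in gs1_categories:
        if category in name:
            return category, "targeted"

    # Level 2 – synonym-based match: translate known trade/colloquial terms first.
    for term, category in synonyms.items():
        if term in name:
            return category, "synonym"

    # Level 3 – augmented match: append attributes gathered from catalogs/e-commerce portals
    # and fall back to fuzzy matching against the category list.
    augmented = f"{name} {enriched_attributes.lower()}"
    candidates = get_close_matches(augmented, gs1_categories, n=1, cutoff=0.3)
    return (candidates[0], "augmented") if candidates else (None, "unmatched")

print(map_to_gs1("ACME loo roll 4-pack"))                                            # toilet tissue, synonym
print(map_to_gs1("SparkleFizz 330ml", enriched_attributes="carbonated soft drink"))  # augmented match
```

As in the engagement, low-confidence or unmatched items would be routed to users for feedback and used to re-train the mapping.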

These are just some of the types of problems that have been solved with AI Data Cleanser.

We hope you have enjoyed this series of articles detailing our offering to help your business improve the quality and reliability of data.

For any queries or a free demo, drop us a note at aidc@tredence.com.

Defining MTO and MTS Production Strategy and its Implementation
Bhaskar Seetharam, Associate Principal – Supply Chain

One of the key metrics that almost all organizations work towards is On-Time Performance, a measure of the reliability of a process or an organization. To achieve high on-time performance, the first and probably most important step is to be able to give your customers the right delivery timelines, or delivery lead time. The right delivery lead time for a product is defined either by the customers (in cases where competitors/substitutes are easily available) or by the company, based on internal constraints.

High-tech makers of industrial machinery typically run 'Make-To-Order' processes, wherein the machines are either manufactured or assembled against a firm order and the customer who places the order is willing to wait for delivery. At the other extreme, commodity goods suppliers (such as cement or salt manufacturers) are generally "Make-to-Stock" and maintain inventories at multiple echelons in their network to ensure they are ready to serve demand from their consumers.

Unfortunately, most industrial and manufacturing organizations tend to carry a mix of MTO and MTS products. They constantly grapple with the problem of deciding which items to serve from stock and which products to offer but produce only against orders. Should all fast movers be MTS and slow movers be MTO? Does the MTS or MTO decision depend on my competitors/substitutes? Should an item once defined as MTS remain MTS forever?

In the following section, we discuss the conceptual definition of MTO and MTS.

Defining Make-To-Stock (MTS) and Make-To-Order (MTO)

In its simplest form, there are two key factors that define whether an item or an order is MTS or MTO – the Supply Lead Time (SLT), or the time it takes to produce/supply the item to a customer, and the Customer expected Lead Time (CLT), or the time within which a customer/consumer expects the item to be made available. An important point to note here is that both SLT and CLT are not specific to an ITEM but to an ORDER.

Simply put –

Any order where the supply lead time (SLT) is less than or equal to the customer expected lead time (CLT) can be treated as MTO; otherwise, the order is MTS:

  • If SLT <= CLT then MTO
  • If SLT > CLT then MTS

Thus, the first level of MTO/MTS definition is at an Order level.
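As a minimal sketch of this first-level, order-level rule (with hypothetical SKUs and lead times – not a production planning system):

```python
from dataclasses import dataclass

@dataclass
class Order:
    sku: str
    supply_lead_time_days: int    # SLT: time to produce/supply the item for this order
    customer_lead_time_days: int  # CLT: time the customer expects to wait for this order

def classify(order: Order) -> str:
    """MTO if we can supply within the customer's expected lead time, otherwise MTS."""
    return "MTO" if order.supply_lead_time_days <= order.customer_lead_time_days else "MTS"

print(classify(Order("TISSUE-4PK", supply_lead_time_days=14, customer_lead_time_days=2)))    # MTS
print(classify(Order("CUSTOM-PRESS", supply_lead_time_days=30, customer_lead_time_days=90)))  # MTO
```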

Issues with this Definition.

An important part of the above discussion is an organization's ability to judge the CLT. Though CLT is fundamentally impacted by the way a product is consumed (e.g. grocery items are typically required immediately and consumed daily), it is also impacted by a) availability, b) technology, c) pricing, d) competition/substitutes and e) other factors. What this also means (as stated earlier) is that for the same item, while one set of consumers may be willing to wait (CLT is high => MTO), another set of consumers may want it immediately (MTS). For example, a specific brand of tea powder may be MTS in a certain geographic area, but may be treated as MTO for orders coming in from other geographies.

Some quick ways to assess whether a product is an MTS or MTO include

  • Are there competitors or like-for-like substitutes available for the product?
  • Does availability significantly impact the sales of the item?
  • What is the profile of the end consumer/customers? For example, the same product could be supplied to a retail chain (MTS) and to a commercial project (MTO).
  • Is our end consumer willing to accept a quoted delivery date, OR will they find an alternate source/product?

Though the MTO/MTS definition of an order or an SKU is typically customer-backwards, companies tend to follow simplified "thumb rules". For example, one of the companies we worked with, a manufacturer of bathroom accessories, had a simple rule that all fast-moving items would be MTS and the rest of the SKUs would be treated as MTO.

This view of the industry (quite prevalent) does work to a large extent. It is a good enough approach that helps companies get it right on most occasions. But a few problems that this simple approach poses are –

  • A lot of manufacturing companies today serve a range of customers (retail, projects, distributors, etc.); as discussed earlier, an SKU can be MTS for a retail customer and MTO for a "projects" customer. In a manufacturing organization with constrained capacity, serving an MTO customer from stock (like MTS) is a crime, as it wastes limited capacity/resources.
  SKU type            Retail Customers    Distributors    Project Customers
  Fast Moving SKUs    MTS                 MTS             MTO/MTS
  Slow Moving SKUs    MTS/MTO             MTO             MTO

  • Abnormally large orders are often served from inventory, depending on the customer type and ordering mechanism (since the item has been classified as MTS). This leaves the item unavailable for a large section of smaller customers, leading to urgent orders in manufacturing.
  • The MTS/MTO definitions typically end up being static; items classified as MTO tend to get de-prioritized (as there is no inventory) and the range is not upsold actively. Over a period of time, this behavior leads to a shrinking of the product range.

Suggested Solution to MTO/MTS classification:

With sufficient data available today across most organizations, we have built data-driven tools to classify SKUs – and, more importantly, ORDERS – into MTO and MTS.

Tredence MTO-MTS solution:

Some of the key data points that we look at for MTO/MTS decision making are:

  • Expected lead time by customer (CLT vs SLT, master data)
  • Sales rate (value, volume)
  • Sales frequency
  • Order volume distribution
  • Sales channel mapping (certain SKUs are offered in certain channels only)
  • Risk of obsolescence (fashion?), damage, etc.
  • Inventory risk – days of cover (MOQ vs sales rate)
  • Cost of carrying inventory vs margin (high-value, low-margin item?)
  • Order size (elephant orders)

This data-driven model can assist in making the MTO/MTS decision dynamically and provide planners with answers to the following (a small illustrative sketch follows the list) –

  • Inventory planner – “Do I keep SKU A in stock or Not?”
  • Fulfillment planner – “Do I treat this new order in the system as an MTO or an MTS from units available in the inventory?”
  • Production planner –
    • “How much production line capacity should I allot to MTS and MTO (for MTO date planning)?
    • How do I prioritize between MTO and MTS orders?”
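The sketch below only illustrates how a few of these signals could be layered on top of the basic SLT/CLT check at the order level; the field names and thresholds are placeholders, not the actual Tredence model.

```python
def mto_mts_decision(order):
    # Basic SLT vs CLT rule first (see the earlier sketch).
    if order["slt_days"] <= order["clt_days"]:
        return "MTO"
    # Elephant orders: abnormally large relative to typical order size – plan as MTO
    # rather than draining the stock meant for many smaller MTS customers.
    if order["quantity"] > 5 * order["avg_order_quantity"]:
        return "MTO"
    # High obsolescence risk or very low sales frequency – avoid carrying stock.
    if order["obsolescence_risk"] > 0.7 or order["orders_per_month"] < 1:
        return "MTO"
    return "MTS"

sample_order = {"slt_days": 20, "clt_days": 3, "quantity": 10_000,
                "avg_order_quantity": 500, "obsolescence_risk": 0.2, "orders_per_month": 8}
print(mto_mts_decision(sample_order))  # MTO – an abnormally large order despite the short CLT
```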

How does it Add Value to the Client?

  • Reduced inventory cost and capital cost (w.r.t. damages, inventory obsolescence, etc.). It also frees up warehouse space, optimizes working capital and thus reduces the overall cost to the company.
  • Better capacity utilization and reduced production losses, with fewer instances of procurement expediting expenses.
  • Improved sales through better stock/product availability.
  • Improved on-time delivery performance (right use of capacity and right expectation setting).

All in all, Customer Lead Time analysis and management can influence MTO-MTS productivity. With the right approach and assistance, not only can the CLT be met, but working capital and inventory costs can also be reduced.

 

Product Management in the data science world
Sagar Balan, Principal, Customer Success

The topic of 'Product Management' has received several laurels in recent years, and several rounds of discussion have taken place to frame it from the client's standpoint.

As I heard more of these conversations, there was an uncomfortable ambiguity stemming from disbelief – is this another fad, or is there meaning to it? Well, the initial rumblings were from the cool kids in the Bay. But why did the grounded Midwest and the shoot-from-the-hip South latch on? Must be something deeper, right?!

Product management has been around forever in the software and e-commerce world. But today, mainstream IT and AI teams in Fortune 500 companies are thinking of a product paradigm shift. Leading consulting firms are also developing products or beefing up their technology as an eventuality.

But, the question that begs attention here is – why products? What happened to software as a service, platform as a service, ML as a service? Do we need another paradigm shift? Or as the saying goes – Old wine in a new bottle?

IT teams are today being led by progressive Chief Digital Officers and Chief Data Officers. Conventionally, CIOs have delivered their value through app dev teams, BI teams, infrastructure teams et al. While this may have become table stakes, it has been around for a while already. The question is – 'How do we deliver incremental value to the business?'

So, what has changed?

Demand:

IT is today called upon to be a true business partner. And, given the rate at which business is facing change, the time to deliver value is compressed.

Glocal innovation:

For a Fortune 500 firm operating globally, innovation is striking at its core from multiple directions. While the USA is still the biggest revenue (EBITDA) generation engine, problem and solution innovation is happening in other markets faster than in the USA. For starters, they have less legacy to deal with. The local markets are facing competition from nimbler players. VC money is flowing into firms in China, Israel, Korea and India, which are encountering newer problems in the e-commerce and voice commerce sectors. Other traditional revenue-generating markets, individually facing slower growth, find it difficult to make a business case to invest in solutions led by such innovations.

Problem repeatability:

This is going to sound rhetorical, but I must state it because it is relevant. Business problems in today's enterprise are constantly changing. A few of them get recreated, and hence are not available in large volumes. A few others are becoming common across markets and are thus moving into a constant state of being tightly defined problems that can be addressed globally. Repeatable.

A good indicator of this is AWS's recent product launches – out-of-the-box image, text, voice, reinforcement learning and forecasting services. Common problems that are finding repeatable solutions.

The AI candy shop:

Today, nobody wants to use process automation tools that do not have intelligence embedded in them. Passé, inefficient. Wall Street, investors and boards are lapping up the buzzwords – cognitive, AI, embedded ML.

Cloud enabling global scalability:

Cloud platforms such as Azure and AWS have ensured that once these AI capabilities are developed, they can be deployed globally. The global-local adaptation is a key design criterion in this context.

Glocal solution adaptation…er,… maybe Glocal problem adaptation:

Each market has its secret sauce in terms of market structure, drivers and customer nuances. Thus, before adapting a solution from one market to another, it is essential to adapt the problem as well. For example, it is an interesting pursuit to adapt the problem structure from the modern trade market in Australia to halfway across the world in Brazil.

And, then adapt the solution.

So, whose game is it anyway?

Given the above guard rails, it is quite evident that the business case cannot be developed on a country-specific P&L or ROI measure alone. It must be a global mandate. IT is one of the few functions ideally poised to ride this wave. That they own the data systems is coincidental. Or, well… was that the whole plan! Go, Trojan…

Finally, after rambling about half the things in the world – we come to the initial topic of this article. Products. Why?

A product has a life – it evolves constantly. The focus is on continually making the best possible product for its end user. It has a roadmap. In a world of multiple users, it needs a strong owner who plans and decides well. It has a clear value proposition in each update/release. It can be developed in a sprint-like manner. It can be defined with a bounded scope and sold internally in enterprises with greater ease. And it can be defined, abstracted and customized for a global roll-out.

Looks like a duck, walks like a duck, sounds like a… must be a duck. Yes, I guess it does look like a product.

But, how do we help organize people and teams to get the products rolled out?

While the roles below are common to any product-oriented firm, the thought process is different from conventional IT projects – sharing of resources across projects being the biggest drawback of the conventional model. The smartest of each of the folks below will perhaps still fail without an organizing framework. The roles need to work in a closely integrated manner, dedicated to making a single product successful.

Product Designer:

A product designer is someone who can completely immerse themselves in the shoes of the end user, without worrying about the AI or tech-related issues that may occur at times. Just solve the end user's real problem and keep tracking the end user's behaviour as product usage evolves. In product management, there is a contradictory school of thought which mandates that the designer must appreciate "how" a product works. This, however, might dilute the designer's objective of empathizing with the end user.

Product owner:

A functional expert of impact analysis who can connect the dots and identify the nuances of each problem. A great problem solver, with functional expertise, has the knack to see through the commonalities, and the uncommon aspects too. Prioritization between the must-haves, nice-to-haves and must-not-haves is a key skill required in the role.

Product BAs

Products are quite massive in terms of their scope today. Each product is usually broken down into sub-products which are owned by individual product BAs.

The AI solution developer(s)

Usually, it is very difficult to get a product owner who really gets AI solution development. By and large, individual intelligence is overrated anyway. It is important to have a dedicated AI solutioning team which can translate the problem into a modular AI solution.

The AI deployment team

It is not enough to develop a modular AI solution. To be able to deploy it in globally scalable platforms requires seasoned IT big data engineering & testing capabilities. The plumbing and wiring required to take the AI concept to enterprise last mile reality is no mean task. It is a specialized function. Truly speaking, they give the product its real-life form.

Scrum & Program Managers

Last but not least, you need the scrum team and program managers. Everyone benefits from their discipline and order amidst the chaos.

So, what kind of product management tools would you require to deal with the existing concerns within your organization?

All said and done, is it enough to stand up a product team and deliver the product? More to come in the next article – adoption.

Bringing the promise of ML to your MDM: Part II
Thomas Varghese, Analytics/Product Consultant

‘Augmented data management’ is a key trend where AI/ML is transforming how enterprises manage their data.

In the last article, we looked at some of the key pain points that exist as IT and business leaders constantly grapple with the increasing influx of data sources, without systems to keep up.

Let us look at what makes AI Data Cleanser address these pain points.

So what is AI Data Cleanser?

AI Data Cleanser is a suite of AI/ML based data management tools that aims to deliver reliable data to your business.
The image below illustrates the breadth of issues that typically exist, and the specific entities/use cases the solution addresses.

Let’s look at why each of these use cases is crucial to tackle from a foundational perspective.

  • Data validation – Your business units constantly refer to master data as a ‘source of truth’; it could range from critical data such as customer shipping information, a lead’s contact email or employee phone number. Maintaining, updating and constantly checking data validity to ensure business sees the right information is a key determinant of the quality of downstream decisions made.
  • Data cleansing – Enterprises rarely have their master data in one source – standardized, cleansed and ready to go. The reality is that each data entity is sourced from traditional platforms (CRMs, ERPs), flat files, external sources, etc. and this often leads to redundant and duplicate information. AI Data Cleanser leverages powerful machine learning models to identify similar entities, group them and assign a representative ‘golden record’ that helps the business identify and tie all relevant information to that unique record.
  • Data enrichment – Every firm is well on its way to using internal data for decision making; however, the wealth of information present outside your firewall could help provide key insights on multiple fronts. This is where firms are keen to compete and gain a competitive advantage. As an example –
    • What if you could tie each of your customers to their parent firms, and actually identify white space opportunities to grow your business?
    • What if you could validate and enrich your product attributes, while also analyzing relative assortment and competitive pricing trends on e-commerce sites?
  • Hierarchy management – Hierarchies are a tough nut to crack, as they often combine the problems of the above use cases, and add complexities of their own. However, a robust hierarchy mapping of your customers, contacts, products and materials can be invaluable in gaining a 360 view of your business and providing opportunities to grow revenue while controlling cost.
    • An interesting application of hierarchy management is product category standardization (in this case, to the GS1 standard), which we have implemented for a few large retailers in the EU. This migration helped our customers streamline product lines, optimize their supply chain and rationalize their supplier portfolio.
  • Data anomaly analysis – "What is the state of your data quality?", "What are your data quality challenges?" These questions could either prompt a blank silence or lengthy answers without a clear direction. The reality is that data quality measurement in any firm is complex due to the multiple issues we have established. However, solving any of the use cases we have seen above without providing business users and IT teams with custom reporting and insights into their data health is only solving part of the problem. AI Data Cleanser leverages ML-driven anomaly detection tools to test variations in data, and also allows custom business rules to be defined, so that clear data quality standards can be tested in order to measure and improve data quality (see the sketch below).
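As a toy illustration of the anomaly-analysis idea (not the actual AIDC implementation), the sketch below combines a simple statistical check with a custom business rule on a made-up master-data field:

```python
from statistics import mean, stdev

# Hypothetical master-data field: unit prices recorded for the same material across source systems.
unit_prices = [10.4, 10.9, 9.8, 10.1, 98.0, 10.6, 10.2]

mu, sigma = mean(unit_prices), stdev(unit_prices)

# Statistical check: flag values far from the rest (a stand-in for richer ML-based detection).
statistical_anomalies = [p for p in unit_prices if abs((p - mu) / sigma) > 2]

# Custom business rule: prices for this material must fall inside an agreed valid range.
rule_violations = [p for p in unit_prices if not (5.0 <= p <= 50.0)]

print("Statistical anomalies:", statistical_anomalies)  # [98.0]
print("Business-rule violations:", rule_violations)      # [98.0]
```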

How does AI Data Cleanser work for you?

  • Data monitoring – Here is where we close the loop on the feature set. AI Data Cleanser is built to integrate into your environment and run projects on an ongoing basis. As user feedback is provided, model accuracies increase, thereby improving automation and data quality KPIs. AI Data Cleanser is configured based on use case(s), and comes with a managed services team that works to scope out client specific requests and enhancements that are to be built as part of the project.
    • The advantage with AI Data Cleanser is its modular plug-and-play model, where different use cases use single or multiple components of the solution.
    • For example, a use case to cleanse and master customer data from Salesforce would have a workflow quite different from a use case to validate customer addresses and build customer hierarchies.
    • In other words, the solution is priced and deployed as per customer needs and integrated with the systems/processes they use currently.

AI Data Cleanser connects to a range of input systems, ingests data through a “data discovery” layer, where data is unified and standardized, and then processes data based on the configuration defined.

The solution is cloud compatible as well as on-prem friendly and presents multiple options for integration.

AI Data Cleanser has seen a number of successful implementations with varying scale, right from a simple customer validation and de-duplication exercise all the way to replacing an enterprise MDM platform for address validation.

In the last part of this series, we will explore some of the different flavors of the implementations done through AI Data Cleanser.

Learn more about AI Data Cleanser, and reach out to our team for a free demo – www.teerthexport.com/ai-data-cleanser/

The New Age of Customer Satisfaction
Crishna Carthic, Senior Manager

Let us start with an oft-repeated question: "What do you know about your customers' preferences?"

The answer could be any of the standard responses which talk about their tastes in your merchandise based on past transactional records. It could also be one of the slightly more personalised answers which talk about the customer's likes and dislikes based on whatever they have filled in surveys and feedback forms. Does this tell you all you need to know about your customers? Does this help you make that customer's experience something which he/she will remember – something that gets ingrained into the sub-conscious decision-making component of their mind? That is the holy grail which most CX organisations are after.

Where does data come into the picture?

With 91 properties around the world, in a wide variety of locations, the Ritz-Carlton has a particularly strong need to ensure its best practices are spread companywide. If, for example, an employee from its Stockholm hotel comes up with a more effective way to manage front desk staffing for the busiest check-in times, it only makes sense to consider that approach when the same challenge comes up at a hotel in Tokyo. This is where the hotel group's innovation database comes in. The Ritz-Carlton's employees must use this system to share tried and tested ideas that improve customer experience. Properties can submit ideas and implement suggestions from other locations facing similar challenges. The database currently includes over 1,000 innovative practices, each of them tested on a property before being contributed to the system. Ritz-Carlton is widely considered a global leader in CX practices, and companies like Apple have designed their CX philosophy after studying how Ritz-Carlton operates.

What does this tell you? Use your data wisely!

The next question that may pop up is, "But there is so much data. It is like noise." This is where programmatic approaches to analysing data come in. Analytics and data science firms across the globe have refined the art of deriving insights out of seemingly unconnected data to a nicety. What you get out of this is that, in addition to analysing the customer footprint in your place of business, you get to analyse the customer footprint across various other channels and social media platforms.

As a sample, consider this infographic created for one of our customers, which aims to profile the customers most susceptible to local deals/rewards/coupons based on their buying patterns.

How is this done? The answer is rather simple. Customer segmentation algorithms (both supervised and unsupervised) enable you to piece together random pieces of information about the customer and analyse the effect they have on a target event. You will be surprised at the insights that get thrown out of this exercise. Obviously caution needs to be exercised to ensure that the marketeer doesn’t get carried away by random events which are purely driven by chance.
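As an illustration of the unsupervised flavour of this, here is a minimal scikit-learn sketch on synthetic data; the features and numbers are invented, and a real engagement would add feature scaling, cluster-count selection and many more signals.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Hypothetical per-customer features: [monthly visits, avg basket value, coupon redemption rate].
# In a real engagement these would come from transaction and campaign-response data.
customers = np.vstack([
    rng.normal([12, 25, 0.6], [2, 5, 0.1], size=(50, 3)),     # frequent, deal-seeking shoppers
    rng.normal([2, 120, 0.05], [1, 20, 0.03], size=(50, 3)),  # infrequent, high-ticket shoppers
])

segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(customers)

# Profile each segment to see which one is most receptive to local deals/rewards/coupons.
for seg in np.unique(segments):
    visits, basket, redemption = customers[segments == seg].mean(axis=0)
    print(f"Segment {seg}: visits/month={visits:.1f}, basket={basket:.0f}, coupon rate={redemption:.2f}")
```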

Okay – so I have made some sense out of my data. But this is a rather cumbersome process which does not make any difference to the way I deal with my customers on a day-to-day basis.

“How do I get this information on a real-time basis so that I can actually make some decisions to improve my customer’s experience as and when it is applicable?”

This brings us to the newest and most relevant trend in making data science a mainstream part of decision making. How do we integrate this insight-deriving platform into the client's CRM system so that the client can make efficient decisions on a real-time basis?

At Tredence, for one of our leading technology clients, we have built an AI-based orchestration platform which derives actionable insights from past customer data and integrates them into the customer's CRM system, so they become readily available to all marketeers as and when they attempt to send out a communication to their customers.

What does this entail? It entails using the right technology stack to build a system which can deliver insights from the data science modules at scale. I prefer calling it a synergy of data science and software development. Every decision that a marketeer is trying to make is processed through a system which invokes the in-built data science algorithms in real time on the relevant cloud computing platforms. Insights are delivered immediately, and suitable recommendations are also made on a real-time basis.

This is the final step in ensuring that the personalised recommendations being made to every customer are truly personalised. We at Tredence call it "The Last Mile adoption". This development is still in its nascent phase. However, companies would be wise to integrate this methodology into their data-science-driven decision making, since it is very unlikely that they will hit the holy grail of customer satisfaction without delivering real-time personalised recommendations.

Applications of AI in Document Management
Pavan Nanjundaiah, Head of Solutions

"We are drowning in information, but starved for knowledge."

This is a famous quote by John Naisbitt which captures the key difference between information and knowledge. Advancements in data engineering techniques and cloud computing have made it easy to generate data from multiple sources, but making sense of this data and deriving insights is still a huge challenge. Data volumes have increased exponentially, and along with traditional structured data, data can now reside in different formats such as unstructured social media text, log files, audio/video files, streaming sensor data, etc.

Applying manual methods to process this diverse data is not only time consuming and expensive but also prone to errors. Hence the need of the hour is to use Artificial Intelligence (AI) based automated solutions that can deliver reliable insights and give customers a competitive advantage. Here are a few examples of how customers across industries can benefit from AI-driven solutions.

Microsoft Azure based AI solution

In 2017, more than 34,000 documents related to John F. Kennedy's assassination were released. The data volume was huge, and the data existed in different formats such as reference documents, scanned PDF files, handwritten notes and images. It would take researchers months to read through this information, so manually reviewing it was not the most optimal solution. The Microsoft Azure team applied an AI-based Cognitive Search solution to extract data from these diverse sources and gain insights. The technical architecture for this use case was built using Azure Cognitive Services components like Computer Vision, Face Detection, OCR, Handwriting Recognition and Search, and core Azure components like Blob Storage, Azure ML, Azure Functions and Cosmos DB. This solution also annotated text using custom CIA cryptonyms.

Hospitals usually deal with a lot of patient data which could reside in electronic medical records (EMR), handwritten prescriptions, diagnostic reports and scanned images. AI based Azure Cognitive Search could be an ideal solution to efficiently manage patient’s medical records and create personalized treatment plan. Many downstream use cases like Digital Consultations, Virtual Nurses and Precision Medication can be built once the patient data is optimally stored.

Google Cloud Platform (GCP) based AI solution

GCP introduced Document Understanding AI (beta) at Cloud Next '19. This is a serverless platform that can automate document processing workflows by processing data stored in different formats and building relationships between them. The solution uses GCP's Vision API, AutoML, machine-learning-based classification and OCR to process image data, and a custom knowledge graph to store and visualize the results. Customers can easily integrate this solution with downstream applications like chatbots, voice assistants and traditional BI to better understand their data.

Customers who deal with contract management data, such as mortgages, usually face a lot of manual tasks to ensure that the contracts are complete and accurate. This could mean processing contracts in different formats/languages, reviewing the supporting documents, and ensuring that the details are accurate and comply with regulatory standards across documents. By using Document Understanding AI and integrating it with a well-designed RPA framework, customers will be able to efficiently process mortgage applications, contracts, invoices/receipts, claims, underwriting and credit reports.

Use cases from other industries

A Document Management AI solution can also be applied to diverse use cases from other industries, such as processing claims related to damages to shipped products by e-commerce companies, handling the know-your-customer (KYC) process in the banking industry, invoice data processing by finance teams, fraud detection during document processing, etc.

As more and more companies embrace the digitization wave, they will be faced with different variations of data/document management challenges. Based on the current trend, the number of use cases is only going to increase, and an AI-driven solution is probably the most efficient way to solve this problem, as it can reduce manual work, save cost and deliver reliable insights. This will ensure that companies can spend more time on building their business and less time on manually processing documents and preparing data.

Going back to John Naisbitt’s quote, AI and ML driven solutions are probably the only way to bridge the gap between information and knowledge.

Bringing the promise of ML to your MDM: Part 1
Thomas Varghese, Analytics/Product Consultant

Enterprises are rushing to transform themselves, and embrace the promise of digital transformation; and while the means to achieve this end are disputed, there is unanimous agreement on the fact that reliable data is the starting point.

Tools to facilitate your decision supply chain, starting from vanilla BI reporting to the most complex AI/ML predictive algorithms, are only as good as the data you start with.

So, how are enterprises trying to achieve “reliable data” today?

There are, of course, a number of solutions – ranging from investing in traditional MDM platforms to data aggregators/enrichment providers, and even emerging ML-based cleansing tools.

However, if you are a decision maker tasked with ensuring the availability of clean and reliable data for your business, you live in a world of challenges; let me try and get you nodding about some of them.

  1. The curse of legacy systems:

The data value chain from capture -> ingestion -> storage -> management -> analytics & insight is rapidly maturing, and this means that enterprises are often stuck with legacy systems which were set up as siloed, decentralized sources; these systems rely heavily on manual inputs & checks from IT teams, and fail to keep up with increasing data sources that business wants to analyze and understand.

As a result, you are constantly dealing with new complexities in a world where current systems are already not at their best.

  2. The promise of the cloud:

Cloud migration offers cost-effectiveness, greater agility and increased feature sets, and eventually presents greater possibilities with your data; however, besides being a strategic decision, this state poses additional complexities in how you choose your tech stack and lay out a roadmap that ensures your data management challenges are addressed while dealing with the cloud.

  3. The need to solve effectively, with agility:

Business leaders generally need to build a business case before investing in any solution; and rather than have a one-size-fits-all, they typically want to solve high-priority use cases as part of a larger roadmap. The need is for a solution that is agile enough to be customizable and rapidly deployed to a use case.

  • So, how do I prioritize my challenges and demonstrate value on key business initiatives with a quick turnaround?
  • Secondly, how can machine learning and AI help my data get more accurate and reduce manual efforts?
  4. The fear of poor RoI:

Established enterprise vendors in the market typically sell software with multiyear agreements, long implementation cycles, require expensive licensing & specialized stewardship. Not to mention, the data still requires manual preparation, cleansing & quality checks before the first drop of insight falls from the tap.

  • The key question here is how do I deal with obvious questions of RoI, accuracy improvements, ongoing maintenance to ensure the business continues to get quality data as needs evolve?

With this world of challenges, what are some must-haves from a potential solution?

  • A solution that integrates with a variety of legacy sources, with capabilities to unify any flavor of master data (including customer, contact, vendor, product, material, etc.)
  • A solution that presents multiple options for integration and usage, with cloud, on-prem and hybrid configurations
  • A solution with pre-built ML modules and training sets that can deliver rapid proof of concepts that address pressing business needs, while also improving as business users provide feedback
  • A solution that is white-box and comes with managed service offerings that can address ongoing needs of enhancements

Note that regardless of your current data maturity, these are some of the core pillars & use cases that will need to be solved.

And that is why, we built AI Data Cleanser, an AI/ML based data management suite to deliver reliable data to your business.

AI Data Cleanser is an ‘augmented data management’ solution, which Gartner has called out as a key data & analytics technology trend for 2019.

Learn more about AI Data Cleanser

In the next part of this series, we will explore AI Data Cleanser in more detail – features, architecture, differentiators and some key implementations where we have driven success for our clients.

Digital Transformation / Industry 4.0
Ashwin Avasarala, Sr. Engagement Lead

Digital Transformation / Industry 4.0 is on everyone's mind. Investors are happy to hear from organizations that they are embarking upon a complete Digital Transformation journey. Investors love it, leaders advocate for it, directors have to make it a reality, managers have to design for it, but few understand what it all means in the grand scheme.

Hopefully, we can simplify this world for you.

What is Digital Transformation? Let’s keep it simple.

The simplest way to describe Digital Transformation is “Using Digital technology, innovation and intelligence to find better ways to do various things that organizations do today. It’s not about creating something new, but more about improving effectiveness and efficiency of existing processes for better business outcomes.”

Digital Transformation started as Industry 4.0 in some places. However, the idea remains the same. While Industry 4.0 started with the intention of transforming the manufacturing processes using Digital Technology, the principles of Digital Transformation now apply to all functions across the organization.

How does this theory apply in practice? Let’s study an example:

Step 1 – Current State

Map out the current process to uncover gaps that can be filled with better technology or intelligence.

Consider a global paper products manufacturing company. The manufacturing team is constantly trying to find opportunities to improve efficiency and productivity and reduce costs.

  1. Energy consumption is a big area of focus for the manufacturing team. Currently, manufacturing reports and energy dashboards are used to track the consumption of energy across a few important machine parts.
  2. Operators use these dashboards to identify sections of machines that are in green/red (good/bad) zones in terms of energy consumption and adjust the settings to optimize energy consumption.
  3. These dashboards only track a limited set of machine parts that influence energy consumption.

Step 2 – Future State

Outline what the future should look like, after Digital Transformation.

Energy consumption of machines at the mill (with specific reference to tissue machines) can be reduced by finding the key driving factors of energy consumption and determining their optimal settings, while factoring in the production constraints in terms of time, quantity and quality.

The following challenges will have to be addressed to get to the future state

  1. There are a few hundred variables in a tissue machine that determine the energy consumption. These machine variables have to be studied comprehensively to identify the key influential factors for energy consumption. Relationships between these variables also need to be considered.
  2. A detailed and statistically robust mechanism needs to be created to generate insights/correlations across all relevant machine variables, in order to take proactive steps to minimize energy consumption.
  3. Study the process characteristics that influence energy consumption and optimize them. E.g. machine speed, maintenance schedule, aging of parts.

Step 3 – Establish how technology, data and analytics can bridge this gap.

The best Digital Transformation approach for this example would be:

  1. Select a machine, in a market, which can be managed and monitored easily. Maturity in terms of capturing data, and the groundwork that has already been achieved for manufacturing systems and lean energy dashboards provides an immediate feasibility in terms of execution and adoption.
  2. Build a Driver Model to understand key influential variables and determine the energy consumption profile.
    1. Identify Key Variables –
      1. There are ~ 600 machine parts that drive the consumption of energy of a tissue machine. First, shortlist the top contenders and eliminate the non-influencer variables, using inputs from technical teams and plant operators.
      2. Identify primary drivers among the selected machine variables using variable reduction techniques of Machine Learning.
    2. Driver Model –
      1. Build multivariate regression models to understand the impact of the top drivers of energy consumption, using techniques like linear regression, Ridge/LASSO regression and Elastic Nets (a minimal sketch of this step appears after this list).
    3. Optimize the engine to lower energy consumption.
      1. Optimize energy consumption by identifying the right combination of drivers under the given production constraints – time, quantity and quality.
      2. Create a mechanism to provide guidance during the actual production hours (In-line monitoring).
        1. Track energy consumption of the machine parts and their active energy consumption states. Identify deviation from the standards.
        2. In case of deviation, provide guidance to machine operators to bring the energy consumption to within defined limits.
    4. Adoption
      1. Real-time dashboards, refreshed weekly, provide charts on energy consumption, recommendations, and improvements achieved through proactive measures.
      2. Post-live support to operations teams to enable adoption.
    5. Scaling
      Determine a phased roll-out to other machines based on:

      1. Strategic initiatives.
      2. Machines or mills which utilize higher amounts of energy to target higher ROI.
      3. Similarity in process and parts characteristics of tissue machines.
      4. Data availability and Quality.
      5. Readiness and groundwork for adoption by plant operators and energy management teams.
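To make the driver-model step concrete, here is a minimal scikit-learn sketch on synthetic data (referenced from the list above). The variable names, data and model settings are invented stand-ins; the engagement's actual variables and tuning are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

# Synthetic stand-in for sensor data: 1,000 observations of 50 machine variables
# (the real tissue machine has a few hundred; names here are placeholders).
n_obs, n_vars = 1000, 50
X = rng.normal(size=(n_obs, n_vars))

# Assume only a handful of variables truly drive energy consumption.
true_coefs = np.zeros(n_vars)
true_coefs[[3, 11, 27]] = [4.0, -2.5, 1.5]
energy_kwh = X @ true_coefs + rng.normal(scale=0.5, size=n_obs)

# Standardize, then let L1 regularization shrink non-influential variables towards zero,
# leaving a short list of candidate energy drivers (penalty chosen by cross-validation).
X_std = StandardScaler().fit_transform(X)
model = LassoCV(cv=5).fit(X_std, energy_kwh)

drivers = [(f"machine_var_{i}", round(coef, 2))
           for i, coef in enumerate(model.coef_) if abs(coef) > 0.1]
print("Candidate drivers of energy consumption:", drivers)
```

The optimization and in-line monitoring steps would then work with this shortlist of drivers rather than all raw machine variables.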

4 key stages in Digital Transformation

How should you, as a leader in an organization, look at Digital Transformation? Organizations should consider the 4 key stages of Digital Transformation, in order to create a sustainable impact on their organization. To make Digital Transformation a reality, all these steps cannot work independently. The philosophies of Design Thinking are embedded in the framework’s interconnected elements.

DEVELOPMENT PHASE:

Focus is on identifying the key areas and prioritizing the Digital Transformation efforts

Stage 1 – Discovery

Identify the key areas of opportunity or risk and related key stakeholders. Detail out the gaps in process, data, insights or technology, fixing which would help capture opportunities or mitigate risks.

Stage 2 – Design

Rapid iterations on the design and implementation of prototypes help reach optimal solutions faster. Build out Proofs of Concept (PoC) to establish the theoretical validity of the approach, and validate its practical validity through a Proof of Value (PoV).

IMPLEMENTATION PHASE:

Implementation needs to account for limitations arising from human behaviour and scale of the operations.

Stage 3 – Adoption

Building solutions that keep the user at the center of the design is key to adoption. This means that users must be included in the design and feedback loop early on. In addition, there should be support for users post design, in the form of FAQs, training videos, chatbots, etc.

Stage 4 – Scalability

If we can't solve a problem at scale, then the solution does not solve organizational problems. The issues that we anticipate at scale should be accounted for in the design during the Development phase. This means considering the technology used, the infrastructure required, the process automation possible/required and how to manage future developments.


As Design Thinking would dictate, the Development phase of the Digital Transformation process always has to consider the Implementation aspects.

Digital Transformation is no longer just optional.

Every organization is transforming the way they do business. Numerous organizations like BASF, Mondelez, KLM airlines, Aptar group, PepsiCo etc. are already making massive strides in this area.

If you want to zip past your competition, or even stay competitive, it’s about time you started thinking about how to transform the way to do business. After all, there’s no growth in comfort.

Supervised stack ensemble with natural language features: Driving Customer Service Optimization

Saurabh Vikash Singh, Associate Manager, Tredence
Naveen Mathew Nathan S, Associate Manager, Tredence

In the age of social media, companies are conscious about the reviews that are posted online. Any dissatisfaction can be vented by way of tart sentiments on these platforms. And so enterprises strive hard to give a 100% positive experience, doing all that they can to address customer grievances and queries. But as they say, there is many a slip between the cup and the lip – not all grievances can be handled amicably.

Let’s take the specific case of call centers here. Their Service Level Agreement mentions terms like number of calls answered at a certain time of the day, percentage of calls answered within a specific waiting time, etc. Ensuring customer satisfaction and retention requires a far deeper, more holistic view of interaction between customer care representative (agent) and caller. There are other KPIs such as what causes a customer to be dissatisfied and number of escalations. But these seldom find a place in the SLA.

In this article, we will talk about identifying drivers of (dis)satisfaction and ways to improve it. Along the way, we will touch upon a solution design that can scale and institutionalize real-time decision making.

Introduction

We’ve all done it, dialing the call center for any issue encountered. We are surely an expressive bunch when it comes down to rattling our emotions and spitting out our dissatisfaction. And if that is not enough, we threaten to let our dissatisfaction be known to the rest of the world – through social media, not to mention #CustomerExperience.

While standard surveys exist to capture the sentiments of customers, the percentage of people filling these surveys is very low. This compounds the problem of effectively addressing customer needs.

Automating the task of predicting customer satisfaction requires a balanced mixture of text mining, audio mining, and machine learning. The resulting solution needs to:

  • Scale and be deployable
  • Identify the drivers of dissatisfaction
  • Generate actionable insights and generalize well to the population

Modeling Pipeline

The modeling pipeline includes all the components (data ingestors, model builder, model scorer) that are involved in model building and prediction. It is mandatory for the modeling pipeline to seamlessly integrate all the components for it to be scalable and deployable – production worthy. These components vary depending on the problem, the available architecture, the tools used, the scale of the solution and the turnaround time. The following pipeline was built on Google Cloud to solve the problem of dissatisfaction in call centers.

Modeling (actual work – driver identification)

In the above problem, the satisfaction survey showed good internal consistency. Calls, emails and chats had sufficient discriminatory power to model customer satisfaction. Exploration of the data showed that the patterns were non-linear. However, like other psychometric models, the satisfaction model was plagued by three major issues which threatened its external consistency: shortage of data, variance and instability. These problems were addressed in the following manner:

First, the issue of data shortage was addressed using resampling (bootstrapping). Second, the challenge of model instability was resolved using k-fold cross-validation to tune the hyperparameters of the different models, followed by model averaging. Finally, the issue of model variance was addressed using a stacked ensemble approach on the bootstrap samples. Several classification algorithms were used to build the first layer of the stack, and a logistic regression combined their results to predict the outcome. The accuracy thus obtained was superior to that of any individual model in the first layer of the stack.
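
The post does not publish the actual feature set, base learners or tuning grids, so the snippet below is only a minimal sketch of the three-step recipe described above – bootstrap resampling, k-fold tuning of each base learner, and a logistic-regression meta-learner – using scikit-learn and synthetic data in place of the real call, email and chat features.

```python
# Illustrative sketch of the stacked ensemble described above (not the
# production implementation): bootstrap resampling to augment scarce data,
# k-fold CV for tuning each base learner, and a logistic meta-learner.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
from sklearn.utils import resample

X, y = make_classification(n_samples=400, n_features=20, random_state=0)  # stand-in data

# 1) Address data shortage: bootstrap the training set.
X_boot, y_boot = resample(X, y, replace=True, n_samples=2000, random_state=0)

# 2) Address instability: tune each base learner with k-fold cross-validation.
rf = GridSearchCV(RandomForestClassifier(random_state=0),
                  {"n_estimators": [100, 300], "max_depth": [5, None]}, cv=5)
svm = GridSearchCV(SVC(probability=True, random_state=0),
                   {"C": [0.1, 1, 10]}, cv=5)

# 3) Address variance: stack the tuned base learners; a logistic regression
#    meta-learner combines their out-of-fold predictions.
stack = StackingClassifier(
    estimators=[("rf", rf), ("svm", svm)],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
stack.fit(X_boot, y_boot)
print("Stacked accuracy on original data:", stack.score(X, y))
```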

Driver Analysis

Only two commonly used types of classification models are directly interpretable: logistic regression and decision trees. Interpreting other machine learning techniques, such as regularized regression and regression splines, requires knowledge of calculus, geometry and optimization. Models such as support vector machines and neural networks are considered black-box techniques because of their high dimensionality, which is difficult for the human brain to comprehend.

Standard measures of variable importance exist for commonly used black-box techniques such as SVMs and neural networks. A simple weighted-average method is used to calculate variable importance in the stack ensemble, with the weights determined by the logistic meta-learner (a minimal sketch of this calculation appears below). However, it is important to note that the final importance is not a measure of the linear dependence of satisfaction on the independent variables. The importance metrics need to be combined with business intuition and actionability to provide recommendations for improving customer satisfaction.
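
As an illustration of the weighted-average idea, here is a toy sketch. The specific importance numbers, base models and coefficients are invented, and this is only one plausible way to read the description above, not the exact production calculation.

```python
# Toy sketch: variable importance for the stacked ensemble as a weighted
# average of per-model importances, weighted by the absolute coefficients
# of the logistic meta-learner. All numbers are hypothetical.
import numpy as np

def stacked_importance(base_importances, meta_coefficients):
    """base_importances: one per-feature importance array per base model.
    meta_coefficients: logistic-layer coefficient for each base model."""
    weights = np.abs(np.asarray(meta_coefficients, dtype=float))
    weights = weights / weights.sum()              # normalise weights to sum to 1
    stacked = sum(w * np.asarray(imp, dtype=float)
                  for w, imp in zip(weights, base_importances))
    return stacked / stacked.sum()                 # relative importance per feature

# Two base models, three features (hypothetical numbers).
rf_importance  = [0.50, 0.30, 0.20]   # e.g. random forest feature importances
svm_importance = [0.10, 0.60, 0.30]   # e.g. permutation importance for the SVM
meta_coefs     = [1.8, 0.6]           # logistic meta-learner coefficients
print(stacked_importance([rf_importance, svm_importance], meta_coefs))
```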

Consumption

A call center manager would like to track customer satisfaction levels along with several KPIs that are critical to operations. Information on the utilization of customer care representatives is provided to the manager in real time, while model predictions are run in semi-real time to reduce the required computational power. The manager is given options to deep dive into historical data based on the variables that drive dissatisfaction. For example, existing ERP systems can redirect calls to customer care representatives based on their history and subject matter expertise. This reduces the number of escalations and enables near real-time actionability without significantly affecting other KPIs.

The problem of customer dissatisfaction in call centers can be solved using audio mining, text mining and machine learning. Intelligent systems greatly reduce the stress on customer care representatives by automating the majority of the processes. These cloud-based systems can be seamlessly integrated with existing ERP systems to provide highly actionable insights about dissatisfaction without significantly affecting other critical KPIs related to call center operations.

The post Supervised stack ensemble with natural language features: Driving Customer Service Optimization appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/supervised-stack-ensemble-with-natural-language-features-driving-customer-service-optimization/feed/ 1
Just ask Alexa, The machine in the corner room http://www.teerthexport.com/blog/just-ask-alexa-the-machine-in-the-corner-room/ http://www.teerthexport.com/blog/just-ask-alexa-the-machine-in-the-corner-room/#comments Fri, 06 Oct 2017 11:10:59 +0000 http://www.teerthexport.com/?p=1279 AGCS (Alexa, Cortana, Google, Siri), as I fondly call these services, certainly have taken over my life. I talk to them every day. I take their help in research. I tell them to remind me of important tasks. I even ask them turn on / off different appliances at home. And they never complain! Now who yells at me or more...

The post Just ask Alexa, The machine in the corner room appeared first on Tredence.

]]>
Ganesh Moorthy
Ganesh Moorthy
Director – Engineering, Tredence

AGCS (Alexa, Cortana, Google, Siri), as I fondly call these services, have certainly taken over my life. I talk to them every day. I take their help in research. I tell them to remind me of important tasks. I even ask them to turn different appliances at home on and off. And they never complain! Now, who yells at me – or, more importantly, whom do I yell at?

The last few years have been a point of inflection in the area of personal assistants, or PIPs (Personal Informational Programs). They have gained a voice of their own – to say the least. Voice-enabled assistants, or voice assists, are an evolution in human-machine interaction. When I say, “I speak with Alexa,” people are no longer surprised. They are just confused – am I referring to the Alexa service or to a real person? Now that’s what I call the first step in the machine takeover – the blurring!

Some serious business:

At Tredence, we have been experimenting with Alexa for a couple of months now. What started out as an exploratory process (who and what is Alexa, and how can I hold a conversation with her) has led to a more objective-driven program. We like to call this Voice Enabled Insights (VEI).

By integrating Alexa with Tableau, we have managed to provide a short synthesis of how the business is performing. And the best part: the insights are refreshed every morning. Operational managers can now have a free-wheeling conversation with this voice-enabled feature, enhanced with Tableau. What a way to consume your morning coffee insights! The icing on the cake is that our system also crawls the web for competitor information, so you cover the complete landscape. And then, if you want to discuss the findings, you can ask humble Alexa to schedule a meeting with the required stakeholders (say, a territory manager) through O365 integrations.
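
The VEI integration itself is not published as code, but to give a flavour of the Alexa-facing piece, here is a minimal, hypothetical sketch of a custom-skill Lambda handler that speaks a pre-computed morning summary. The get_morning_summary helper and its hard-coded text are placeholders, and the Tableau, web-crawl and O365 plumbing is deliberately omitted.

```python
# Minimal, illustrative AWS Lambda handler for a custom Alexa skill that
# speaks a pre-computed morning business summary. The summary source is a
# placeholder; the real VEI solution's Tableau/O365 wiring is not shown.
def get_morning_summary() -> str:
    # Placeholder: in a real deployment this might read a nightly extract
    # refreshed from the BI layer. Hard-coded here for illustration only.
    return ("Revenue yesterday was 4 percent above plan. "
            "Two regions missed their service-level targets.")

def lambda_handler(event, context):
    """Entry point invoked by the Alexa service for each user request."""
    request_type = event.get("request", {}).get("type", "")
    if request_type == "IntentRequest":
        text = get_morning_summary()
    else:  # LaunchRequest, SessionEndedRequest, etc.
        text = "Welcome. Ask me how the business is doing today."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": request_type == "IntentRequest",
        },
    }
```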

So far, we have taken a small step towards a future that is closely integrated, connected and alive all the time – thanks to voice enablement. Looking ahead, imagine a situation where a patient’s family speaks to a panel and asks about the patient’s condition. They receive prompt information on the room, current health parameters and in-operation status, if the patient is being monitored live. No more long waiting times and anxiety attacks at the help desk.

How about doctors? They can optimize their time by getting critical patients’ conditions and issuing necessary instructions to nurses in near real time. The same goes for any enterprise where there is a lot of personal interaction between service provider and consumer.

Now that we have covered the most important aspect from a personal standpoint – health – let’s move to industrialization and the phenomenon of IoT. There have been rapid advancements in machine-to-machine communication and so-called intelligent machines. Add a larynx (a voice-enabled feature) to this combination and I can simply step up to a panel and enquire what the output has been so far, whether there are any issues with the systems, and issue commands to reroute if there is a line fault. All of this without even lifting a finger – literally “speaking”!

In most cases, what we discussed is the benefit of the voice-enabled feature in B2B or B2C scenarios. But that is not all. The corner room assistant can provide on-demand, interactive directory services, serve as a knowledge bank, and even help manage projects. She can facilitate efficiency and timely decisions, and can also gamify training using skills- and stories-based modes for self-learning. Simply put, all we need to be is creative; the tools are already getting quite smart, to say the least.

It is a given today that Alexa and similar services are changing the world and how we interact with it. With the time to act constantly getting shorter, these disruptive innovations will play a greater role in how connected we are. Voice-enabled insights, while not new in concept (remember IVRs?), are beginning to gain popularity owing to the rapid propagation of machine learning and artificial intelligence. They are simply becoming more human in their interactions. It would be wise to get into the race sooner. But here’s the deal: start out on the journey in incremental ways and then scale. Soon there will come a time when we will simply say, ‘Just Ask Alexa!’

The post Just ask Alexa, The machine in the corner room appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/just-ask-alexa-the-machine-in-the-corner-room/feed/ 1
Second spin: Driving efficiency and happiness through PMO http://www.teerthexport.com/blog/second-spin-driving-efficiency-and-happiness-through-pmo/ http://www.teerthexport.com/blog/second-spin-driving-efficiency-and-happiness-through-pmo/#respond Tue, 19 Sep 2017 07:36:12 +0000 http://www.teerthexport.com/?p=1256 In my previous blog, we looked at how the Project Management Organization (PMO) at Tredence enables excellence across projects. In gist, traditional PMOs focus on ensuring projects are completed on schedule, and processes are followed the right way. At Tredence, the PMO group allows improved...

The post Second spin: Driving efficiency and happiness through PMO appeared first on Tredence.

]]>
Sanat Pai Raikar
Sanat Pai Raikar
Senior Manager, Tredence

In my previous blog, we looked at how the Project Management Organization (PMO) at Tredence enables excellence across projects. In gist, traditional PMOs focus on ensuring projects are completed on schedule, and processes are followed the right way. At Tredence, the PMO group allows improved project planning, monitoring and control.

In this blog, we will look at how PMO at Tredence drives efficiency on a day-to-day basis, which in turn drives improved work-life balance for employees, as well as improved quality and satisfaction for our clients.

Fostering an efficiency-based mindset is key – constant improvement manifests itself not just in improved quality, but in better job satisfaction as well

Stuck in a rut

Analytics services teams typically follow two modes of operation – medium- to long-term projects that solve specific business problems, and continued engagements that answer quick-turnaround requests from clients. The latter typically involve same-day deliverables, which puts teams in a constant time crunch. Teams working on such projects have to, in a way, complete a mini analytics project within a day. This creates immense pressure in planning one’s day and completing all tasks as per client needs. As time passes, employees in such teams face burnout as they work day in and day out on similar tasks. A tendency to do the job with eyes shut also creeps in, leaving no room for innovation amid urgent deliverables.

Tracking without tracking

As soon as a process or standard method for doing a set of tasks is introduced, it is immediately met with resistance from employees who are used to working without processes. So, if I compelled all employees to, say, track their time on an hourly basis and penalized them for every slip from the plan, I can guarantee that no one would follow it; even if they did, it would be with utmost reluctance and considerable stress.

Alternately, imagine I set a guideline to the tune of “We will all endeavor to leave by 7 PM every day.” No pressure here! But if an employee is not consciously trying to improve, and then observes most of his colleagues leaving before 7 PM, chances are he will start thinking about following the “best practice” himself. This is a passive way of fostering efficiency and change management.

One can define a hundred processes in the interest of efficiency improvement, but unless individual employees buy into the concept, it will all fail

Passive is not enough

Of course, it will not do to expect things to improve of their own accord. The above strategy can at best lead to incremental improvements, and at worst not help matters at all. The PMO needs to actively foster a culture of continuous improvement. At Tredence, we have worked closely with delivery teams to help them identify the sources of inefficiency. These could be external causes, such as latencies linked with client-based infrastructure, or traffic woes at rush hour. Causes could be internal as well, such as promising more than we could deliver, or going about work in a non-optimal manner. By quantifying the time lost to each of these causes, we have directly addressed the reasons for inefficiency, fixed them to the extent possible, and created time for employees.

Out of the rut

Once employees realize that the organization is invested in helping them gain more time out of a day, they buy into the initiatives as well. The value they see coming out of such initiatives justifies the time they spend providing data and reports for further improvement. As this percolates across levels, employees feel empowered to innovate in themselves and in the work they do daily, continuously making themselves as well as their colleagues better.

At Tredence, we have enabled multiple teams to identify causes of inefficiency and act on these with improvement goals in mind. The time saved has enabled employees to invest not just in providing more value-added services to our clients, but also to themselves – utilizing the time for learning new skills, improving themselves and getting better at what they do.

How does the PMO team in your organization go beyond just process excellence? Share your thoughts and best practices with us in the comments section.

The post Second spin: Driving efficiency and happiness through PMO appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/second-spin-driving-efficiency-and-happiness-through-pmo/feed/ 0
A new spin to PMO: Driving excellence in a complex business environment http://www.teerthexport.com/blog/a-new-spin-to-pmo-driving-excellence-in-a-complex-business-environment/ http://www.teerthexport.com/blog/a-new-spin-to-pmo-driving-excellence-in-a-complex-business-environment/#respond Fri, 28 Jul 2017 13:52:55 +0000 http://www.teerthexport.com/?p=1104 Go to any of the myriad analytics services providers that proliferate the industry today, walk up to any manager, and ask him if any of the analytics projects he works on is similar to the other. Chances are extremely remote that you will receive a response in the affirmative...

The post A new spin to PMO: Driving excellence in a complex business environment appeared first on Tredence.

]]>
Sanat Pai Raikar
Sanat Pai Raikar
Senior Manager, Tredence

Go to any of the myriad analytics services providers that populate the industry today, walk up to any manager, and ask him if any of the analytics projects he works on is similar to another. Chances are extremely remote that you will receive a response in the affirmative.

Let’s go one step further. Ask the manager how easy it is to hire people with the right skills for different projects, ensure they learn on the job, while being efficient all through. Be prepared for a long rant on the complexities and vagaries of finding good talent and utilizing it to the fullest.

The PMO enables us to apply what we sell – analytics – to our own processes for betterment and continuous improvement

Challenges at scale

You would have figured out by now that analytics services companies enable their clients to solve complex business problems. And since each business problem is unique, the approach taken to solve it becomes unique as well. This leaves us with a large set of unique, mutually exclusive analytics projects running at any given point in time; each requiring a separate set of resources, time and infrastructure.

Small analytics organizations can handle this complexity because of multiple factors – a very strong and smart core team, fewer projects to manage, and fewer layers of hierarchy within the organization. But as the analytics services company grows, it becomes increasingly difficult to ensure each project is running efficiently and on the right track. The problem is exacerbated by two facts: the flexibility of a startup is not easily scalable, and employees – especially old-timers – chafe at processes introduced to bring some order into the system. This is where the PMO comes into prominence.

Setting up, and moving beyond the traditional PMO

When a startup evolves into a mature, established analytics services company, its growth often veils the fact that the company lacks strong processes to scale. In the absence of organization-wide standard processes for running projects, processes start to form in silos – or, in some cases, not at all.

But this leads to inconsistencies in how project delivery is executed. Similar projects are often estimated in different and sometimes erroneous ways; projects are staffed with people who don’t have the right skills, and knowledge often gets lost when team members attrite. Adding to the list of pains, projects don’t get invoiced in time, invoicing schedules are not consistent, and many projects are executed without formal contracts in place. Senior leadership also lacks a common view into the health of project delivery and the pulse of resources working on these projects, at the ground level.

A good PMO organization faces the same problems as a kite flyer – too many processes, and the kite will never take off; too few, and the kite flies off into the wind. But kite flying technique is important as well.

The focus of a traditional Project Management Organization (PMO) is more towards ensuring projects are completed on schedule and processes are followed the right way. However, for true maturity in delivering analytics services, the PMO needs to move beyond just process focus. It should enable improved project planning, monitoring and control.

It should ensure the right issues are identified at the right time and addressed accordingly. It should ensure people across the organization speak the same language and terms, and provide the leadership team a single view into business performance. At the tactical level, a PMO group should help employees become more efficient and process-oriented. It should foster a culture of accountability, automation and quality control to ensure improved satisfaction for clients as well.

The right level of process

Setting up a PMO group is only half the battle won. The PMO setup needs to regulate the proverbial oxygen flow so employees don’t feel constricted in a mire of process bureaucracy, or, on the other hand, continue in a false euphoria of individual project flexibility. Internal change management needs to be a smooth process. While adding processes layer by layer, care needs to be taken to ensure that employees do not feel “pained” by the PMO’s “demands” on top of their day-to-day deliverables.

At Tredence, the PMO drives improved quality and timeliness of work outputs, while also serving as a means to achieve work-life balance for our employees. Through a well-planned alignment of employees to the projects, which best match their skills, we ensure each team is best equipped to deliver more than the promised results to our clients. In our next blog, we shall discuss in more detail how our PMO group drives improved efficiencies within Tredence and makes our employees more efficient and happy.

So what does the PMO role in your organization look like? Share your thoughts and best practices with us in the comments section.

The post A new spin to PMO: Driving excellence in a complex business environment appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/a-new-spin-to-pmo-driving-excellence-in-a-complex-business-environment/feed/ 0
Data Lakes: Hadoop – The makings of the Beast http://www.teerthexport.com/blog/data-lakes-hadoop-the-makings-of-the-beast/ http://www.teerthexport.com/blog/data-lakes-hadoop-the-makings-of-the-beast/#comments Thu, 08 Jun 2017 08:24:43 +0000 http://www.teerthexport.com/?p=965 1997 was the year of consumable digital revolution - the year when cost of computation and storage decreased drastically resulting in conversion from paper-based to digital storage. The very next year the problem of Big Data emerged. As the digitalization of documents far surpassed the estimates...

The post Data Lakes: Hadoop – The makings of the Beast appeared first on Tredence.

]]>
Ganesh Moorthy
Ganesh Moorthy
Director – Engineering, Tredence

1997 was the year of the consumable digital revolution – the year when the cost of computation and storage decreased drastically, driving the conversion from paper-based to digital storage. The very next year, the problem of Big Data emerged. As the digitalization of documents far surpassed estimates, Hadoop became the step forward towards low-cost storage. It slowly became synonymous – and interchangeable – with the term big data. With the explosion of e-commerce, social chatter and connected things, data has since exploded into new realms. It’s not just about volume anymore.

In part 1 of this blog, I set the premise that the market is already moving from PPTware to dashboards and robust machine learning platforms to make the most of the “new oil”.

Today, we are constantly inundated with terms like Data Lake and Data Reservoirs. What do these really mean? Why should we care about these buzz words? How does it improve our daily lives?

I have spoken with a number of people over the years and have come to realize that, for the most part, they are enamoured with the term without realizing the value or the complexity behind it. Even when they do, the variety of software components and the velocity with which they change are simply incomprehensible.

The big question here would be: how do we quantify Big Data? One aspect to pivot on is that it is no longer the volume of data you collect that matters, but the insight you generate through analysis. Data used for purposes beyond its original intent can generate latent value. Making the most of this latent value will require practitioners to envision the 4Vs in tandem – Volume, Variety, Velocity, and Veracity.

Translating this into reality will require a system that is:

  • Low cost
  • Capable of handling the volume load
  • Not constrained by the variety (structured, unstructured or semi-structured formats)
  • Capable of handling the velocity (streaming) and
  • Endowed with tools to perform the required data discovery, through light or dark data (veracity)

Hadoop — now a household term — had its beginnings in web search. Rather than keeping it proprietary, its developers at Yahoo made the life-altering decision to release it as open source, drawing their inspiration from another open-source project, Nutch, out of which Hadoop originally grew.

Over the last decade, Hadoop with Apache Software Foundation as its surrogate mother and with active collaboration between thousands of open-source contributors, has evolved into the beast that it is.

Hadoop is endowed with the following components –

  • HDFS (Hadoop Distributed File System) — Provides a single logical storage layer spread across a number of physical systems and ensures enough data redundancy for high availability.

  • MapReduce — The distributed computing paradigm that processes the stored data using Mappers and Reducers. Mappers work on the data and transform it into key-value tuples (and can include transformations), while Reducers take the output from different Mappers and combine it (see the word-count sketch after this section).

  • YARN / Mesos – Resource managers that control the availability of hardware and software resources, along with scheduling and job management. YARN itself has two distinct components – the ResourceManager and the NodeManager.

  • Commons – A common set of libraries and utilities that support the other Hadoop components.

While the above forms the foundation, what really drives data processing and analysis are frameworks such as Pig, Hive and Spark, along with other widely used utilities for cluster, metadata and security management. Now that you know what the beast is made of (at its core), we will cover the dressings in the next parts of this series. Au revoir!
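
To ground the Mapper/Reducer division of labour described above, here is the classic word-count example written in the Hadoop Streaming style – a minimal, self-contained sketch. In a real cluster the mapper and reducer would run as separate scripts and Hadoop would handle the shuffle-and-sort between them.

```python
# Word count in the MapReduce pattern: the mapper emits (word, 1) tuples,
# the reducer sums the counts per word. Local, self-contained illustration.
from itertools import groupby

def mapper(lines):
    for line in lines:
        for word in line.strip().split():
            yield word, 1

def reducer(pairs):
    # On a real cluster, Hadoop sorts mapper output by key before it reaches
    # the reducer; here we sort locally to simulate that shuffle step.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    sample = ["big data big insights", "data lakes hold big data"]
    for word, count in reducer(mapper(sample)):
        print(f"{word}\t{count}")
```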

The post Data Lakes: Hadoop – The makings of the Beast appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/data-lakes-hadoop-the-makings-of-the-beast/feed/ 1
From the norm to unconventional analytics: Beyond owning, to seeking data http://www.teerthexport.com/blog/norm-unconventional-analytics-beyond-owning-seeking-data/ http://www.teerthexport.com/blog/norm-unconventional-analytics-beyond-owning-seeking-data/#respond Thu, 18 May 2017 14:02:54 +0000 http://www.teerthexport.com/?p=720 The scale of big data, data deluge, 4Vs of data, and all that’s in between… We’ve all heard so many words adjectivized to “Data”. And the many reports and literature has taken the vocabulary and interpretation of data to a whole new level. As a result, the marketplace is split into ...

The post From the norm to unconventional analytics: Beyond owning, to seeking data appeared first on Tredence.

]]>
Shashank Dubey
Shashank Dubey
Co-founder and Head of Analytics, Tredence

The scale of big data, data deluge, 4Vs of data, and all that’s in between… We’ve all heard so many words adjectivized to “Data”. And the many reports and literature have taken the vocabulary and interpretation of data to a whole new level. As a result, the marketplace is split into exaggerators, implementers, and disruptors. Which one are you?

Picture this! A telecom giant decides to invest in opening 200 physical stores in 2017. How do they go about solving this problem? How do they decide the most optimal location? Which neighbourhood will garner maximum footfall and conversion?

And then there is a leading CPG player trying to figure out where they should deploy their ice cream trikes. Now mind you, we are talking impulse purchase of perishable goods. How do they decide the number of trikes that must be deployed and where, what are the flavours that will work best in each region?

In the two examples, if the enterprises were to make decisions based only on the data available to them (read: owned data), they would make the same mistakes day in and day out – using past data to make present decisions and future investments. The effect stares at you in the face: your view of true market potential remains skewed, your understanding of customer sentiment is obsolete, and your ROI will seldom go beyond your baseline estimates. And then you are vulnerable to competition. Calculated risks become too calculated to be game-changing.

Disruption in current times requires enterprises to undergo a paradigm shift: from owning data to seeking it. This transition requires a conscious set-up:

Power of unconstrained thinking

As adults, we are usually too constrained by what we know. We have our jitters when it comes to stepping out of our comfort zones – preventing us from venturing into the wild. The real learning though – in life, analytics or any other field for that matter – happens in the wild. To capitalize on this avenue, individuals and enterprises need to cultivate an almost child-like, inhibition-free culture of ‘unconstrained thinking’.

Each time you are confronted with an unconventional business problem, pause and ask yourself: if I had unconstrained access to all the data in the world, how would my solution design change? What data (imagined or real) would I require to execute the new design?

Power of approximate reality

There is a lot we don’t know and will never know with 100% accuracy. However, this has never stopped the doers from disrupting the world. Unconstrained thinking needs to meet approximate reality to bear tangible outcomes.

The question to ask here would be: what are the nearest available approximations of all the data streams I dreamt of in my unconstrained ideation?

You will be amazed at the outcome. For example, using Yelp to identify the hyperlocal affluence of a catchment population (resident as well as moving), or estimating the footfall in your competitors’ stores by analysing data captured from several thousand feet in the air.

This is the power of combining unconstrained thinking and approximate reality. The possibilities are limitless.

Filter to differentiate signal from noise – Data Triangulation

Remember, you are no longer only as smart as the data you own, but as smart as the data you earn and seek. At a time when data is abundant and streaming, the bigger decision to make while seeking data is identifying the “data of relevance”. The ability to filter signal from noise will be critical here. In the absence of on-ground validation, triangulation is the way to go.

The data “purists” among us would debate this approach of triangulation. But welcome to the world of data you don’t own. Here, some conventions will need to be broken and mindsets will need to shift. We at Tredence have found data triangulation to be one of the most reliable ways to validate the veracity of unfamiliar and un-vouched data sources (a toy sketch follows below).
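
As a toy illustration of triangulation, the sketch below cross-checks one metric – say, weekly footfall at a candidate location – across three independent, un-vouched sources, takes the median as the working estimate and flags any source that deviates strongly. The source names, numbers and 15% threshold are all invented for illustration.

```python
# Toy sketch of data triangulation: cross-check one metric across
# independent, un-vouched sources and flag strong disagreement.
# Source names, numbers and the threshold are invented placeholders.
import statistics

estimates = {
    "satellite_parking_counts": 12_400,
    "mobile_location_panel": 10_900,
    "review_platform_checkins": 11_600,
}

values = list(estimates.values())
consensus = statistics.median(values)            # robust central estimate
spread = statistics.pstdev(values) / consensus   # relative disagreement across sources

print(f"Triangulated estimate: {consensus:,.0f} (relative spread {spread:.1%})")
for source, value in estimates.items():
    deviation = abs(value - consensus) / consensus
    flag = "REVIEW" if deviation > 0.15 else "ok"
    print(f"  {source:28s} {value:8,d}  deviation {deviation:5.1%}  {flag}")
```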

Ability to tame the wild data

Unfortunately, old wine in a new bottle will not taste too good. When you explore data in the wild – beyond the enterprise firewalls – conventional wisdom and experience will not suffice. Your data science teams need to be endowed with unique capabilities and the technological know-how to harness the power of data from unconventional sources. In the two examples mentioned above – the telecom giant and the CPG player – our data science team capitalized on freely available hyperlocal data to conjure up a great solution for location optimization, from data residing in Google Maps, Yelp, and satellite imagery.

Having worked with multiple clients across industries, we have come to realize the power of this approach – combining owned and sought data – with no compromise on data integrity, security, and governance. After all, game changers and disruptors are seldom followers; they pave their own path and choose to find the needle in the haystack as well!

Does your organization disrupt through the approach we just mentioned? Share your experience with us.

The post From the norm to unconventional analytics: Beyond owning, to seeking data appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/norm-unconventional-analytics-beyond-owning-seeking-data/feed/ 0
Making the Most of Change (Management) http://www.teerthexport.com/blog/change-management/ http://www.teerthexport.com/blog/change-management/#respond Thu, 18 May 2017 12:51:24 +0000 http://www.teerthexport.com/?p=713 “Times have changed.” We’ve heard this statement ever so often. Generations have used it to exclaim “things are so complicated (or simple) these days,” or expressing disdain – “oh, so they think they are a cool” generation. Whichever way you exclaim, change has been truly the “constant”....

The post Making the Most of Change (Management) appeared first on Tredence.

]]>

Sulabh Dhall
Associate Director

“The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.”

– Alvin Toffler

“Times have changed.” We’ve heard this statement ever so often. Generations have used it to exclaim “things are so complicated (or simple) these days,” or to express disdain – “oh, so they think they are the cool generation.” Whichever way you exclaim it, change has truly been the “constant”.

This change is bolstered by a tech-enabled world in which the speed at which machines learn keeps accelerating – approaching, figuratively, the speed of light.

Let me set this in context with an example from the book of Sales. Unlike in the past, today sales reps are not gauged by the amount of sweat trickling down their foreheads. While they continue to be evaluated in terms of business development and lead conversions, it is not all manual and laborious. Technology advancements have made the process of identifying, prioritizing, scheduling, conversing and converting agile and real-time.

But just knowing change, gathering data and appreciating technology will not suffice. The three will need to be blended seamlessly to yield transformation. Applied to deeper organizational context, “Change” needs to be interpreted – its pace needs to be matched, or even better, its effect needs to be contextualized for differentiation.

Change management in this sense is the systematization of the entire process; right from the acceptance of change to its adoption and taking advantage of it to thrive in volatile times.

But what would it take for complex enterprises that swear by legacy systems to turbocharge into Change Management mode?

To answer this, I will humanize enterprise change management with the Prosci-developed ADKAR Model.

Awareness (getting into the race) – Where can I set up the next retail store, what is the most optimal planogram, how do I determine the right marketing mix, what is my competition doing differently, how do I improve customer experience, how do I ensure sales force effectiveness – the questions are ample. By the time you realize this and start strategizing, a competitor has dislodged your market position and eaten a large portion of your pie. And while these business problems seem conventional, volatility in the marketplace cries foul. Compound this with high dependency on dashboards, applications, and the like for insights, and you’ve seen the side effects – established enterprises biting the dust.

To survive, organizations will need to be knowledgeable about data that matter viz a viz the noise. They will need to interpret the data deluge in relevance and context; after all, not all data is diamond.

Desire (creating a business case for adoption) – Desire is a basic human instinct. Our insatiable urge to want something more, even better, accentuates this instinct. When it comes to enterprises, this desire is no different: to stay ahead of the curve, to make more profits, to be leaders. But there is no lock-and-key fix to achieve this. Realizing corporate “desire” requires a cultural and mindset shift across the organization – top-down. And so, one of the most opportune times could be when there are changes at the leadership level, followed by re-organization in the rungs below.

Gamification could be a great starting point to drive adoption in such cases. Allow scope for experimentation to creep in; invest consciously in simmer projects; give analysts a free hand to look for the missing piece of the puzzle outside their firewall; and incentivize them accordingly. Challenge business leaders to up their appreciation of the insights generated, encourage them to get their hands dirty when it comes to knowing the sources, ask the right questions and challenge the status quo – not just rely on familiarity and past experience.

Knowledge and Ability (from adoption to implementation) – In a business context, “desire” typically translates into business goals – revenue, process adoption, automation, newer market expansion, the launch of a new product/solution, etc. Mere awareness of the changes taking place does not translate into achievements. Change needs to be studied and change management needs to be initiated.

But how can you execute your day job and learn to change?

The trick here will be to make analytics seamless – almost second nature. Just as your bank sends a message alert about any suspicious transaction on your account, any deviation from the set course of business action needs to trigger an alert.

Such technology-assisted decisions are the need of today and the future. Tredence’s CHA solution is an example in this direction. It is intuitive, convenient and evolving, mirroring aspects of Robotic Process Automation (RPA).

Reinforcement (stickiness will be key) – Your business problems are yours to know and yours to solve. As my colleague mentioned in his blog, a one-size-fits-all solution does not exist. Solving the business challenges of today requires going to their root cause, understanding the data sources available to you, and being knowledgeable about other data combinations (within or beyond the firewall) that matter. Match this stream of data with the relevant tools and techniques that can give you the “desired” results.

A point to keep in mind during this drill is to ensure that you marry the old and the new. Replacing a legacy system with something totally new could leave a bad taste in your mouth – with lower adoption and greater resistance. Embedded analytics will be key – analytics that allow you to seamlessly time travel between the past, present and future.

To conclude, whether it is about time to implement change, improving customer service, reducing inefficiencies, or mitigating the negative effect of volatile markets, Change Management will be pivotal. It is a structured, on-going process to ensure you are not merely surviving, rather thriving in change.

The post Making the Most of Change (Management) appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/change-management/feed/ 0
Key to bridging the analytics-software chasm: iterative approach + customized solutions, leading to self-service BI http://www.teerthexport.com/blog/key-bridging-analytics-software-chasm-iterative-approach-customized-solutions-leading-self-service-bi/ http://www.teerthexport.com/blog/key-bridging-analytics-software-chasm-iterative-approach-customized-solutions-leading-self-service-bi/#comments Fri, 05 May 2017 15:02:48 +0000 http://www.teerthexport.com/?p=682 The world of software development and IT services have operated through well-defined requirements, scope and outcomes. 25 years of experience in software development have enabled IT services company to significantly learn and achieve higher maturity. There are enough patterns and standards...

The post Key to bridging the analytics-software chasm: iterative approach + customized solutions, leading to self-service BI appeared first on Tredence.

]]>
Ganesh Moorthy
Ganesh Moorthy
Director – Engineering, Tredence

The world of software development and IT services has operated through well-defined requirements, scope and outcomes. Twenty-five years of experience in software development have enabled IT services companies to learn significantly and achieve higher maturity. There are enough patterns and standards that one can leverage in order to avoid scope creep and make on-time delivery and quality a reality. This world has a fair order.

It is quite contrary to the Analytics world we operate in. Analytics as an industry is itself a relatively new kid on the block. Analytical outcomes are usually insights generated from historical data, viz. descriptive and inquisitive analysis. With the advent of machine learning, the focus is gradually shifting towards predictive and prescriptive analysis. What takes months or weeks in software development usually takes just days in the Analytics world. At best, this chaotic world demands continuous experimentation.

The questions enterprises need to ask are: “How do we leverage the best of both worlds to achieve the desired outcomes?” and “How do we bridge this analytics-software chasm?”

The answers require a fundamental shift in perception and approach towards problem solving and solution building. The time to move from what is generally PPTware (in the world of analytics) to dashboards, and further to a robust machine learning platform for predictive and prescriptive analyses, needs to be as short as possible. The market is already moving towards this purpose in the following ways:

  1. Data Lakes – These are on-premise and built mostly through an amalgamation of open-source technologies and existing COTS software – a homegrown approach that provides a single unified platform for rapid experimentation on data, along with the capability to move quickly towards scaled solutions
  2. Data Cafes / Hubs – A cloud-based, SaaS approach that allows everything from data consolidation and analysis to visualization
  3. Custom niche solutions that serve a specific purpose

Over a series of blogs, we will explore the above approaches in detail. These blogs will give you an understanding of how integrated and interoperable systems allow you to take your experiments towards scaled solutions rapidly, in a matter of days and in a collaborative manner.

The beauty and the beast are finally coming together!

The post Key to bridging the analytics-software chasm: iterative approach + customized solutions, leading to self-service BI appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/key-bridging-analytics-software-chasm-iterative-approach-customized-solutions-leading-self-service-bi/feed/ 6
SOLUTIONS, WHAT’S NEW? http://www.teerthexport.com/blog/solutions-whats-new/ http://www.teerthexport.com/blog/solutions-whats-new/#comments Thu, 06 Apr 2017 08:46:02 +0000 http://www.teerthexport.com/?p=444 The cliché in the recent past has been about how industries are racing to unlock the value of big data and create big insights. And with this herd mentality comes all the jargons in an effort to differentiate. Ultimately, it is about solving problems. In the marketplace abstraction of problem...

The post SOLUTIONS, WHAT’S NEW? appeared first on Tredence.

]]>
Sagar Balan
Sagar Balan
Principal, Customer Success

Dell, HP, IBM have all tried to transform themselves from being box sellers to solution providers. Then, in the world of Uber, many traditional products are fast mutating into a service. At Walmart, it is no longer about grocery shopping. Their pick and go service tries to understand more about your journey as a customer, and grocery shopping is just one piece of the puzzle.

There’s a certain common thread that runs across all three examples. And it’s about how to break through the complexity of your end customer’s life. Statistics, machine learning and artificial intelligence cannot, by themselves, make the lives of store managers at over 2,000 Kroger stores across the country any simpler. It all sounds way too complex.

Before I get to the main point, let me belabor a bit and humor you on other paradigms floating around. Meta software, Software as a Service, cloud computing, Service as a Software… Err! Did I just go to randomgenerator dot com and get those names out? I swear I did not.

The cliché in the recent past has been about how industries are racing to unlock the value of big data and create big insights. And with this herd mentality comes all the jargons in an effort to differentiate. Ultimately, it is about solving problems.

In the marketplace abstraction of problem solving, there’s a supply side and a demand side.

The demand side is an overflowing pot of problems. Driven by accelerating change, problems evolve really fast and newer ones keep popping up. Across Fortune 500 firms, there are very busy individuals and teams running businesses the world over, grappling with these problems – ranging from store managers in retail, to trade promotion managers and decision engineers in CPG firms, to district sales managers in pharma firms, and so on. For these individuals, time is a very precious commodity. Analytics is valuable to them only when it is actionable.

On the supply side, there is complex math (read: algorithms), advanced technology and smart people to interpret the complexities. For the geek in you, this is a candy-store situation. But how do we make this complex math – machine learning, AI and everything else – actionable?

To help teams/individuals embrace the complexity and thrive in it, nature has evolved the concept of solutions. Solutions aim to translate the supply side intelligence into simple visual concepts. This approach takes intelligence to the edge, thereby scaling decision making.

So, how do solutions differ from products, from meta-software, service as a software and the gibberish?

Fundamentally, a solution is meant to exist as a standalone atomic unit – with a singular purpose of making the lives of decision makers easy and simple. It is not created to scale creation of analytics.
For example, a solution created to detect anomalies in pharmacy billing will be designed to do just that. The design of this solution will not be affected by the efficiency motivation of applying it to a fraud detection problem as well. Because the design of a solution is driven by the needs of the individual dealing with the problem, it should not be driven by the motivation to scale the creation of analytics. Rather, it should be driven by the motivation to scale the consumption of analytics – to push all the power of machine learning and AI to the edge.

In Tredence, you have a partner who can execute the entire analytical value chain and deliver a solution at the end. No more running to the IT department with a deck or SAS/R/Python code, asking them to create a technology solution. Read more about our offerings here.

This blog is the first of the two-part series. The second part will be about spelling the S.O.L.U.T.I.O.N.

The post SOLUTIONS, WHAT’S NEW? appeared first on Tredence.

]]>
http://www.teerthexport.com/blog/solutions-whats-new/feed/ 1