Intelligent Metrix

Data to Metrics to Insight to Intelligent Decisions

Ensuring quality data from service providers

For those of us who have lived, eaten, and slept with data quality and data management, it is hard to fathom that there are still pockets of organizations that have yet to define a solid foundation of data quality and data management best practices.  It is harder still to step (leap) back into the roots of where data quality and data management issues all began.  Well, let me tell you, those pockets of organizations are alive and well in the most unlikely of places – the companies that provide the data.

To be fair, there are some amazing companies out there that provide information and data we use to improve and enhance our own data, or that we take and analyze independently.  They may not be perfect (no one is!), but they have defined themselves as organizations serving up “better quality data” and stand by it with best practices of their own.  Yet while enterprise organizations and even mid-sized companies have jumped on the bandwagon and adopted sophisticated processes, solutions, and people dedicated to better information, a significant number of service providers still lack the skills, tools, and practices that would ensure reliable information to measure our performance, understand our market, and take advantage of new opportunities.

At the end of the day, the data and information we source needs to be reliable.  It is important to guard yourself both when contracting with service providers and when you receive data.  Simply trusting that the data is of high quality when it arrives is not good enough.  You need to be vigilant when sourcing providers, and you need to clearly define how you will verify that what you received is what you paid for.  Here are some things to consider and ask when working with data providers (a sketch of a receipt-side audit follows the list):

  • How do they collect their information?
  • How do they verify that the information is valid?  What processes, sources, and analysis are used?
  • Are they providing data to other customers for the same purpose you need it for?  How many, or what portion?
  • What is their repeat business rate?  Who are their top customers?
  • For what purposes are their customers using the data?
  • What do they do to verify and validate your data before providing it to you?
  • What do they do to verify that the data they are providing is complete?
  • What guarantees do they, or will they, provide that the data meets your specifications and quality standards?
  • What is required on your end to validate that the data is accurate and reliable?
  • If you are purchasing tracking data (real-time/periodic feeds), what initial and ongoing testing processes are used to verify proper data transfers?
  • What is required on your end to ensure the data transfer is working, initially and on an ongoing basis?

What have you done to ensure data from service providers is what you want?

Filed under: Data Quality

Archiving Strategy: Data Relevance

We often think of the relevance of data when we want to include or exclude it from analysis or a process.  But are you also thinking about relevance as part of your data quality effort?

Just as you focus data quality efforts on cleansing existing information, there are invariably records that can’t be cleansed or enhanced.  They have no value in either business analytics or business process.  They are noise, similar to the noise you get from bad data.  Keeping and maintaining them in your database can compromise your ability to accurately analyze information, continue to deflate confidence in the data, and, if they make up a significant percentage of your database, cause performance problems and added maintenance.  Developing an archival strategy is a significant component of your data quality practice that should not be overlooked.

Benefits of Data Relevance

  • Trust in data
  • Enablement of process
  • Accuracy of analysis
  • Support for decisions
  • Database optimization

It can be tempting to simply delete records from your databases.  However, this can have a detrimental effect because of data dependencies within your databases, and it can cause non-compliance in regulated environments.  Instead, it is best to formulate a strategy that flags non-relevant data, removing or suppressing it from user interfaces and analytics.
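Here is a minimal Python sketch of that flag-and-suppress approach.  The record layout and field names are assumptions for illustration; the point is that archived rows are filtered out of views and analytics, never deleted.

```python
from datetime import datetime, timezone

def archive_record(record, reason):
    """Flag a record as archived rather than deleting it, preserving data
    dependencies and the compliance trail. Field names are illustrative."""
    record["archived"] = True
    record["archive_reason"] = reason
    record["archived_at"] = datetime.now(timezone.utc).isoformat()
    return record

def active_records(records):
    """User interfaces and analytics read through this filter, so archived
    rows are suppressed without ever leaving the database."""
    return [r for r in records if not r.get("archived")]
```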

Components of Archiving Strategy

  • Data decay rates – Attributes of records lose relevance over time.  This component is a good guide to the frequency at which you focus cleansing efforts.  It also indicates when data is approaching the horizon at which a record will lose its relevance.  The age of the data and the activity related to a record, even if the record is complete, can signify whether the data is still relevant or open to archiving.
  • Minimum requirements of record viability – Records should be continually assessed to determine whether they meet the minimum standards of use.  Failure to meet minimum requirements is a leading indicator that the record is a candidate for archiving.  (A sketch combining this component with decay rates follows the list.)
  • Relevance of record to analysis, process, and decisions – If a record is not going to be used in analysis, process, or decision making, there is no need to keep it in use.  This may be the case when processes have been optimized and certain information is no longer needed.  Or the record may already be a candidate for archiving due to decay rates and minimum data requirements.  Additionally, relevance may be determined when integrating systems, where old records with old transaction history are not relevant to the existing or new business.
  • Regulatory compliance – In highly regulated environments like health care, there are standards on what you can and cannot remove.  Records may not be useful in existing processes, analysis, and decision making, but might be required for certification or other compliance-related activities.  Archiving ensures that information is not deleted from primary systems, though you may have to provide a mechanism that gives adequate access to the data for compliance.
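Here is a minimal sketch of how the first two components might be applied together.  The two-year horizon and minimum field set are illustrative assumptions; your own decay rates and viability standards will differ by attribute and by business.

```python
from datetime import datetime, timedelta, timezone

# Illustrative thresholds -- tune these to your own decay rates and
# minimum standards of use.
RELEVANCE_HORIZON = timedelta(days=730)      # assumed two-year decay horizon
MINIMUM_VIABLE_FIELDS = ("company", "contact")

def is_archive_candidate(record, now=None):
    """Apply the first two components: decay rate and minimum viability.
    Regulatory checks belong in a separate, explicit review step."""
    now = now or datetime.now(timezone.utc)
    stale = (now - record["last_activity"]) > RELEVANCE_HORIZON
    unviable = any(not record.get(f) for f in MINIMUM_VIABLE_FIELDS)
    return stale or unviable
```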

An archiving strategy is a critical component of data quality best practices.  It will continually help you focus on improving and refining your data quality projects, and it will push you to think strategically about how you use and manage your data on a daily basis.  Establish an archiving strategy at the forefront of your data quality initiatives and you will start your efforts off on the right foot.

Filed under: Uncategorized

Stuck in First Gear

Big investments have been made in IT in recent years.  IBM, Oracle/Siebel, and SAP lead the market, and they have been successful not only with multi-national enterprise companies but also with mid-sized companies.  A lot of companies out there have purchased application and data management/data warehouse solutions only to find themselves using a fraction of what those solutions can do.  It’s like driving a Porsche in first gear.

There are some fundamental reasons for this, beyond the feeling that a sales executive sold them a bill of goods.  IT will blame the business for not knowing what it wants.  The business will blame IT for not getting it.  It doesn’t really matter; there is plenty of blame to go around.  What matters is that you now have a solution that isn’t giving you the benefits it really could, and should, be giving.

Maybe I’m a bit biased since I’m the data chick.  Well, more than a bit.  Regardless, I think that from a data management perspective, companies are failing.  The maniacal focus on process efficiency has drowned out the fact that process runs on data and feeds data.  This focus has kept data in the back seat too long, and now, when we need it to better understand our customers and our business and to make decisions, it is sorely lacking.  Our data lacks unity, structure, definition, and most of all purpose.  Companies simply cannot leverage their information except at very basic levels.  When things are good, this may be okay.  When things are bad, this is a real problem.

What makes this even sadder is that companies are looking to spend more money on applications and data infrastructure to ‘fix’ the problem.  The promise of a new model with more sophisticated bells and whistles that will solve anything you throw at it is just marketing.  Until you understand and control what you already have under your hood, getting something bigger, better, and shinier isn’t going to help any more than what you have now.  So there was no ROI on existing purchases, and there won’t be any ROI on new purchases either.

There are two things companies need to do to make the investments in enterprise solutions worthwhile:

  • Clean up the back-end data management practice so that it is fluid with business process and application usage.
  • Have a clear data management strategy for new applications that is fluid and scalable outside of application databases.

Your company may already be embarking on SOA or MDM projects.  But have you looked at how these new practices will support applications beyond changing the oil?  Can the data drive the process?

Today, applications are bogged down because data is treated as something to put in the trunk and hoard.  Until data is thought of as fuel, your IT investments will stay in first gear and never get to sixth.  Now how fun is that?

Filed under: Uncategorized

How to Measure the Business Impact of Data Quality

So, you want to invest in data quality, but you need to prove ROI before you get the resources.  Intuitively, you know that data quality is impacting your business.  The test is how to measure that impact to make your case.

Many businesses focus on data elements that are easy to see and understand, like company and contact information.  However, as obvious as some of these elements may be, they don’t always deliver the biggest bang for the buck.  Data elements have priority levels within processes depending on the desired business outcome.  In addition, data elements have dependencies beyond how the information comes into the system.  You need to take this into account as you conduct your business analysis and map your data across your business processes.

During business analysis, it pays to establish a foundation that validates recommendations and shows ROI through case studies.  You can do this through data analysis and pilot programs.  Data analysis can be applied through metadata segmentation within processes, where you look at the existing state of the data.  You can also improve portions of the data and then perform the segmentation and analysis again.

These steps will prepare your case but will also help establish dashboards to allocate resources for future projects.

1) Identify the processes you think are most impacted by poor data quality.  The processes should be tied to key business functions.  For instance, in marketing you may want to look at lead qualification and management.  Processes that are well defined and have a tangible link to business objectives work best, as they are most likely mature and revenue has been tied to them.
2) Pinpoint the smoking guns in those processes.  There are bound to be several points in a process that are key indicators of success where data quality has negatively impacted the outcome.  Your business analysis will, or should, show this clearly.  These smoking guns should be called out explicitly in the processes.  What you need to determine is which data elements have the most impact and can be readily focused on or addressed.
3) Select data quality issues that let you segment the process into influence tracks.  This step is critical to measurement.  You need to dissect the process to create scenarios of what good vs. bad looks like in process outcomes.  In the lead management process suggested earlier, it could be the point where you qualify a lead to move into the sales pipeline.
4) Measure performance with good quality vs. poor quality data.  At this stage you should be able to run an analysis that shows the difference in process outcomes and performance when you run scenarios with good quality data and poor quality data.  (A minimal sketch of this comparison follows.)
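Here is a minimal sketch of that step 4 comparison.  It assumes you can tag each lead with whether its key fields were complete at qualification time and whether it converted; the field names and sample data are illustrative, not from any real pipeline.

```python
# Illustrative sample: each lead notes whether its key fields were complete
# at qualification time and whether it converted.
leads = [
    {"has_contact": True,  "has_company": True,  "converted": True},
    {"has_contact": True,  "has_company": False, "converted": False},
    {"has_contact": False, "has_company": False, "converted": False},
    {"has_contact": True,  "has_company": True,  "converted": True},
]

def conversion_rate(group):
    return sum(1 for l in group if l["converted"]) / len(group) if group else 0.0

def quality_impact(leads, is_good):
    """Compare process outcomes for good-data vs. poor-data scenarios."""
    good = [l for l in leads if is_good(l)]
    poor = [l for l in leads if not is_good(l)]
    return {
        "good_data_conversion": conversion_rate(good),
        "poor_data_conversion": conversion_rate(poor),
        "lift": conversion_rate(good) - conversion_rate(poor),
    }

# "Good" here means contact and company fields were populated at qualification.
print(quality_impact(leads, lambda l: l["has_contact"] and l["has_company"]))
```

The same difference-in-outcomes number becomes the dashboard metric described below: rerun it as cleansing work proceeds and the lift quantifies the return.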

The real benefit is that at this stage you’ve provided the dashboard to measure improvements to the business.  Rather than wait until the data quality projects are completed, this provides the foundation for predicting where you will get the most impact from your investments. Instead of focusing solely on metrics that measure the completeness, accuracy, and uniqueness of records, you can focus on how these metrics within processes influence business outcomes.  Now you have a case for linking data quality with ROI.

Filed under: Data Quality

Starting Your Business: Data From the Ground Up

It is easy when starting up a business to think about selling first and marketing and database management later.  After all, revenue is the most important thing to focus on.  But once you get over the hump and begin to find your groove, you realize that data is important.  Now you have to sort through it, and it feels worse than diving into a daily inbox of 300 emails.  Well, if you have a method for dealing with your email inbox, create one for managing customer and contact data.

Here are some simple things you can do up front to stay organized and be better prepared when you are ready to look at and manage your customers and the business in depth.

  • Be consistent about how you collect customer data – There are usually several layers to the importance of customer information elements, depending on your relationship.  What you want to do is determine the information that is most critical and collect it consistently across all methods.  Keep in mind that what is mandatory for a transaction may be different from what you need to follow up with customers after a purchase.  So make sure you take this into account at the point in time you collect the information.  It is harder and more costly to collect after the fact.
  • Save data elements into dedicated fields – The biggest issue I find with new and small businesses when they need to convert to more robust systems is that data elements are merged together into a single Excel cell.  When collecting contact names, break the first and last name into separate fields.  Do the same for addresses, with fields for street address, city, state, country, and postal code.  (See the sketch after this list.)
  • Determine which platform has the master data – The second biggest issue when migrating customers to a robust system is the inability to determine which of several duplicate entries is the most valid record.  If you are saving contact and company information between your mobile phone, laptop, website, and company server, which will you consider the single source of record?  Once you determine this, make sure you sync your lists to that source.  I recommend you do this weekly at the least and use your primary server.  Then, include the database in a weekly back-up process.
  • Save, Save, Save – You may have caught this recommendation in the previous bullet.  Backing up is critical.  It is mandatory.  I’ve watched small businesses lose business-critical information because they didn’t back up, or didn’t back up often enough.  There are easy services today that make backing up your information simple.  At the very least, invest in a USB storage device and plug it in daily when you sit down to work.  Before you do anything, back up.  Make it a habit.
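Here is a minimal Python sketch of the dedicated-fields advice.  Real names are messier than any simple split (middle names, suffixes, multi-word surnames), so treat this as a starting point rather than a parser; the contact shown is hypothetical.

```python
def split_name(full_name):
    """Naive split: the last token becomes the last name. Real names are
    messier, so review edge cases before relying on this."""
    parts = full_name.strip().split()
    if len(parts) < 2:
        return {"first_name": full_name.strip(), "last_name": ""}
    return {"first_name": " ".join(parts[:-1]), "last_name": parts[-1]}

# Store each element in its own dedicated field from day one.
contact = {
    **split_name("Pat Q. Example"),          # hypothetical contact
    "street": "123 Main St",
    "city": "Portland",
    "state": "OR",
    "country": "USA",
    "postal_code": "97201",
}
print(contact)
```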

Managing your customer and company information does not have to be difficult or cumbersome.  With a little forethought, when your business gets off the ground and you are ready to invest in better platforms and reporting, you will have a great foundation to do so.

Filed under: business intelligence, CRM, Data Quality