Charting the Course from Five-Star Data to “Data as Value”

Andrew Karpie | March 29, 2022

In today’s fast-moving times and rapidly changing labor markets, data is foundational to helping organizations manage and optimize their integrated, external workforces. Data can be the fuel that boosts the performance of contingent workforce programs to higher (and eventually superior) levels.

What can be accomplished with data depends on what the data consists of and its integrity, management and transformation into information and intelligence that drives value-in-use. But organizations and business users don’t just want data outputs – they want what could be called “data as value.” (Related reading: “Leveraging Five-Star Data.”)

Data as Value

For organizations, “data as value” refers to the use of data to achieve valuable contingent workforce management outcomes, such as:

  • Lower costs
  • Improved compliance
  • Better talent fill rates, quality and time to fill
  • Increased diversity and inclusion
  • And much more

However, “data as value” also means accomplishing this without heavy lifting, specialized expertise or significant investment – while being assured there’s high integrity in the data and in the processes that ultimately transform data into domain- and use-case-specific information, intelligence, insights, decisions and actions.

Five Crucial Dimensions of Data 

Magnit has identified five crucial dimensions of data – guiding stars, if you will – that it considers fundamental to achieving “data as value” in an external workforce management setting. These properties are:

  1. Relevant
  2. Accurate
  3. Timely
  4. Comprehensive
  5. Unbiased

Let’s look at how “data as value” is realized through a focus on these five principles and based on substantial technical and domain expertise, advanced technology and disciplined operations.

Data relevance is a foundation of “data as value” in the form of information/intelligence that leads an organization to substantially better outcomes compared to the status quo (e.g., a significant decrease in time to fill and sourcing cost). Data relevance is the lodestar for seeing what data is needed, how it is processed and organized, and for achieving “data as value.”

Data relevance doesn’t just happen, and it’s certainly not developed in a vacuum. It’s the product of careful, customer-centric needs analysis and discernment. Data relevance is both use case- and role/individual-dependent. It therefore requires communications with end-users and heavy involvement of domain experts, information-solution architects and UI/UX designers.

What constitutes relevant data is linked to business requirements and specific end-user needs in the context of an evolving contingent workforce management industry. For organizations, relevant data and “data as value” are not only outputs of well-designed and engineered data systems, but something that’s co-created with specific client organizations and stakeholders.

Accuracy may seem like a simple concept, but in the world of data management it’s more nuanced. At the ground level, there are basic error checking and data validation routines needed to cleanse data. But it gets more complicated.

For example, a data point stored in a database may represent a “specific observation” (e.g., a single instance of pay rate for an entry-level QA tester in a certain metro area), or it may be a “parameter estimate” (e.g., an average of a sample distribution drawn from a vast population of pay rates). In general, data points and data series may be generated by formulas based on other data points. The “point” here is that defining and achieving accuracy is nuanced and complex.
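The distinction above can be made concrete with a small sketch. The numbers and field names here are hypothetical, purely for illustration: the same “pay rate” value in a database might be one worker’s observed rate, or a statistic computed from many observations.

```python
# Illustrative sketch (hypothetical numbers): the same "pay rate" field can
# hold a single observation or a statistic estimated from many observations.
import statistics

# A "specific observation": one observed pay rate for one entry-level QA tester.
observed_rate = 38.50

# A sample of observed rates drawn from a larger population of pay rates.
sample = [35.0, 38.5, 41.0, 37.25, 44.0]

# A "parameter estimate": the sample mean, which estimates the population average.
estimated_mean = statistics.mean(sample)

# How much confidence the estimate deserves depends on the sample's size and
# spread -- information a single stored number does not carry on its own.
spread = statistics.stdev(sample)
```

Treating an estimate as if it were an observation (or vice versa) is one way “accurate” data can still mislead, which is why accuracy has to be defined per data point, not just per database.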

Ensuring accuracy requires well-designed and consistently executed procedures to error check, cross check, and consistency check data across a data set and its transformations over time. It also requires optimal data architecture design and management. While it’s another fundamental condition for “data as value,” achieving and ensuring it is challenging and requires significant know-how and resources.
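As a minimal sketch of the error-checking and cross-checking described above, consider validating pay-rate records before they enter a data set. The record fields and rules here are assumptions for illustration, not Magnit’s actual schema:

```python
# Hypothetical sketch: basic range checks and a cross-field consistency check
# on pay-rate records, one small ingredient of an accuracy-assurance pipeline.
from dataclasses import dataclass

@dataclass
class PayRateRecord:
    role: str
    metro: str
    bill_rate: float   # hourly rate the client pays
    pay_rate: float    # hourly rate the worker receives

def validate(record: PayRateRecord) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    if record.pay_rate <= 0:
        errors.append("pay_rate must be positive")
    if record.bill_rate <= 0:
        errors.append("bill_rate must be positive")
    # Cross check: a bill rate below the pay rate signals a data problem.
    if record.bill_rate < record.pay_rate:
        errors.append("bill_rate is below pay_rate")
    if not record.role or not record.metro:
        errors.append("role and metro are required")
    return errors
```

Real pipelines layer many such checks (referential, temporal, statistical) and re-run them as data is transformed, but the pattern – explicit, repeatable rules that every record must pass – is the same.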

It’s often said that timing is everything, and nowhere is this more the case than in the contingent labor market, where significant changes occur in short time frames. For example, information can change many times over the lifecycle of finding, evaluating, offering and onboarding a talented candidate. Bearing this in mind, data timeliness can be broken down into data “currency” (how up to date it is) and “availability” (how quickly it can be accessed).

Current data reflects the most recent observations available (even from a moment ago) and incorporates corrections to historical data. Availability refers to how quickly information can be accessed (ranging from running a report or query to real-time, instantaneous access). Significant power and value can be achieved when past data is transformed through machine learning, or when very recent data is used to guide decisions and actions.
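The “currency” side of timeliness can be sketched as a staleness check: flag any data point whose timestamp falls outside an agreed refresh window. The 30-day threshold and field names below are assumptions for illustration only:

```python
# Minimal sketch of a data-currency check: is this data point still within
# its staleness window? The 30-day policy is a hypothetical assumption.
from datetime import datetime, timedelta, timezone
from typing import Optional

STALE_AFTER = timedelta(days=30)  # assumed refresh policy for market-rate data

def is_current(observed_at: datetime, now: Optional[datetime] = None) -> bool:
    """True if the data point was observed within the staleness window."""
    now = now or datetime.now(timezone.utc)
    return (now - observed_at) <= STALE_AFTER
```

A real pipeline would vary the window by data type (pay rates vs. compliance records, say) and trigger refreshes automatically, but the underlying question – “is this observation still current enough to act on?” – is the one the check encodes.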

Ensuring maximum timeliness, like accuracy, requires sophisticated design and engineering of data updates and dataflows. It also means establishing the right technology (e.g., DaaS, AI, data pipelines) for transforming data into intelligence that can be accessed and used at the optimal time in a given context. Data timeliness is more important than ever.

Comprehensiveness can be defined in terms of the optimal set of data needed to address the requirements within a particular domain or use case (e.g., domain of contingent workforce management, use case of direct sourcing, etc.). It’s not simply a question of scope and scale; it’s also a matter of design and curation (identifying the data that aligns to the domain or use case’s requirements).

Enterprise technology solutions don’t typically generate the complete, optimal set of data to maximize program or solution value for client organizations/end-users. Accordingly, third-party data sources are a necessary part of the equation. Currently, the sourcing of the required data means partnering and integrating with any number of data suppliers.

The external sources must be evaluated, selected and integrated to achieve the necessary level of comprehensiveness. Furthermore, achieving data comprehensiveness requires the right technology capabilities for integration and data processing. It also requires significant domain, design and technical expertise to make it happen.

Bias is a warping of information that occurs between the point of data generation and the point-of-use, resulting in misguided judgments/conclusions and potentially adverse impacts. Bias is different from inaccuracy in that it’s about how certain sets of assumptions, values or prejudices determine how data is selected, processed and presented for use.

Unbiased data requires the detection and avoidance/mitigation of these influences from the overall data management and use life cycle. This may mean holding back or masking certain information that would introduce bias into the process. It may also mean pushing more relevant information into the user’s primary view, while decreasing the emphasis of bias-creating information that offers little, if any, business value.
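The masking step described above can be sketched as a simple transformation applied before a record reaches a reviewer’s primary view. The field names are hypothetical; which fields are actually bias-prone depends on the use case and applicable regulations:

```python
# Hypothetical sketch of bias masking: strip fields that could introduce bias
# before a candidate record is presented for review. Field names are assumed.
BIAS_PRONE_FIELDS = {"name", "age", "gender", "photo_url"}

def mask_for_review(candidate: dict) -> dict:
    """Return a copy of the candidate record without bias-prone fields."""
    return {k: v for k, v in candidate.items() if k not in BIAS_PRONE_FIELDS}
```

In practice this sits alongside the opposite move the paragraph mentions: surfacing the most decision-relevant fields (skills, experience, availability) more prominently, so that what the reviewer sees first is what actually predicts fit.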

In today’s world, badly managed AI has been known to introduce biases into processes such as talent acquisition. But it has also been clearly demonstrated that AI can detect and root out biases. Making unbiased data one of the stars of five-star data therefore means adopting practices, tools and human oversight/involvement to stop bias in its tracks.

Start the Journey: Data as Value

The emerging, but fundamental, role of data in managing the constantly-in-motion contingent workforce is entirely different – in fact, worlds apart – from that of the past (which has consisted of operational reports, queries and dashboards). Starting now and increasingly going forward, organizations will need to leverage data at a massive scale and derive value from it in thousands of different ways to rise above their competitors in managing their integrated, external workforces.

Organizations will derive applied business value through insights, decisions and processes that are founded on data and technology, methodologies and expertise that transform data into relevant information and powerful intelligence – “data as value.”

Achieving “data as value” is an imperative for organizations that seek to outperform their competitors in terms of maximizing the contributions of their externally supplied workers and services. But this cannot be accomplished by most organizations on their own. Instead, it will require a provider with a platform that can pull together data and a broad range of resources and capabilities, achieve economies of scale, and provide client organizations what they need and want – not just data, but value.

For more on reaping the benefits of top-quality data, check out Magnit’s “Leveraging Five-Star Data” white paper.

If you’re interested in learning more about how Magnit is helping organizations implement winning contingent workforce programs globally, please contact a Magnit representative.

Disclaimer: The content in this blog post is for informational purposes only and should not be construed as specific legal advice or as a substitute for legal advice. The blog post reflects the opinion of Magnit and is not to be construed as legal solutions or positions. Contact an attorney for advice and guidance on specific issues or questions.
