If you could use data to:

    • grow revenue
    • reduce costs
    • boost efficiency

What impact would that have on your business?

An extra 16 million

tied up in working capital each month

Dear CEO, Founder, Business Owner,

Many businesses and organizations, just like yours, are sitting on a treasure trove of data.

But lack the expertise to unlock its full potential.

As a result, they miss out on opportunities to:

    • increase revenue,
    • reduce costs,
    • and improve overall efficiency.

Data often remains:

    • unused,
    • inaccessible,
    • or not business-ready.

Leading to:

    • lost earnings,
    • missed opportunities,
    • and sub-optimal decision-making.

Ignoring the power of your data means your business will continue to:

    • waste time and resources managing bad data,
    • miss opportunities for growth,
    • and make poor decisions due to incomplete or untrustworthy information.

This not only affects your current performance.

But also risks your future success.

Your competitors who harness the power of data will:

    • outperform you,
    • take your customers,
    • and run processes at lower costs with higher efficiency.

The window to leverage data effectively is closing rapidly.

And failing to act now can result in:

    • significant setbacks
    • and missed opportunities for innovation and growth.

A large company with a diversified customer base lacked the expertise to unlock the full potential of its accounts receivable (A/R) data.

Their A/R is a vital component of the working capital used for their day-to-day operations.

Capital essential for maintaining smooth business functions.

They wanted a better way to keep close tabs on their A/R data.

A way that easily made sense of this data.

They were advised to track their A/R data on a Process Behavior Chart, or control chart.

A tool for monitoring the stability of business processes over time. 

It’s designed to distinguish between:

    • common cause variation: the normal, small fluctuations in accounts receivable due to routine business activities (like customers paying slightly earlier or later than usual).
    • special cause variation: the unusual, significant fluctuations caused by specific events or external factors (like a major customer delaying payment due to unforeseen circumstances).

Here’s why it’s useful:

    • Guides improvement: Helps pinpoint where to make adjustments.
    • Tracks performance: It shows how a process performs over time.
    • Detects changes: Identifies when something unusual happens.
    • Easy to use: No data science team needed.
    • Easy to understand: No degree required.

The relevant data for one year are shown in Table 1.0 below, where the amounts are in millions of dollars.

Since the accounts receivable vary as the sales level varies, Table 1.0 also shows the total sales for the past three months.

Instead of tracking the accounts receivable in dollars, they tracked the accounts receivable as a percentage of the sales for the past three months.

This adjustment of one number by another is common with all sorts of business data.

The average for the accounts receivable percentages is 56.18 percent.

Using the monthly percentages to obtain the moving ranges (shown as the mR values in Table 1.0), the average of the 11 moving ranges is 1.69.

Using these values, the limits for a Process Behavior Chart were computed to be:

    • Upper Natural Process Limit (UNPL) = 56.18 + (2.66 × 1.69) = 60.68
    • Lower Natural Process Limit (LNPL) = 56.18 − (2.66 × 1.69) = 51.68
    • Upper Range Limit (URL) = 3.27 × 1.69 = 5.53
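The limit computation can be sketched in Python. This is a generic XmR-limit helper using the summary values quoted in the text (average 56.18, average moving range 1.69); it is an illustration, not the firm's actual tooling:

```python
def xmr_limits(average, mr_bar):
    """Compute Process Behavior Chart (XmR) limits from the series
    average and the average moving range (mR-bar).

    The scaling constants 2.66 and 3.27 are the standard XmR chart
    constants for individual values and two-point moving ranges."""
    unpl = average + 2.66 * mr_bar   # Upper Natural Process Limit
    lnpl = average - 2.66 * mr_bar   # Lower Natural Process Limit
    url = 3.27 * mr_bar              # Upper Range Limit
    return unpl, lnpl, url

# Summary values from Table 1.0: A/R as a percentage of three-month sales.
unpl, lnpl, url = xmr_limits(56.18, 1.69)
print(round(unpl, 2), round(lnpl, 2), round(url, 2))  # 60.68 51.68 5.53
```

Any monthly percentage falling outside the natural process limits is a signal worth investigating.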

By adopting this data-driven approach, they transformed their data management processes.

They not only unlocked the hidden value in their data.

But also leveraged it to:

    • drive innovation,
    • improve customer satisfaction,
    • and achieve substantial growth.

For instance, they started tracking accounts receivable as a percentage of sales for the past three months, adjusting their metrics for more accurate insights.

This adjustment helped them maintain consistent monitoring and better decision-making.

By implementing Process Behavior Charts, they computed control limits and maintained their accounts receivable percentages within the acceptable range, ensuring financial stability and operational efficiency.

Figure 1.0 Process Behavior Chart of Accounts Receivable Data

The Process Behavior Chart in Figure 1.0 makes three things immediately apparent:

    1. The accounts receivable have averaged about 56 percent of the three-month sales values.
    2. Allowing for routine variation, these monthly values could climb up to 60 percent or drop to 52 percent without signaling any change in the accounts receivable process.
    3. The December value of 63 percent is a signal of exceptional variation, indicating a change in the accounts receivable process, and it was a change for the worse.

To determine how a change from 56 percent to 63 percent amounts to an extra 16 million dollars being tied up in working capital each month, we can follow these steps:

 

    1. Average Monthly Sales: 2804.8 / 12 = 233.73 million dollars
    2. A/R at 56%: 233.73 × 0.56 = 130.89 million dollars
    3. A/R at 63%: 233.73 × 0.63 = 147.25 million dollars
    4. Difference: 147.25 − 130.89 = 16.36 million dollars
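The four steps above can be reproduced in a few lines of Python (the annual sales total of 2804.8 comes from the text; all amounts are in millions of dollars):

```python
annual_sales = 2804.8                  # total sales for the year, $M
avg_monthly_sales = annual_sales / 12  # ≈ 233.73

ar_at_56 = avg_monthly_sales * 0.56    # A/R at the historical 56% level
ar_at_63 = avg_monthly_sales * 0.63    # A/R at the December 63% level

extra_working_capital = ar_at_63 - ar_at_56
print(round(extra_working_capital, 2))  # 16.36 ($M tied up each month)
```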

Not only do you know these facts about the accounts receivable process, but you have the chart to use in communicating these facts to stakeholders.

A change from 56% to 63% amounts to an extra 16 million dollars being tied up in working capital each month: a fact worth communicating.

This methodological shift:

    • reduced errors,
    • highlighted trends and anomalies,
    • and provided actionable insights to maintain financial health.

Unlock the potential of your data with a free data strategy consultation.

Our expert team will:

    • help you identify key areas for improvement,
    • develop a tailored strategy,
    • and provide actionable insights to transform your data into a powerful asset that drives growth and success.

Don’t let your data remain a dormant asset.

Start transforming your business with the power of data.

Schedule your 1:1 data strategy consultation today!

And take the first step toward unlocking the full potential of your data.

Sincerely,

Lindsay Alston

continual improvement vs traditional cost reduction

we have significantly reduced errors and rework, leading to substantial cost savings.

Continual Improvement is different from “doing the same old thing” under a new banner.

Continual Improvement can have an impact on your bottom line.

Traditional methods of reducing costs often focus directly on cutting expenses.

This can undermine your future growth and profitability.

This short-term focus can result in:

     ❌ increased errors,
     ❌ rework,
     ❌ and internal competition.

Ultimately harming your organization’s efficiency and customer satisfaction.

Additionally, these methods fail to address underlying issues.

And limit your potential revenue growth while increasing operational risks.

Consider a business operations team facing escalating costs due to inefficiencies and errors in their processes.

Instead of simply cutting budgets, they implement Continual Improvement to analyze and address the root causes of these inefficiencies by:

     ✅ streamlining processes,
     ✅ eliminating errors,
     ✅ and reducing complexity.

They achieve significant cost savings while enhancing overall operational performance.

This shift:

     ✅ promotes collaboration between departments,
     ✅ improves communication,
     ✅ and fosters a culture of trust.

As a result, the organization not only reduces costs but also:

     ✅ boosts revenue growth,
     ✅ mitigates risks,
     ✅ and enhances customer satisfaction.

Making it more reliable and able to provide a higher-quality service.

“Since adopting Continual Improvement, our operations have become much more efficient. We have significantly reduced errors and rework, leading to substantial cost savings. The increased cooperation and trust among our teams have transformed our work environment. Moreover, we’ve seen a notable increase in customer satisfaction and a reduction in operational risks.” – (𝙮𝙤𝙪𝙧 𝙘𝙤𝙢𝙥𝙚𝙩𝙞𝙩𝙤𝙧𝙨)

Your testimony?

Discover how Continual Improvement can transform your organization.

We provide the methodology and techniques needed to:

     ✅ study processes,
     ✅ differentiate types of variation,
     ✅ and cultivate a new way of thinking.

This new way of thinking drives:

    ✅ sustainable cost reduction,
    ✅ revenue growth,
    ✅ and improved customer satisfaction.

Ready to see real results on your bottom line?

Navigate to my profile.

Click the ‘Book 1:1 C.I. Consultation’ link above or below to schedule a consultation.

Start your journey towards a more:

     ✅ efficient,
     ✅ cooperative,
     ✅ and customer-centric organization.

Sincerely,

Lindsay Alston

she needed a solution

to understand and control these variations


failing to identify and address variations in your sales process can result in significant consequences.

Many businesses struggle to understand and control the variations in their sales processes.

This lack of clarity leads to inefficiencies, unpredictable outcomes, and missed opportunities for improvement.

Without a reliable method to distinguish between routine and exceptional variations, sales managers cannot make informed decisions to enhance sales performance.

Failing to identify and address variations in your sales process can result in significant consequences, such as:

    • Inconsistent sales performance leading to revenue instability.
    • Inability to accurately forecast sales, resulting in poor planning and resource allocation.
    • Missed opportunities to capitalize on successful sales strategies or correct issues promptly.
    • Increased operational costs due to inefficiencies.
    • Overall reduced competitiveness in the market.

Meet Sarah, the Sales Director of a mid-sized retail company.

Sarah noticed that despite her team’s best efforts, their sales performance was inconsistent, leading to unpredictable revenue and planning challenges.

The variations in their sales data were causing difficulties in forecasting and resource allocation.

Sarah knew she needed a solution to understand and control these variations but wasn’t sure where to start.

That’s when Sarah discovered process behavior charts.

By implementing these charts, her team could visualize sales behavior clearly, distinguishing between routine and exceptional variations.

This newfound clarity allowed them to identify specific issues causing sales inconsistencies.

They could address problems swiftly and accurately, leading to a significant improvement in sales stability and forecasting accuracy.

The impact was immediate.

Sales performance became more predictable, resource allocation improved, and the team could capitalize on successful sales strategies while addressing issues promptly.

Sarah’s company gained a competitive edge in the market, and her team was empowered with the insights needed to drive continual improvement.

“Implementing process behavior charts transformed our sales management. We can now identify and address variations swiftly, leading to consistent sales performance and better forecasting.” – Jane D., Sales Manager

“The clarity provided by process behavior charts has been a game-changer for us. We’ve improved our sales efficiency and can confidently predict future performance.” – Mark S., Sales Director

Visualize and control your sales with process behavior charts.

Our 1:1 data strategy consultation will help you implement process behavior charts effectively, enabling you to distinguish between routine and exceptional variations and drive continual improvement in your sales performance.

Are you ready to take control of your sales variations and enhance your business performance?

Sincerely,

Lindsay Alston

what is a process behavior chart?

The crystal ball for data analysis

A process behavior chart, or control chart, is a tool for monitoring the stability of processes over time.

It’s designed to distinguish between common cause variation (inherent to the process) and special cause variation (due to external factors).

Here’s why it’s useful:

    • Tracks performance: It shows how a process performs over time.
    • Detects changes: Identifies when something unusual happens.
    • Guides improvement: Helps pinpoint where to make adjustments.

Imagine this:

You’re in sales.
You use a process behavior chart to plot data points.
The chart highlights any out-of-control points, signaling a need for investigation.
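A minimal sketch of that workflow in Python; the limits and monthly figures below are invented for illustration, not real sales data:

```python
def out_of_control(values, lnpl, unpl):
    """Return (index, value) pairs falling outside the natural
    process limits -- the points that warrant investigation."""
    return [(i, v) for i, v in enumerate(values) if v < lnpl or v > unpl]

# Hypothetical monthly sales figures with one exceptional point.
monthly_sales = [101, 98, 104, 99, 102, 131, 100]
signals = out_of_control(monthly_sales, lnpl=90, unpl=110)
print(signals)  # [(5, 131)]
```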

Ready to improve your process?

Start charting and see the difference.

Have you tried using a process behavior chart?

Sincerely,

Lindsay Alston

Process control chart from the qcc package in R. Note the red dot in the top right corner of the chart; that's a signal.

a tale of two approaches

Data distribution not known? Don’t worry…

What is “Normally Distributed Data”?

Normally distributed data forms a bell-shaped curve when plotted on a graph.

Most values cluster around the average, with fewer values appearing as you move further from the average in either direction.

The Purpose of Data Analysis

Data analysis helps filter out random noise to detect meaningful patterns or signals, similar to hearing a specific tune in a noisy room.

The Myth of Needing “Normally Distributed Data”

There’s a belief that for process behavior charts (tools to monitor process changes over time) to work, the data must be normally distributed.

This belief dates back to 1935 when E. S. Pearson misunderstood Walter Shewhart’s method of filtering noise.

Pearson’s Statistical Approach

Pearson’s method to filter noise involves:

    1. Choose a Proportion (P): Decide how much noise to filter out, commonly 95% (P=0.95) or 99% (P=0.99).
    2. Identify a Test Statistic (Y): Create a function based on the data.
    3. Find the Probability Model (f(y)): Determine how Y behaves under certain conditions.
    4. Determine Critical Values (A and B): Use the curve’s area to find the points that correspond to P. Any Y value outside this range is considered a signal.
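As an illustration of these four steps, assume a standard normal probability model for Y (choosing this model is exactly the assumption Pearson's method hinges on). Python's statistics module can then supply the critical values for P = 0.95:

```python
from statistics import NormalDist

P = 0.95                      # proportion of noise to filter out
tail = (1 - P) / 2            # split the remaining 5% across both tails

model = NormalDist()          # assumed probability model f(y): standard normal
A = model.inv_cdf(tail)       # lower critical value
B = model.inv_cdf(1 - tail)   # upper critical value

print(round(A, 2), round(B, 2))  # -1.96 1.96; any Y outside [A, B] is a signal
```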

This method works well if we know the correct probability model for our data.

However, we rarely have enough data to accurately determine this model.

Shewhart’s Alternative Approach

Shewhart tackled the problem differently:

    1. Fixed Critical Values: Instead of relying on specific probability models, Shewhart used fixed, generic critical values.
    2. [Average ± 3 Sigma]: He chose symmetric limits around the average, extending three standard deviations (sigma) in both directions.

This method is effective regardless of the data’s distribution and ensures that almost all data points (close to 100%) fall within these limits.
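A quick simulation illustrates the point. The exponential data below are invented for illustration, and the global standard deviation stands in for sigma (Shewhart's charts actually estimate sigma from moving ranges, which is more robust):

```python
import random
from statistics import mean, stdev

random.seed(42)  # reproducible illustration

# Heavily skewed (exponential) data -- nothing like a bell curve.
data = [random.expovariate(1.0) for _ in range(1000)]

avg = mean(data)
sigma = stdev(data)
lower, upper = avg - 3 * sigma, avg + 3 * sigma

coverage = sum(lower <= x <= upper for x in data) / len(data)
print(f"{coverage:.1%} of points fall within [average ± 3 sigma]")
```

Even with this skewed distribution, nearly all points land inside the three-sigma limits, which is why the fixed critical values work without knowing the probability model.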

Why Shewhart’s Method Works

Shewhart’s approach avoids the need to determine the exact probability model, making it simpler and more robust for practical use.

By using generic critical values, his method can detect signals effectively without being constrained by the data’s distribution.

While Pearson’s statistical approach is useful when the correct probability model is known, Shewhart’s method provides a practical and flexible alternative.

It simplifies the process of filtering out noise and detecting signals in data, making it highly valuable for real-world applications.

source: here

Sincerely,

Lindsay Alston