NJIADATA Methodology

The Dashboard in 6 Weeks: From Excel to Power BI Without Losing the CEO Along the Way

The consulting industry sells 12-month BI projects. We deliver the first operational dashboard in 6 weeks. Here's exactly how — and why it changes everything.


NJIADATA

Microsoft Data & BI Experts · Paris · Abidjan

14 min read · February 28, 2026

The problem no one talks about

Here is a scenario we have witnessed dozens of times across 20 years of consulting — in Europe first, then in Africa. A CEO signs off on a Business Intelligence project. The budget is approved. A consulting firm is hired. And then the project enters the machine.

- 9–14 months of project, on average
- 78 pages of functional specifications
- 12 dashboards delivered
- 2 dashboards actually used after 6 months

These figures are orders of magnitude, not published statistics. They reflect what we have consistently observed across dozens of BI projects at European telecom operators, financial institutions, and industrial groups. Gartner and Forrester studies confirm the broad trend: between 60% and 85% of BI projects fail to deliver expected business value, depending on the period and measurement methodology. What matters here is not the exact number — it's the pattern. And the pattern repeats with unsettling regularity: the CEO sees the first dashboard at month 9 of the project. By then, their priorities have shifted. The data has evolved. The consultant specified a system that answers January's questions, but it's October.

The CEO looks at the dashboard, says "interesting," and the next morning still calls the CFO to get the number they need. By phone. Or by WhatsApp.

The project is officially "successful" — delivered on time, on budget, meeting specifications. But functionally, it's stillborn. And no one has an incentive to say so: not the firm that billed for it, not the CIO who championed it, not the CEO who signed the budget.

A dashboard nobody uses is not a technical failure. It's a methodological failure. The technology works. It's the path between the technology and the decision that's broken.

Why 12 months? Because it's profitable — for the consultant

The duration of a BI project is not driven by technical complexity. Power BI can connect to a database and produce a working dashboard in a few hours. The duration is driven by the consulting firm's business model: the longer the project, the more days it bills. A 2-month scoping phase is not technically necessary — but it justifies 40 person-days of senior consultant time at €1,200 per day.

We're not saying scoping is useless. We're saying the traditional approach — interviewing 15 departments, running specification workshops, producing an 80-page document signed off by a committee — is a process designed to minimise the consultant's risk, not to maximise value for the client. The consultant hides behind signed specifications. If the dashboard doesn't match real needs, it's not their fault — they followed the specs.

Our breakthrough: the 6-week sprint

After 20 years watching these failures — and having been complicit in them in our previous lives at major firms — we designed a methodology that flips the model. The first operational dashboard is delivered in 6 weeks. Not a prototype. Not a mockup. A production dashboard, fed by real data, used by real people to make real decisions.

Here is the exact playbook.

Week 0 · Before the project

The decision-maker interview

Before any technical assessment, we sit down with the final decision-maker — the CEO, the Minister, the Secretary General — and ask a single question: "What is the decision you make most often with the least visibility?" Not "what KPIs do you want." The decision. Because a dashboard only has value if it changes a decision. If the CEO says "I never know whether my budget lines are on track before the end of the quarter," we know exactly what to build first.

Week 1 · 5 days · Free diagnostic

Data mapping — 5 days flat

We don't produce a 40-page audit. We map three things: where the data lives today (Excel, ERP, Access databases, paper), what its real quality is (we open the files, count blank cells, duplicates, inconsistent formats), and what the shortest path is between that data and the CEO's first decision. The deliverable: a clickable mockup of the future dashboard, built on real data.
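As a concrete illustration of the kind of quick quality scan we mean, here is a minimal Python sketch using pandas. The function name and the sample data are ours, not a client artifact; the point is simply that blanks, duplicates, and mixed formats can be counted mechanically before anyone writes a specification.

```python
import pandas as pd

def quality_scan(df: pd.DataFrame) -> dict:
    """Five-minute quality scan: blank cells, duplicate rows, format drift."""
    report = {
        "rows": len(df),
        "blank_cells": int(df.isna().sum().sum()),
        "duplicate_rows": int(df.duplicated().sum()),
        "mixed_type_columns": [],
    }
    # Format drift: an object column mixing numbers and text is a red flag
    # (typically hand-typed values like "1 300" next to real numbers).
    for col in df.select_dtypes(include="object"):
        n_types = df[col].dropna().map(lambda v: type(v).__name__).nunique()
        if n_types > 1:
            report["mixed_type_columns"].append(col)
    return report

# Tiny in-memory frame standing in for a real Excel export
df = pd.DataFrame({
    "budget_line": ["A1", "A1", "B2", None],
    "amount": [1200, 1200, "1 300", 900],  # note the hand-typed string
})
print(quality_scan(df))
```

On this toy frame the scan reports one blank cell, one duplicate row, and flags `amount` as a mixed-type column, which is exactly the triage information the 5-day mapping needs.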

A real example: at a European telecom operator, the CIO was convinced the main problem was the absence of a data warehouse. Week 1 revealed something else. The sales director received a 14-tab Excel file every Monday, consolidated by hand by an analyst. The file contained everything he needed — but it arrived Tuesday evening, 24 hours too late for Monday morning's meeting. The problem wasn't missing data, it was timing. The first dashboard automated exactly that file, refreshing at 7am on Monday. No data warehouse. No star schema. Just the right number, at the right time, for the right person.

Weeks 2–3 · 10 days · Data pipeline

From raw source to reliable model

We build the flow: source → cleansing → model → dashboard. Using Microsoft Fabric or Power BI directly, depending on complexity. The rule: if the dashboard figure is within 5% of the Excel figure the CFO produces manually, we move forward. This reconciliation test is the key: it's what builds the CEO's trust in the system. Until the CEO trusts the numbers, the dashboard is furniture.
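The reconciliation gate itself is trivial to express. A sketch in Python, with the 5% tolerance from the rule above (the helper name is ours, not part of any product):

```python
def reconciles(dashboard_value: float, excel_value: float,
               tolerance: float = 0.05) -> bool:
    """Gate: the dashboard figure must sit within `tolerance` (5% by
    default) of the figure the CFO produces manually in Excel."""
    if excel_value == 0:
        # No meaningful relative gap against a zero reference:
        # only an exact match passes.
        return dashboard_value == 0
    return abs(dashboard_value - excel_value) / abs(excel_value) <= tolerance

print(reconciles(96_200_000, 100_000_000))  # 3.8% gap -> True
print(reconciles(88_000_000, 100_000_000))  # 12% gap -> False
```

The design choice worth noting: the tolerance is relative, not absolute, so the same gate works whether the figure is a departmental budget line or group revenue.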

Weeks 4–5 · 10 days · Daily iteration

Building with the business, not for the business

The dashboard is built in daily sprints with a business sponsor — not the CIO, someone from the department who will actually use the tool. Every evening, a version is published. Every morning, the sponsor says what works and what doesn't resonate. We iterate. We remove what's pretty but useless. We add what's missing. 3D charts disappear. 47 filters become 5. BI jargon titles become business language.

Week 6 · 5 days · The verdict

The autonomy test

The business sponsor must, alone, without help, add a new visual to the dashboard, modify a filter, and explain a figure to their director. If they succeed, the first dashboard is officially alive. If they don't, we stay an extra week. The contract doesn't specify a number of days — it specifies this outcome. Our KPI is not delivery. It's autonomy.

The classic model vs the sprint: the numbers

| | Classic model: the 12-month project | NJIADATA sprint: the dashboard in 6 weeks |
|---|---|---|
| Time to first dashboard | 9–14 months | 6 weeks |
| Budget | 150–300M FCFA (typical) | 15–40M FCFA (first sprint) |
| Specification | 78 pages of specifications | 1 decision × data matrix (A3) |
| Output | 12 dashboards delivered | 1–3 targeted dashboards |
| Adoption | 2 actually used | 100% used (built with the business) |
| At 12 months | High risk of abandonment | 1 self-sufficient sponsor trained |

What the sprint doesn't do — and why

The 6-week sprint doesn't cover everything. It doesn't produce a full data warehouse. It doesn't connect 15 data sources. It doesn't deploy an enterprise-wide governance model. And that's by design.

The sprint does one thing: it puts into production the first dashboard that changes a real decision. This first success creates three effects that make everything else possible. First, trust: the CEO has seen the system work, they believe in it. Second, appetite: other departments want their own dashboard. Third, funding: a CEO who has seen their first dashboard in 6 weeks is far more willing to fund phase 2 than one being asked to sign up for another 12 months based on a PowerPoint.

In 20 years of consulting, we've learned one thing: a BI project that doesn't produce visible value in the first 60 days will probably never produce value. The 6-week sprint is designed to force that deadline.

The anchoring protocol: what happens after week 6

The greatest risk for a dashboard is not technical failure — it's slow death. The dashboard works, but gradually the data stops being updated, the filters become irrelevant, and the CEO goes back to old habits. We've seen this scenario too often to let it happen again.

Our anchoring protocol has four concrete mechanisms.

- ×2 — Trained pair: organisational resilience
- D+1 — Dashboard logbook: early dropout detection
- 4×/year — Quarterly check-up: continuous recalibration
- Month 12 — Vitality test: the real success KPI

The trained pair means two referents, not one. Because in any organization, the first person trained will be transferred, promoted, or leave within 12 months. If knowledge rests on a single head, the dashboard dies with their departure. Two referents is organizational resilience.

The dashboard logbook records who uses the dashboard, when, and for what decision. Not an administrative document — a monitoring tool that detects dropout before it becomes irreversible. If connection rates drop 30% over a month, that's a warning signal.
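That warning signal is easy to automate from the logbook. A sketch, assuming monthly login counts have been extracted from it (the 30% threshold comes from the rule above; the function name is ours):

```python
def dropout_warning(monthly_logins: list[int],
                    threshold: float = 0.30) -> bool:
    """Flag a month whose logins fell by more than `threshold`
    relative to the previous month."""
    if len(monthly_logins) < 2:
        return False  # not enough history to compare
    prev, last = monthly_logins[-2], monthly_logins[-1]
    if prev == 0:
        return False  # no baseline to measure a drop against
    return (prev - last) / prev > threshold

print(dropout_warning([120, 118, 80]))  # ~32% drop -> True
print(dropout_warning([120, 118, 95]))  # ~19% drop -> False
```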

The quarterly check-up: half a day, 4 times a year, to recalibrate the dashboard as needs evolve. The CEO's questions change. The dashboard must change with them. This check-up is included in NJIADATA's engagement during the first year.

The vitality test at 12 months: one year after deployment, we measure three indicators — login frequency, number of data refreshes, number of modifications made by the sponsor. If all three are green, the project is alive. That's our real success KPI.
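The vitality test can be stated as a simple predicate. The three indicators come from the protocol above; the exact thresholds below are illustrative assumptions of ours, since the healthy level depends on the organisation and the dashboard's cadence:

```python
from dataclasses import dataclass

@dataclass
class VitalitySnapshot:
    logins_per_month: int            # login frequency
    refreshes_per_month: int         # data refreshes
    sponsor_edits_per_quarter: int   # modifications made by the sponsor

def is_alive(s: VitalitySnapshot,
             min_logins: int = 20,      # illustrative threshold
             min_refreshes: int = 4,    # illustrative threshold
             min_edits: int = 1) -> bool:
    """All three indicators must be green for the dashboard to count
    as alive at month 12."""
    return (s.logins_per_month >= min_logins
            and s.refreshes_per_month >= min_refreshes
            and s.sponsor_edits_per_quarter >= min_edits)

print(is_alive(VitalitySnapshot(35, 8, 3)))  # all green -> True
print(is_alive(VitalitySnapshot(35, 8, 0)))  # sponsor inactive -> False
```

Note that the test is a conjunction: a dashboard that is refreshed but never opened, or opened but never modified, fails. That is deliberate; partial vitality is the slow death described above.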

[Figure: Dashboard vitality curve, 12 months post-deployment. A classic project with no anchoring falls through the critical zone as training ends, the sponsor is transferred, and teams drift back to Excel, down to roughly 13% usage. The sprint plus NJIADATA anchoring protocol, sustained by quarterly check-ups and an active trained pair, holds around 96%.]
Observed dashboard usage rate over 12 months post-deployment. Orders of magnitude based on our project experience.

Where this method comes from

We didn't invent the 6-week sprint in a lab. We forged it in the field, project after project, over 20 years in Europe. At telecom operators where marketing wanted a churn dashboard in 4 weeks, not 4 months. At industrial groups where the CFO wanted to see the impact of an acquisition on consolidated ratios before the next general assembly — not next quarter. At financial institutions where the regulator required next-day (D+1) reporting and "the project is in the scoping phase" was not an acceptable answer.

It's this European-grade expectation — deliver fast, deliver right, on complex organisations with real stakes — that shaped our method. And it's this same expectation we now bring to African organisations. Not to deliver a budget version. To transfer the best of what 90 years of combined experience has taught us.

The sprint is not a method for simple projects. It's a method for urgent projects. And in Africa, all BI projects are urgent — because the decision cannot wait 12 months.

Who this method works for — and when it's not enough

The 6-week sprint is not universal. It works when three conditions are met.

First, an engaged decision-maker. Not a token sponsor who signs the budget and disappears — a leader who commits 2 hours to the initial interview and reviews the dashboard every week during the sprint. This is the most demanding condition: if the CEO delegates to an intermediate project manager, we fall back into the data-first trap where specifications gradually drift from real needs.

Second, an available business sponsor. Someone from the user department (not IT) who can commit 2 hours per day during weeks 4 and 5 to iterate with us. This person ensures the dashboard speaks the language of the business. If no one is available, the risk is building a technical tool that the business never adopts.

Third, data that exists, even imperfect data. Disorganised Excel files, Access databases, partially populated ERP — we know how to work with "good enough." However, if data simply doesn't exist (no budget tracking, no customer database, no structured accounting), the sprint cannot create the raw material it's meant to transform. In that case, the first workstream is data structuring, not BI — and we say so before signing.

The sprint doesn't suit all scopes either. It produces 1 to 3 targeted dashboards, not a unified data warehouse or enterprise platform. It's an entry point — the first success that demonstrates value and justifies the next phase. Organizations that need a comprehensive data transformation from the start should consider a full engagement program rather than a sprint.

The sector doesn't change the method. What changes is the decision being targeted. For a ministry, it's budget execution. For a food industry group, it's margin by product and subsidiary. For an NGO, it's the disbursement rate by donor. The starting question is always the same: what decision will this dashboard change?

About NJIADATA

NJIADATA is a consulting firm specialising in Microsoft solutions for African markets, based in Paris and Abidjan. 4 senior founders, 90+ years of combined experience at European telecom operators, financial institutions, and industrial groups. Our mission: bring international-grade consulting standards to the continent while transferring skills to local teams.

From source to insight.

Ready for your first sprint?

The 5-day diagnostic is free with no commitment. At the end, you'll have a mockup of your future dashboard — and a clear vision of what BI can change in your organisation.

Request a free diagnostic →