Services ERP & Integrated Management Data & Business Intelligence DMS & Collaboration Markets Public Institutions International Organizations Enterprises International Groups Explore Approach References About Academy Insights Demos
Free diagnostic →
Methodology Guide · February 22, 2026 · 20 min read

From level 2 to level 3: the concrete steps of a first BI deployment

60% of BI projects fail to deliver business value. Not because of the technology — because of the method. Here are the steps that make the difference, adapted to the realities on the ground in Africa.


NJIADATA Team

Paris · Abidjan

The CFO needs the budget execution rate. He emails the budget office, which contacts three departments. Each department compiles an Excel file. The files come back in 5, 8, sometimes 12 days. The CFO finally receives a number — one he already knows is outdated.

This scenario repeats every month in hundreds of organisations across Sub-Saharan Africa. This is level 2 on our BI maturity framework: Excel running the show, reporting as a chore, decisions made in the dark.

The transition to level 3 — a first operational dashboard, the same number available in 3 minutes instead of 12 days — is the most critical. It is also where the failure rate is highest.

According to Gartner, 70 to 80% of BI projects fail. Dataversity estimates in 2025 that 60% fail to deliver business value, despite over $15 billion spent annually on BI tools worldwide. And the problem almost never comes from the technology.

What follows is a synthesis of documented best practices — Microsoft's Fabric Adoption Roadmap, academic literature, published deployment feedback — contextualised for African realities and informed by what we have observed in digital transformation projects across Europe and Africa.

1 Start with the question, not the tool

The most documented first mistake is also the most common: installing a BI tool, then figuring out what to do with it.

What the field shows

In a European financial services group, an ambitious BI project had deployed over 50 KPIs on an expensive platform. Two weeks after launch, adoption was below 10%. Executives kept requesting their numbers by email. The problem: nobody had asked them which question they wanted answered first. The tool was answering questions nobody was asking.

The right sequence is the reverse. It starts with a simple question, put to the leader who makes the decisions:

The founding question

What is the first piece of information you need, that you cannot get quickly today?

The answer might be "our budget execution rate to date," "the number of cases pending for more than 30 days," or "our receivables collection rate."

The specific answer matters less than the fact that it is concrete, measurable, and comes from the decision-maker — not from IT.

Measure the Time to Decision

Once this question is identified, measure how long it takes today to answer it. This is the Time to Decision. If the answer takes 3 days, the measurable project objective is to bring it down to 3 minutes.

A concrete goal, understandable by everyone, and verifiable. Not "improve decision-making" — but "reduce the response time for this specific question from 3 days to 3 minutes."
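To make this concrete, here is a minimal sketch (Python, purely illustrative; the question and figures are the article's 3-days-to-3-minutes example) of how the objective can be written down as a verifiable target:

```python
# Minimal sketch: record the project's single measurable objective.
# The question and durations are the article's example, not real project data.
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class TimeToDecision:
    question: str
    baseline: timedelta  # how long the answer takes today
    target: timedelta    # what the first dashboard must achieve

    def improvement_factor(self) -> float:
        # dividing one timedelta by another yields a plain float
        return self.baseline / self.target

ttd = TimeToDecision(
    question="What is our budget execution rate to date?",
    baseline=timedelta(days=3),
    target=timedelta(minutes=3),
)
print(f"Objective: {ttd.question!r} answered in {ttd.target} "
      f"instead of {ttd.baseline} ({ttd.improvement_factor():.0f}x faster)")
```

The point is not the tooling; it is that the objective fits in three fields: one question, one baseline, one target.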

Why this approach works

Academic research on BI critical success factors converges on one point: organisational factors weigh more heavily than technical factors. Villamarín-Garcia and Pinzón (2017) identified 13 success factors — and the top ones are all organisational: clear vision, defined use case, alignment with business objectives.

Starting with the leader's question guarantees that alignment from day 1.

2 Find the sponsor — or don't start

If there is one absolute consensus in the BI literature, it is this: without an executive sponsor, the project will fail.

70-80% of BI projects fail (Gartner)
#1 failure factor: lack of sponsorship (PMI, Microsoft)
55% of users lack confidence due to insufficient training

The PMI (2013) explicitly identifies the absence of sponsorship as a fundamental reason for project failure. Microsoft is equally direct: the sponsor must have cross-functional authority.

Not a signatory — a user

The sponsor is not someone who approves the budget and disappears. It is someone who will personally use the dashboard, reference it in meetings, and ask "what does the dashboard say?" when a topic is debated.

What the field shows

During the deployment of an ERP system in a multi-site industrial group in Europe, the designated sponsor was the IT director. He had the technical expertise — but not the authority over business units. Result: each department resisted in its own way, data was not delivered on time, and the project fell 8 months behind schedule. The day the CEO took over sponsorship in person, resistance ceased within weeks.

Microsoft describes two approaches. The top-down approach — a senior leader with formal authority — has the highest probability of success. The bottom-up approach can work, but sooner or later hits obstacles beyond the champion's level of authority.

In Sub-Saharan Africa, this distinction is even more pronounced. Hierarchical structures are often more vertical than in Europe. A project driven by the Secretary General encounters no walls; the same project driven by an IT department head runs into them at every level.

What can go wrong

The sponsor says yes but never opens the dashboard. They approve the budget then delegate follow-up to a deputy. Six months later, the project is technically delivered but nobody uses it. The real test: does the sponsor open the dashboard at least once a week? If not, the project is at risk — even if it is "complete."

3 Scope: one process, one dashboard, one visible result

Microsoft uses the term "lighthouse domain." The idea: start with a single business area, demonstrate value there, then expand.

This is the opposite of what most ambitious BI projects do. Dataversity documents that 57% of BI implementations exceed their budget and timeline for lack of a defined scope. What follows is always the same: scope explodes, the sponsor loses patience, the project dies.

One process, not ten

Three criteria. The data already exists — even in Excel. The need is recognised by the sponsor. And the result is visible in weeks, not months.

Recommended first BI scopes by organisation type
| Organisation | First scope | Key question |
| --- | --- | --- |
| Ministry | Budget execution | "What is our credit consumption rate to date?" |
| Embassy | Consular tracking | "Applications processed this month, average delay?" |
| Bank / microfinance | Prudential ratios | "Where do our solvency ratios stand in real time?" |
| Company | Sales tracking | "Revenue by product/region vs target?" |
| NGO | Programme monitoring | "Activity execution rate by zone?" |

Each of these scopes can produce a first operational dashboard in weeks — a tool the sponsor actually uses daily.
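To illustrate how little is needed for the ministry row above, here is a minimal sketch that computes a budget execution rate straight from an Excel export. The file name and column names are hypothetical:

```python
# Minimal sketch: budget execution rate from an existing Excel file.
# "budget_2026.xlsx" and its column names are illustrative assumptions.
import pandas as pd

df = pd.read_excel("budget_2026.xlsx")

# Execution rate = credits consumed / credits allocated
overall = df["committed"].sum() / df["allocated"].sum()
print(f"Budget execution rate to date: {overall:.1%}")

# Per-department breakdown: the natural first page of the dashboard
by_dept = (
    df.groupby("department")[["allocated", "committed"]].sum()
      .assign(execution_rate=lambda d: d["committed"] / d["allocated"])
      .sort_values("execution_rate", ascending=False)
)
print(by_dept)
```

The same few lines, pointed at a different file and a different ratio, cover most of the other rows in the table.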

What can go wrong

The first scope is chosen, but the source data is dirtier than expected. Excel files contain duplicates, inconsistent formats, missing columns. The temptation is to launch a major cleanup effort. Don't. Clean only the data needed for the first dashboard — the strict minimum. Exhaustive cleanup comes with maturity, not before.
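In practice, "clean the strict minimum" can be as narrow as the sketch below. Assuming the same hypothetical Excel source, only the columns the dashboard reads are deduplicated and normalised; everything else waits:

```python
# Minimal sketch: clean only what the first dashboard needs.
# File and column names are illustrative assumptions.
import pandas as pd

df = pd.read_excel("budget_2026.xlsx")

# Touch only the three columns the dashboard actually uses.
needed = df[["department", "allocated", "committed"]].copy()

# Drop exact duplicate rows introduced by repeated manual consolidation.
needed = needed.drop_duplicates()

# Normalise inconsistent numeric formats (e.g. "1 200 000" entered as text,
# or a decimal comma instead of a point).
for col in ("allocated", "committed"):
    needed[col] = pd.to_numeric(
        needed[col].astype(str).str.replace(" ", "").str.replace(",", "."),
        errors="coerce",
    )

# Flag, rather than silently drop, rows that still cannot be read.
bad_rows = needed[needed[["allocated", "committed"]].isna().any(axis=1)]
print(f"{len(bad_rows)} row(s) need manual review by the data owner")
```

Anything flagged goes back to the named data owner; the pipeline never guesses.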

Nothing to show at month 12? The project is dead

Microsoft's Fabric Adoption Roadmap recommends focusing on the current quarter. A first visible result builds credibility, unlocks resistance, and justifies the next investment. An 18-month plan that shows nothing before month 12 is dead before it starts.

49.7% of ERP projects meet their timeline (Panorama Consulting)
15.5 months — actual median ERP project duration

BI is not ERP — but the lesson is the same: the longer a project runs without a visible result, the higher the failure risk.

4 Architecture: designed for the terrain, not the lab

This is where the African context imposes choices that standard deployment guides do not cover.

Cloud, hybrid, or on-premise?

Two variables: the reliability of internet connectivity where users will access dashboards, and the sensitivity of the data involved.

For users primarily in a well-connected capital (Abidjan, Dakar, Nairobi), a cloud solution such as Power BI Service or Microsoft Fabric allows deploying a first dashboard with no local server infrastructure — the only requirement being a stable internet connection. For decentralised offices in rural areas, degraded mode must be planned from the start: PDF-exportable dashboards, deferred synchronisation, pre-loaded data on tablets.

What the field shows

During the deployment of a reporting solution in a multi-site group, the dashboards designed at headquarters worked perfectly — fibre connection, large screens, real-time data. At decentralised sites, reality was different: intermittent connectivity, older workstations, untrained users. The same dashboards were unusable. The solution required a complete interface redesign for remote sites — offline mode, simplified visualisations, automated exports. This work would have cost half as much if it had been planned from the start.

This is not a backup plan. It is an architecture constraint to integrate from day 1.
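As an illustration of what deferred synchronisation can mean in practice, here is a minimal offline-first sketch: records are queued locally and pushed whenever connectivity returns. The endpoint URL and payload shape are assumptions, not a specific product API:

```python
# Minimal offline-first sketch: queue locally, sync when possible.
# The endpoint and payload are illustrative; this is not a product API.
import json
import sqlite3
import urllib.request

SYNC_ENDPOINT = "https://reporting.example.org/api/ingest"  # placeholder

db = sqlite3.connect("local_queue.db")
db.execute("CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload TEXT)")

def record_locally(payload: dict) -> None:
    """Always write locally first: the site keeps working without a connection."""
    db.execute("INSERT INTO outbox (payload) VALUES (?)", (json.dumps(payload),))
    db.commit()

def try_sync() -> int:
    """Push queued records while connectivity holds; return how many were sent."""
    sent = 0
    for row_id, payload in db.execute("SELECT id, payload FROM outbox").fetchall():
        request = urllib.request.Request(
            SYNC_ENDPOINT,
            data=payload.encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            urllib.request.urlopen(request, timeout=10)
        except OSError:
            break  # connection dropped: keep the record, retry next time
        db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
        db.commit()
        sent += 1
    return sent
```

The design choice is the order of operations: the local write always succeeds, and the network is allowed to fail.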

Three rules, not fifty pages

Gartner predicts that by 2027, 80% of data governance initiatives will fail. But the opposite trap also exists: trying to govern everything before starting means never starting.

Microsoft proposes "managed self-service" — empowerment within a governed framework. For a first deployment, three rules suffice.

First, identify who is responsible for each source dataset. Not a 50-page governance scheme — just a name next to each file feeding the dashboard.

Second, define a refresh frequency. The dashboard must clearly display whether the data is from yesterday or last month. A number without a date is a dangerous number.

Third, control who sees what. "Security by design" does not require a complex system — but sensitive data (financial, HR, strategic) must be protected from version one.

Perfect governance is the enemy of useful governance. Three simple rules — identified owner, displayed frequency, controlled access — are enough to start.
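These three rules fit in a single small registry. A minimal sketch, with illustrative owners, files and roles:

```python
# Minimal sketch: the three governance rules as one small registry.
# Owners, files, roles and frequencies are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional, Set

@dataclass
class Source:
    name: str
    owner: str                    # rule 1: a name next to each file
    refresh_every: timedelta      # rule 2: a declared refresh frequency
    allowed_roles: Set[str]       # rule 3: who sees what
    last_refreshed: Optional[datetime] = None

    def freshness_label(self) -> str:
        """What the dashboard should display next to every number."""
        if self.last_refreshed is None:
            return "never refreshed"
        age = datetime.now() - self.last_refreshed
        status = "STALE" if age > self.refresh_every else "fresh"
        return f"refreshed {age.days} day(s) ago ({status})"

registry = [
    Source("budget_execution.xlsx", "A. Kouassi",
           timedelta(days=1), {"finance", "direction"}),
    Source("hr_headcount.xlsx", "M. Diallo",
           timedelta(days=30), {"hr", "direction"}),
]

for src in registry:
    print(f"{src.name}: owner {src.owner}, {src.freshness_label()}")
```

A table in a shared file works just as well; the point is that each source carries its owner, its frequency and its audience.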

5 Skills transfer: the survival condition

What the field shows

In a financial institution, the service provider had trained a single internal person — the designated "data referent." Four months after delivery, this referent was transferred to another department. Nobody knew how to modify queries, fix a data source, or even refresh the indicators. Within six weeks, the organisation was back to Excel files. The dashboard still existed technically — but nobody opened it.

This scenario is not anecdotal — we have seen it repeat across very different contexts, in Europe and in Africa alike. It is not a risk. It is the default outcome if skills transfer is not structured from day one.

Not a two-day training — an apprenticeship

Effective transfer is not a two-day classroom course. The literature on ERP and BI deployments converges: generic "BI 101" workshops do not work. What works is learning embedded in real work.

First, identify at least two internal people as "data referents." Not necessarily technical profiles — people who are curious, rigorous, and likely to stay in their roles. Two rather than one, so the competence survives a transfer.

Second, these referents do not watch the service provider work — they work alongside them. Every step happens in tandem: data connection, modelling, dashboard creation, production deployment. The provider does, explains, then lets the referent take over.

Third, the project is not complete until the referents can modify the dashboard on their own — add an indicator, fix a data source, create a new visual. This is the exit test. Until it is passed, the service provider has not finished their job.

Creators, not just consumers

Microsoft uses the number of active creators — not just consumers — as an adoption metric. That is the right measure: a level 3 organisation has people who consult dashboards; a level 4 organisation has people who build them.

Skills transfer is what makes that transition possible. And it is what breaks the cycle of dependency on external providers — a cycle particularly costly for African organisations that, as the EY 2025 report highlights, already face a structural shortage of data skills.

The real measure of a service provider's success

After 6 months, can the organisation continue without them? If yes, they succeeded. If no, they created dependency — exactly the model African organisations must avoid.

Conclusion: method before technology

From level 2 to level 3 — a realistic timeline:

1. W1 (the question): identify the leader's question; measure the Time to Decision.
2. W2-3 (diagnostic): sponsor identified, scope defined, source data audited.
3. W4-6 (first dashboard): built in tandem by provider and referents; the sponsor uses it.
4. W7-8 (transfer): referents autonomous; exit test: modify the dashboard alone.
5. M3+ (autonomy): expansion, new scopes, level 3 reached.

≈ 8 weeks from diagnostic to first dashboard in production; building autonomy is ongoing. Not 18 months. Not a massive budget. One leader, one question, one dashboard, eight weeks.

Let us recap the path from level 2 to level 3.

A question — the leader's question, not IT's.

The Time to Decision measured.

A sponsor who will personally use the result.

A single scope where data already exists.

Architecture designed for the real terrain, not the lab.

Skills transfer embedded from day 1, not the final week.

None of these steps requires a massive budget. None takes 18 months.

What fails in 60 to 80% of BI projects is not the tool. It is the absence of method. And the mistakes are always the same — we have seen them in projects of all sizes, in Europe and in Africa alike: a tool chosen before the question, an absent sponsor, a scope too broad, an architecture disconnected from the terrain, a skills transfer relegated to the final week.

Start small, prove value, expand — that is the sequence Microsoft recommends, academic literature validates, and the field confirms.

The hardest part remains the same as at the end of our first article: it is not the technology, it is the decision to begin.

Sources and methodology

This article synthesises documented best practices from the following sources, contextualised for Sub-Saharan Africa and informed by our founders' experience in digital transformation projects across Europe and Africa.

Frameworks: Microsoft — Fabric Adoption Roadmap (learn.microsoft.com, 2024-2025) · Microsoft — Power BI Implementation Planning.

Failure rates: Gartner — 70-80% failure · Dresner Advisory — Wisdom of Crowds (59%) · PMI — Pulse of the Profession (2013) · Dataversity — Why 60% of BI Initiatives Fail (Nov. 2025) · Panorama Consulting — 49.7% ERP timeline compliance.

Academic: Yeoh & Koronios (2010) · Villamarín-Garcia & Pinzón (2017) — 13 factors · Pham et al. (2016).

African context: data from our overview article · EY — Weak Ecosystem Cohesion (Nov. 2025).

Field observations: the vignettes presented in this article draw on situations encountered by NJIADATA's founders during their respective careers in IT consulting and digital transformation. Names and identifying details have been removed.

