How do you begin without getting it wrong?
You have arrived at the module that most institutions skip. They study. They deliberate. They form a working group. They wait for the right moment, the right budget, the right hire. And while they wait, the gap between where they are and where they need to be quietly grows wider.
This module is about how to move — not recklessly, but thoughtfully, in a way your team can follow.
The uncomfortable truth is that technology is rarely the reason projects fail. The tools, in most cases, work. What does not work is the context around them: unclear ownership, undertrained teams, unrealistic timelines, and change that was announced rather than built.[1]
In cultural institutions specifically, two failure patterns appear repeatedly:
At Artorythm, we use a five-step diagnostic framework called the 5C Methodology. It is designed to move an institution from "we know something is wrong" to "we know what to do about it" — forcing clarity at each step before moving forward.
What makes this useful is that most institutions already know their C1 — they can describe the current situation in vivid detail. What they struggle with is C3 and C4: the honest diagnosis of cause, and the concrete quantification of cost. Without those two, C5 tends to be vague and unfunded.
The 5C framework is the backbone of every Artorythm engagement. In Module 2, the Value Calculator addressed C4. In Module 4, the Digital Maturity Evaluation addressed C1 and C2. Now we bring it together into C5 — corrective action.
Not all problems are equally good starting points. A good first AI project is contained enough to complete in weeks, produces a result that is visible to the team, and is reversible if it doesn't work.
Ask these three questions about any candidate project:
When building the business case internally, frame the first project as a learning investment, not a transformation initiative. This removes the pressure of proving a large thesis — and sets realistic expectations for the board.
When proposing a pilot to your director, anchor it in a specific, named frustration — "the grant reports that take two days every quarter" — rather than an abstract argument for AI adoption. Concrete problems get concrete budgets.
Read the situation, then choose your approach. There is no single right answer, but some choices carry more risk than others. Each option below shows what typically happens.
Your institution has a cataloguing backlog of 3,000 items. You have one part-time archivist who spends roughly 60% of her time on manual data entry. The director has asked you to "do something about the backlog." You have a modest discretionary budget.
What approach do you take?
Research top vendors, demo three products, get board approval for a 12-month subscription, and begin migration.
The selection process takes 3–4 months. Migration takes another 2 months. The archivist spends weeks on data import rather than cataloguing. The backlog grows during the transition. The new system is powerful — but 18 months later, the original backlog problem is still not solved, because the bottleneck was always the process, not the tool.
Risk: High · Duration: Long · Reversibility: Low
Use an AI description tool on a sample of 100 items. Measure time saved per item. Decide whether to scale.
The archivist tests the tool for one week and finds it cuts her data-entry time by roughly half on standard items. Complex items still need full manual work. In four weeks, you have a real number — "this approach could clear 40% of the backlog in the next six months" — which you can take to the director with confidence.
Risk: Low · Duration: Short · Reversibility: High ✓ Recommended starting point
Bring in short-term help to reduce the backlog manually while you plan the longer-term solution.
The backlog shrinks while the assistant is there. When the contract ends, the backlog begins to grow again. You have not changed the underlying process — you have borrowed time. This is a reasonable interim measure, but it is not a solution. The same backlog problem will return within 12–18 months.
Risk: Medium · Duration: Short-term relief only · Consider combining with Option B
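The projection Option B supports can be sketched with back-of-envelope arithmetic. All numbers below are illustrative assumptions drawn loosely from the scenario (baseline minutes per item, share of "standard" items, and the archivist's weekly hours are guesses, not measured values):

```python
# Back-of-envelope projection for the Option B pilot.
# Every figure here is an illustrative assumption, not a measured value.

backlog_items = 3000           # current cataloguing backlog (from the scenario)
manual_minutes_per_item = 20   # assumed baseline data-entry time per item
time_saved_fraction = 0.5      # pilot finding: ~half the time saved on standard items
standard_item_share = 0.8      # assumed share of items that are "standard"

# Average minutes per item after adopting the tool: standard items take
# half the time, complex items are unchanged.
avg_minutes_after = (
    standard_item_share * manual_minutes_per_item * (1 - time_saved_fraction)
    + (1 - standard_item_share) * manual_minutes_per_item
)

archivist_hours_per_week = 12  # part-time role, ~60% of time on data entry (assumed)
minutes_per_week = archivist_hours_per_week * 60

items_per_week = minutes_per_week / avg_minutes_after
weeks = 26                     # roughly six months

cleared = items_per_week * weeks
print(f"Projected throughput: {cleared:.0f} items in six months "
      f"({cleared / backlog_items:.0%} of the backlog, before new arrivals)")
```

The point of the exercise is not the exact percentage; it is that a one-week sample measurement turns "do something about the backlog" into a number a director can fund.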
The market for "AI tools for cultural institutions" is growing fast. Some products are excellent. Many are not. Before committing budget or time to any vendor, ask these five questions directly — the quality of their answers tells you a great deal.
Change in cultural institutions is rarely blocked by logic. Most staff understand that their institution needs to evolve. What creates resistance is something subtler: the fear of becoming less valuable, the discomfort of not knowing what "good" looks like in the new process, and the feeling that their expertise isn't being respected.[2]
Three things tend to reduce this resistance:
Building a business case for change? Frame it around mission impact, not efficiency metrics alone. Boards respond to "this frees the team to do more of the work the institution was founded to do" more than "this saves 3.5 hours per week."
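One way to make that framing concrete is to translate the weekly efficiency figure into annual mission capacity. A minimal sketch, where the working-weeks number is an assumption:

```python
# Translate a weekly time saving into annual hours returned to mission work.
hours_saved_per_week = 3.5     # the efficiency metric from the example above
working_weeks_per_year = 46    # assumed, allowing for leave and public holidays

annual_hours = hours_saved_per_week * working_weeks_per_year
print(f"~{annual_hours:.0f} hours a year freed for mission work")
```

"Roughly four working weeks a year returned to collections work" lands with a board in a way that "3.5 hours per week" rarely does.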
Proposing change upward? Give your director a reversible, bounded ask. Not "we should adopt AI" but "can I test one tool on one task for four weeks and report back?" Small asks get faster yeses.
The Artorythm free pilot is a structured 4-week engagement — one workflow, one team, one clear output. We document what we find in a Workflow Insight Report you keep regardless of what you decide next.
Download a sample Workflow Insight Report →
We work with a small number of institutions each quarter. The pilot is free. The only requirement is that someone from your team is available to work with us for four weeks.
Apply now — limited places available →
You now have the framework to diagnose clearly, choose wisely, and start without waiting for perfect conditions.
[1] McKinsey & Company — Unlocking success in digital transformations, 2018
[2] Prosci — Best Practices in Change Management, 2023
[3] Collections Trust — Spectrum 5.1: The UK Museum Collections Management Standard