There are over 300 KiwiSaver options from around 30 providers. This page sets out how we compare them — what we assess, how we rate them, and the criteria a fund must meet to be on our recommended list.



Our methodology rests on three principles and four core checks. Every fund is compared only to its peers, across meaningful timeframes, on more than last year's return.
We sort multi-asset KiwiSaver funds into one of five risk-based categories, based on how their assets are split between growth and income. Every later comparison happens within these categories — so we never compare a Growth fund to a Conservative fund.
Funds that hold only one asset class — say, NZ shares only, or US bonds only — sit outside this framework. They carry concentrated risk that's rarely the right fit for a long-term KiwiSaver investor, so we don't put them on our recommended list.
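For readers who want the mechanics, here's a minimal sketch of category assignment in Python. The growth-allocation bands shown are illustrative assumptions, not our actual cut-offs; the single-sector exclusion works as described above.

```python
# Minimal sketch of category assignment. The growth-percentage bands are
# HYPOTHETICAL illustrations; they are not the published cut-offs.

def assign_category(growth_pct: float, single_sector: bool) -> str:
    """Map a fund's growth-asset share to one of five risk categories."""
    if single_sector:
        # Single-asset-class funds sit outside the framework entirely.
        return "Excluded (single-sector)"
    bands = [
        (9, "Defensive"),       # illustrative upper bounds on growth %
        (34, "Conservative"),
        (62, "Balanced"),
        (89, "Growth"),
        (100, "Aggressive"),
    ]
    for upper, category in bands:
        if growth_pct <= upper:
            return category
    raise ValueError("growth_pct must be between 0 and 100")

print(assign_category(growth_pct=55, single_sector=False))  # Balanced
print(assign_category(growth_pct=80, single_sector=True))   # Excluded (single-sector)
```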
Three principles shape every recommendation we make. They're deliberately simple — because fair comparison should be.
A Growth fund shouldn't be compared against a Conservative fund. That's not fair — and it produces misleading conclusions. We compare funds only against peers in the same risk group.
Past returns don't guarantee future returns — but they do show how a fund has performed across different market conditions. We focus on 5–10 year windows where available.
Returns alone aren't enough. We also assess fees, consistency, downside behaviour, and how the fund is run — so we recommend funds that are consistently better, not last year's winner.
We run a consistent set of checks across four areas. The goal is simple: recommend funds that have a strong performance track record, fair fees, and are well run.
We look at performance over longer periods — typically five to ten years where available — rather than the most recent quarter or year.
We look at returns after fees, across a range of periods.
A fund needs at least five years of returns before we will recommend it.
The aim is to avoid being swayed by a fund's latest hot streak.
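At its core, this check is a geometric annualised return computed from after-fee returns, plus the minimum track-record rule. Here's a sketch under stated assumptions: monthly after-fee return data, with illustrative function names.

```python
import math

MIN_MONTHS = 5 * 12  # at least five years of returns before a fund is recommended

def annualised_return(monthly_after_fee_returns: list[float]) -> float:
    """Geometric annualised return from monthly after-fee returns."""
    growth = math.prod(1 + r for r in monthly_after_fee_returns)
    years = len(monthly_after_fee_returns) / 12
    return growth ** (1 / years) - 1

def long_enough(monthly_after_fee_returns: list[float]) -> bool:
    """Apply the five-year minimum track record."""
    return len(monthly_after_fee_returns) >= MIN_MONTHS
```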
Fees matter — they compound over decades. But the cheapest fund isn't always the best. We ask whether the fee is reasonable for what the fund is trying to deliver.
We look at the total annual fund charge — and compare it to what the fund actually delivered, after fees, versus peers.
In short: has this fund earned its fee, given what it delivered after fees versus its peers?
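One simple way to frame that question in code, using an illustrative median-based rule rather than our actual formula:

```python
from statistics import median

def earned_its_fee(fund: dict, peers: list[dict]) -> bool:
    """Illustrative rule: a fund with an above-median fee should deliver
    an above-median after-fee return versus its category peers."""
    median_return = median(p["net_return"] for p in peers)
    median_fee = median(p["annual_charge"] for p in peers)
    if fund["annual_charge"] <= median_fee:
        return True  # fee is reasonable on its face
    return fund["net_return"] >= median_return  # pricier funds must deliver
```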
We look at how bumpy the ride has been — and how the fund held up when markets turned rough. This includes past drawdowns and behaviour relative to peers in weak markets.
We assess the size of past drawdowns across real market events.
We compare how the fund behaves relative to its peer group when markets are weak.
A fund that looks strong in a bull market but falls harder than peers when markets turn isn't necessarily one worth holding long term.
The aim is to avoid surprises.
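Both parts of this check reduce to simple calculations: the size of peak-to-trough falls, and how the fund tracked its peers when markets fell. Here's a sketch, with the weak-market definition (months where the peer average fell) as an assumption:

```python
def max_drawdown(unit_prices: list[float]) -> float:
    """Largest peak-to-trough fall, as a fraction of the prior peak."""
    peak, worst = unit_prices[0], 0.0
    for price in unit_prices:
        peak = max(peak, price)
        worst = max(worst, (peak - price) / peak)
    return worst

def weak_market_gap(fund_returns: list[float], peer_avg_returns: list[float]) -> float:
    """Average out/under-performance in months when the peer group fell.
    Negative means the fund fell harder than peers in weak markets."""
    gaps = [f - p for f, p in zip(fund_returns, peer_avg_returns) if p < 0]
    return sum(gaps) / len(gaps) if gaps else 0.0
```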
Numbers alone aren't enough. We also look at practical quality signals — the things that tell you whether a fund is well run.
Does the fund invest the way it says it does? We compare its actual asset allocation against its stated target, and check whether marketing language (e.g. "passive") matches the underlying construction.
Is the fund of reasonable scale, with enough assets under management to operate efficiently?
Are the managers and strategy stable, or has there been significant turnover?
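The allocation check in particular is mechanical enough to sketch. The five-point tolerance below is an illustrative assumption, not a published threshold:

```python
TOLERANCE = 5.0  # percentage points of drift tolerated (illustrative)

def allocation_matches(actual: dict[str, float], target: dict[str, float]) -> bool:
    """True if every asset class sits within TOLERANCE of its stated target."""
    return all(
        abs(actual.get(asset, 0.0) - weight) <= TOLERANCE
        for asset, weight in target.items()
    )

# A fund targeting 60/40 that actually holds 58/42 passes:
print(allocation_matches({"growth": 58, "income": 42}, {"growth": 60, "income": 40}))
```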
The right balance changes with risk level. A Defensive investor cares more about a smooth ride than chasing returns; an Aggressive investor cares more about returns than smoothness. We reflect that by weighting the four checks differently for each category:
| Category | Performance | Consistency | Fees | Quality |
|---|---|---|---|---|
| Defensive | 20% | 50% | 20% | 10% |
| Conservative | 30% | 50% | 10% | 10% |
| Balanced | 40% | 40% | 10% | 10% |
| Growth | 50% | 30% | 10% | 10% |
| Aggressive | 60% | 20% | 10% | 10% |
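In code, the overall score is just a weighted sum using the table above. The weights come straight from the table; the 0-100 check scores in the example are hypothetical:

```python
WEIGHTS = {
    "Defensive":    {"performance": 0.20, "consistency": 0.50, "fees": 0.20, "quality": 0.10},
    "Conservative": {"performance": 0.30, "consistency": 0.50, "fees": 0.10, "quality": 0.10},
    "Balanced":     {"performance": 0.40, "consistency": 0.40, "fees": 0.10, "quality": 0.10},
    "Growth":       {"performance": 0.50, "consistency": 0.30, "fees": 0.10, "quality": 0.10},
    "Aggressive":   {"performance": 0.60, "consistency": 0.20, "fees": 0.10, "quality": 0.10},
}

def overall_score(category: str, check_scores: dict[str, float]) -> float:
    """Weighted sum of the four check scores (each assumed to be 0-100)."""
    weights = WEIGHTS[category]
    return sum(weights[check] * score for check, score in check_scores.items())

# A Growth fund: strong on performance, middling on fees.
score = overall_score("Growth", {"performance": 85, "consistency": 70, "fees": 55, "quality": 75})
print(round(score, 1))  # 76.5
```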
Every fund we analyse is given an overall grade based on how it performs across the four core checks. Only around the top 10% of funds reach a grade of B or above and make our recommended list.
- A+: Stands out against peers across all four checks.
- A: Performs well across all four checks.
- B: Meets our bar across all four checks.
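A sketch of how a score might map to a letter, with hypothetical cut-offs (the only published constraint is that roughly the top 10% of funds reach B or above):

```python
def grade(score: float) -> str:
    """Illustrative cut-offs only; the real thresholds aren't published here."""
    for cutoff, letter in [(90, "A+"), (80, "A"), (70, "B")]:
        if score >= cutoff:
            return letter
    return "NR"  # C-F funds display publicly as NR, with no letter

print(grade(76.5))  # B
```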
Publicly, non-recommended funds show as NR — no letter, no underlying detail. Recommended funds (A+, A, B) show their grade.
Signed in, you get more. Create a free account to see the four-check pass/fail breakdown for every fund. C–F funds still show as NR with no letter grade — but you'll see exactly which checks they passed or failed.
A fund can show as NR for any of these reasons:
- It didn't score strongly enough against its peers in the same category.
- There wasn't enough reliable data — for example, less than five years of returns.
- It's a fund-of-funds whose underlying fund is already directly available in KiwiSaver from the same provider — there's no benefit in going via the wrapper instead of holding the underlying directly.
- It sits outside our category framework — a single-sector fund, or a fund whose allocation doesn't match one of our five categories.
Our primary data source is Morningstar, the global standard for fund analysis. Where there are gaps, we layer in public filings and go direct to providers.
BetterSaver may receive a commission from some KiwiSaver providers if you join or switch through our platform. This does not increase your fund fees.
In practice, more funds from commission-paying providers miss our recommended list than make it. The list is driven by our criteria, not commercial arrangements.
Our analysis includes funds that pay us and funds that don't — as long as they meet our standards.
The questions that come up most often. Can't see yours? Talk to us.