All about the SHARP Crash Helmet Safety Scheme

The SHARP crash helmet testing scheme was introduced in 2007 to link real-world accident research data to a repeatable, lab-based scientific approach to crash helmet safety testing, and to present the results in a simple format that we, the buying public, can understand. And while there may be detractors of the ‘real world’ nature of the SHARP test (there always will be), the hope was that the results would allow us to make an informed choice about the safety of our new helmets.

Before the SHARP test, we had no idea whether a £50 Banzai helmet would resist the smearing of your memories down Brompton High Street better or worse than a £1,000 Arai or Shoei. Now, at least, we can compare the two and make our buying decision fully furnished with the facts.

Of course, all crash helmets have to be approved at their point of manufacture: any helmet sold in Europe has to pass the mandatory ECE 22.05 testing scheme (DOT in the US). But the people at SHARP devised a cunning scheme to uniformly test the impact resistance, the shock absorption (i.e. the amount of shock passed to the head and brain) and the frictional and rotational properties of different crash helmets, to allow us to make an informed choice.

The idea was to supplement the ECE test with safety testing based on recommendations made by the COST 327 study – undoubtedly the most influential and comprehensive look at the role played by motorcycle crash helmets in injuries and fatalities. The study, carried out across 9 EU countries by 14 EU organisations, analysed, then recreated in the lab, real accidents involving riders admitted to three hospitals (including Glasgow General). The researchers analysed brain and head injuries, road accident data and helmet damage, and this was used to create the series of tests SHARP uses to assess the effectiveness of motorbike crash helmets.

So what do they do?

First, they only test helmets they’ve bought from shops themselves, which sounds like a good start as it rules out any potential manufacturer interference. They buy helmets in various sizes (M, L & XL) to make sure the results reflect a range of sizes, and run 32 impact and oblique-impact tests on them.

They test helmets at three different speeds (low, medium & fast) and impact them against both flat and angled surfaces to give an indication of how they’d perform against real-world flat and pointy surfaces – like kerbs, Armco barriers and badly-driven BMWs.

They also run ‘oblique tests’ to measure friction performance when you hit a surface at an angle, assessing how much rotational force would be transmitted through to the rider’s brain (rotational forces are known to be a major source of brain injury, implicated in around 60% of casualties).

They then, in their words, compare their results data against ‘real world injury data’ to arrive at their own SHARP rating out of 5 stars with, for the avoidance of doubt, 5 being the best. And that’s pretty much it.

Here’s SHARP’s own video showing the testing process.

ECE 22.05, SHARP, DOT and Snell testing

We often see some of the biggest names in motorcycle crash helmets – I’m thinking of LS2 and Schuberth here, and until recently Arai – not score too well in SHARP tests.

In Arai’s case, they historically scored an average of around 3 stars in the SHARP test. That’s changed recently with their last 3 SHARP tested helmets scoring either 4 or 5 stars, so they must’ve done something to address the problem.

From what I’ve seen, the likeliest explanation is that some makers focus more on passing the US Snell and US DOT tests, both of which include a penetration test: a projectile is fired at the helmet, and to pass, the shell has to stop it piercing through. Manufacturers who want to be sure of passing therefore make their helmet shells harder.

Which is different to how SHARP sees the world.

Because the real-world accident data in the COST 327 study shows that accidents where helmets were penetrated are vanishingly rare, most testing authorities agree there’s no real point in including a penetration test in their testing.

However, Snell and DOT continue to include a penetration test, so helmet manufacturers have to build their helmets to pass it.

Why’s this an issue?

Well, when a crash helmet takes an impact, it’s not just the shock-absorbing EPS liner that absorbs the energy of the impact (thereby protecting your head and brain). The helmet shell plays a big role too.

The helmet shell absorbs energy by flexing, soaking up a load of the impact and leaving the polystyrene liner with less work to do. So if you make a crash helmet with a really strong, inflexible shell, nearly all the work has to be done by the liner, which gets little help from the shell.
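To illustrate the point with a toy sketch (the numbers and the 150J figure are purely made up for illustration – real energy absorption varies by helmet and impact), think of the impact energy as being split between shell and liner:

```python
# Illustrative only: in an impact, the shell flexes and absorbs some fraction
# of the energy; whatever is left over must be absorbed by the EPS liner.
def liner_share(total_energy_j: float, shell_fraction: float) -> float:
    """Energy (J) the EPS liner must absorb if the shell soaks up shell_fraction."""
    return total_energy_j * (1.0 - shell_fraction)

impact_energy = 150.0  # J – a made-up round number, not a real test figure

flexible = liner_share(impact_energy, 0.30)  # a flexible shell absorbing ~30%
stiff = liner_share(impact_energy, 0.10)     # a very stiff shell absorbing ~10%

print(flexible, stiff)  # ≈105 J vs ≈135 J – the liner works harder under a stiff shell
```

Same crash, but with the stiffer shell the liner has noticeably more energy to deal with – which is the mechanism being described above.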

Hence it’s possible Arai, LS2, Schuberth and others make harder, less flexible helmets (designed to pass a penetration test) and so score less well when someone like SHARP comes along and measures how much energy is passed through to the head in an impact.

So there you have it.

There are always going to be proponents and opponents (and every ponent in between) for any testing regime, but considering the number of manufacturers, the variety of technologies and the range of weird and wonderful impacts we bikers dream up to subject crash helmets to, the SHARP test seems to be one of the best attempts towards a serious and comprehensive real-world helmet safety test. At the very least, it offers a way to compare and contrast the impact resistance – and some abrasion resistance – between helmet brands when buying a new helmet and, considering what went before, it seems to be a bloody good effort.

That’s why we use SHARP test results in our articles where possible. It’s not perfect, but it’s the best we’ve got and gives us a way to compare safety levels between helmets.

Since writing this article, we were invited along to the SHARP testing labs in Greater Manchester and you can read all about their testing and lots of other nuggets of safety information here.

For more reading, please check out our short article on expensive vs cheap helmets or our mahoosive layman’s guide to the SHARP crash helmet data.


  1. Hi, I bought an LS2 FF351-1 with a BMS Gold sticker (1800419), however it only has a SHARP 1-star rating. Do you think I was misled at the point of sale? Regards, Jon W

    • Depends entirely on what the shop claimed about the helmet when you bought it. It’s an ECE-rated helmet, so totally road legal – it just happened to score poorly on the non-mandatory SHARP test. But that’s a separate issue from whether you were misled when buying the helmet.

  2. Hi,

    I don’t know if you’ve noticed, but SHARP seems to be inconsistent with their star ratings. For example:

    First helmet – Arai Quantum has green back and top impact zones, and black left and right. Helmet got 3 stars.
    Second helmet – Harley Davidson Laguna II has green back and top impact zones, and red left and right. Helmet got 1 star.

    Why does a helmet with better impact protection get fewer stars than one with worse?

    • Good spot and you’re right, it does look inconsistent. I guess it’s possible they’ve just entered it into the site incorrectly but I’d expect the data underpinning the star rating is correct (I know the SHARP guys are majorly detail-oriented when it comes to their data!). But I’ll try and raise it with the SHARP guys next time I bump into them and see what they think!

      • Thanks Billy.

        I actually revisited this page reading your Arai QV-Pro review. Appreciate the article.

        Another example:
        Arai QV-Pro is 5 stars – both side impact zones being yellow (rest green).

        Older Arai Quantum ST is 4 stars with 1 side yellow (rest green).

        HJC IS-MAX2 4 stars – both sides orange (rest green).

        Schuberth C4 3 stars – both sides orange (rest green).

        Schuberth C4 Pro 3 stars – both sides red, back orange, top back yellow, top front green.

        I am quite sure that I could find other examples. This is one of the reasons I do not trust and cannot recommend SHARP rating to anyone. I really hope that they will improve their approach in the future.

        • Thanks again Marcin. I’ve had initial contact from the guys at SHARP and they say they’ll get back to me once they’ve investigated. I’ll update you here when I hear any news.

        • Hi Marcin, the guys at SHARP finally got back to me. The reason for the discrepancies is that the colour coding on the diagram represents just a snapshot of the tests at one of the three impact speeds – 8.5m/s – and against a flat surface. It’s there to give a visual ‘idea’ of how well the helmet will perform, but it doesn’t represent the entirety of the test data – something only the overall SHARP star rating does. So they publish that diagram to give an at-a-glance view of the helmet’s energy absorption in that 8.5m/s test.

          Why 8.5m/s? That speed carries roughly 30% more energy than the 7.5m/s used in ECE 22.05 testing – the increase that the original COST 327 European motorcycle accident study suggested would lead to a 50% reduction in fatalities.

          The actual overall SHARP star rating uses the full range of test data from all the tests, then takes that data and weights it ‘according to the best motorcycle accident data available’ to produce the final rating.

          Hence the diagram and the final star rating can differ. All this information is actually there on the SHARP website on this page. Having said all that, whether that colour-coded diagram is more misleading than enlightening is a different question – but hopefully that’s given you the answer you were after!
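As a quick sanity check on that figure: kinetic energy scales with the square of speed, so stepping up from 7.5m/s to 8.5m/s works out at roughly 28% more impact energy – close to the ‘30%’ quoted. A quick sketch:

```python
# Impact (kinetic) energy scales with the square of speed: KE = 1/2 * m * v**2,
# so the helmet's mass cancels out when comparing two test speeds.
v_ece = 7.5    # m/s, ECE 22.05 impact test speed
v_sharp = 8.5  # m/s, speed behind SHARP's colour-coded diagram

extra_energy_pct = ((v_sharp / v_ece) ** 2 - 1) * 100
print(f"{extra_energy_pct:.1f}% more impact energy at 8.5 m/s")  # ≈28%, i.e. roughly the 30% quoted
```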


