Analysis of Operating Wind Farms workshop – Wind Europe (Bilbao, April 2016)

Lloyd's Register's Mark Spring, Senior Wind Turbine Specialist, outlines the key discussion points and explains why the challenge is to move from "big data" to "small data".

11 May 2016
by Mark Spring

The workshop was opened by Dave Vernooy, Product Line Manager at GE Energy, who announced that General Electric (GE) had started its transition from an industrial company to a software and analytics company. He pointed out that "big data" is great, but you have to remember the people and the processing – it’s about making the results relevant and enabling people to implement changes which result in improvements in efficiency and, ultimately, electricity production. The challenges are the volume of data, the velocity at which it is presented to the analytical tools and decision-makers, and the variety of sources, ranging from qualitative information, experience and anecdotes through to automated condition-monitoring signals. The final challenge is to filter and interpret the data quickly and intelligently, because its quality and reliability vary.

As has frequently been pointed out in the past, many speakers complained that there is too little collaboration between manufacturers and operators or owners of wind turbines. This makes the job of finding patterns in the data and proposing new composite health indicators very difficult. GE, Siemens and Vestas quoted very large numbers of wind turbines installed around the world, and huge totals of hours operated and electricity generated. Then, at the end of the second session, along came Li Shaowu of Longyuan Power Group Corporation Limited with a portfolio of 11,000 turbines in 160 wind farms, representing 90 different models, with a capacity of over 15 GW. He showed key failure mechanisms of gearboxes and blades, and described how his team trained neural networks to "learn" the signs of imminent failure, presenting the success of these models in terms of the prognostic horizon (how far in advance a failure may be predicted), the time to train the neural network and the time to make the forecasts.
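The prognostic horizon mentioned above can be made concrete with a small sketch. This is not Longyuan's actual pipeline; it is a hypothetical illustration of how one might score a failure-prediction model by how far in advance of the actual failure it first raised an alarm:

```python
# Illustrative sketch (hypothetical, not Longyuan's method): score a
# failure-prediction model by its prognostic horizon, i.e. the gap between
# the earliest alarm raised before a failure and the failure itself.

def prognostic_horizon(alarm_times, failure_time):
    """Return hours of warning: the time between the earliest alarm raised
    before the failure and the failure event (0.0 if no advance warning)."""
    early_alarms = [t for t in alarm_times if t < failure_time]
    if not early_alarms:
        return 0.0
    return failure_time - min(early_alarms)

# Hypothetical example: alarms at 1,200 h and 1,350 h of operation;
# the gearbox actually fails at 1,400 h.
print(prognostic_horizon([1200.0, 1350.0], 1400.0))  # 200.0 hours of warning
```

A longer prognostic horizon gives maintenance planners more room to schedule repairs before a forced outage, which is why it was presented alongside training and forecasting times as a headline metric.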

There were a number of talks on icing: its effects on performance, safety and reliable data capture, and the difficulty of detecting when icing has occurred, or predicting when it will occur, in practice.

As a way of tackling problems of unreliable data, speakers presented successful applications that use measurements from functioning systems on a neighbouring "golden turbine". Nacelle-mounted anemometers in particular may be iced up, or may be measuring turbulent, wake-distorted air flows. LiDAR systems mounted on turbine foundations, floating met-station buoys or offshore substations can supply meteorological data to a number of nearby turbines for blade pitching, production forecasting, load prediction and maintenance scheduling.
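The "golden turbine" idea above can be sketched as a simple fallback rule. The function and threshold values here are hypothetical, chosen only to illustrate the principle of substituting a trusted neighbour's measurement when the local sensor looks unreliable:

```python
# Hypothetical sketch of the "golden turbine" fallback: if a turbine's own
# nacelle anemometer reading looks implausible (e.g. an iced-up sensor
# reporting zero), substitute the measurement from a trusted neighbour.

def wind_speed_estimate(own_reading, golden_reading, plausible=(0.5, 40.0)):
    """Return own_reading if it lies in a plausible range (m/s);
    otherwise fall back to the golden turbine's measurement."""
    lo, hi = plausible
    if own_reading is not None and lo <= own_reading <= hi:
        return own_reading
    return golden_reading

print(wind_speed_estimate(8.2, 7.9))   # own sensor healthy -> 8.2
print(wind_speed_estimate(0.0, 7.9))   # iced-up reading    -> 7.9
```

Real systems would of course correct for the separation and wake position of the two turbines rather than substituting the raw value, but the principle of cross-checking against a healthy neighbour is the same.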

In my view, the challenge is to move from "big data" to "small data" by developing reliable models of data quality, sensor health, turbine performance, component degradation and failure prediction, so that only new patterns or changes in the data ecosystem need to elicit a response: either the model is modified, or action is taken. Lloyd's Register has an important part to play in this transition.

For more information, contact Mark Spring.
