The amount of data collected at offshore production facilities has grown massively over the years, to the point where ‘big data’ solutions need to be deployed to extract the important information quickly and efficiently from the mass of general data.
The offshore working environment is always challenging. There is a growing trend to collect data not just from the production platforms, but from the subsea pump and pipe network, and even ‘downhole’, far beneath the seabed. The physical challenges this presents are enormous.
And this is only one side of the coin. Once the data has been collected, it needs to be converted into consumable information in a timely manner and communicated to its users. Significantly, there are often many users, each with different locations, requirements and objectives. Data communications between offshore and onshore are notoriously expensive, usually relying on high-latency satellite links.
Offshore, engineers are looking at production rates, while managers are looking at resource management. Onshore, more people are looking at refining, sales issues, coordination with other production sites, and so on. Then there is a myriad of agencies and regulatory bodies, each with its own agenda and data requirements.
For many years the industry has been trying to develop standard reporting procedures. While there has been progress in this direction, it is a moving target: the requirements are constantly changing and becoming more demanding. This particular beast is a many-headed hydra – there will always be multiple requirements, many of which will change frequently and may even appear to contradict one another. Therefore, offshore oil and gas data systems will be forever developing and improving.
Looking at the situation from a practical perspective, perhaps the most obvious challenge is to provide all the end-users with relevant data quickly and in a format suited to their needs. Another issue is that the volume of data is constantly increasing, while deadlines usually become tighter. Embedded computers located within offshore facilities can be quite powerful, making it possible to filter and simplify massive amounts of data before they are sent to the onshore players.
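As a rough illustration of that kind of edge-side filtering, the sketch below aggregates high-frequency pressure samples into per-minute summary records before they are queued for the satellite uplink. The structure names, units and window size are illustrative assumptions, not details of any particular offshore system.

```cpp
// Hypothetical sketch: collapse raw sensor samples into fixed-width window
// summaries so only a few bytes per window need to cross the satellite link.
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <limits>
#include <vector>

struct Sample {
    int64_t timestamp_ms;  // time of reading
    double  pressure_bar;  // raw sensor value
};

struct Summary {
    int64_t window_start_ms;
    double  min_bar;
    double  max_bar;
    double  mean_bar;
    int     count;
};

// Aggregate raw samples into windows of window_ms milliseconds.
std::vector<Summary> aggregate(const std::vector<Sample>& raw, int64_t window_ms) {
    std::vector<Summary> out;
    for (const Sample& s : raw) {
        int64_t window_start = (s.timestamp_ms / window_ms) * window_ms;
        if (out.empty() || out.back().window_start_ms != window_start) {
            out.push_back({window_start,
                           std::numeric_limits<double>::max(),
                           std::numeric_limits<double>::lowest(),
                           0.0, 0});
        }
        Summary& w = out.back();
        w.min_bar  = std::min(w.min_bar, s.pressure_bar);
        w.max_bar  = std::max(w.max_bar, s.pressure_bar);
        w.mean_bar = (w.mean_bar * w.count + s.pressure_bar) / (w.count + 1);
        ++w.count;
    }
    return out;
}

int main() {
    std::vector<Sample> raw = {
        {0, 210.4}, {15000, 210.9}, {30000, 211.2}, {60000, 208.7}, {75000, 209.1}};
    for (const Summary& w : aggregate(raw, 60000)) {
        std::cout << "window " << w.window_start_ms << " ms: " << w.count
                  << " samples, mean " << w.mean_bar << " bar, range ["
                  << w.min_bar << ", " << w.max_bar << "]\n";
    }
}
```

Sending one summary record per window instead of every raw sample is what keeps the volume crossing the expensive offshore-to-onshore link manageable, while the full-resolution data can remain stored locally for later retrieval.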
Inevitably then, there comes a point when the agility, speed and accuracy of the data system become crucial to effective decision-making. Data quality awareness needs to be developed among all personnel, and a data quality management (DQM) system needs to be implemented.
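To make the DQM idea concrete, here is a minimal sketch of the kind of automated checks such a system might run before readings enter the reporting pipeline. The rules shown (plausible value range, strictly increasing timestamps) and the limit values are assumptions chosen for illustration only.

```cpp
// Hypothetical sketch of basic data quality checks on a batch of readings.
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

struct Reading {
    int64_t     timestamp_ms;
    std::string tag;    // sensor identifier, e.g. "wellhead_temp"
    double      value;
};

// Returns human-readable quality issues; an empty list means the batch
// passed the basic checks (plausible range, strictly increasing timestamps).
std::vector<std::string> quality_issues(const std::vector<Reading>& batch,
                                        double min_valid, double max_valid) {
    std::vector<std::string> issues;
    int64_t last_ts = -1;
    for (const Reading& r : batch) {
        if (r.value < min_valid || r.value > max_valid)
            issues.push_back(r.tag + ": value " + std::to_string(r.value) +
                             " outside plausible range");
        if (r.timestamp_ms <= last_ts)
            issues.push_back(r.tag + ": non-increasing timestamp, possible duplicate");
        last_ts = r.timestamp_ms;
    }
    return issues;
}

int main() {
    std::vector<Reading> batch = {
        {1000, "wellhead_temp", 87.5},
        {2000, "wellhead_temp", 612.0},   // implausible spike
        {2000, "wellhead_temp", 88.1}};   // duplicate timestamp
    for (const std::string& issue : quality_issues(batch, -50.0, 250.0))
        std::cout << "DQ issue: " << issue << "\n";
}
```

Automated checks of this sort do not replace the organisational side of DQM, but they make it possible to flag suspect readings at the point of collection rather than after a report has already gone out.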
Following on from this, clear procedures and protocols need to be established to manage data-related issues as they arise, including procedures for correcting data errors quickly.
Undoubtedly there is a need for the offshore production industries to automate as many of their data handling systems as possible, to use the massive power of modern computers and established big data techniques to the full. However, it is not solely about technical solutions: successful data and information management systems first need clear direction and ownership. This needs to be embedded into senior management thinking, even though much of the day-to-day work will be carried out at operational levels. Information requirements and standardised work practices need to be worked out between all stakeholders including production engineers, technical and commercial management, directors, regulators and appropriate governmental agencies.
Constantly redeveloping data management systems for offshore production is a big ask, but one that cannot be shirked.
Click here to find out how RaimaDB Embedded database helped a demanding offshore application.