The recent DBmaestro Database DevOps Survey reveals a wealth of fascinating insights into the work of database professionals today.
DBmaestro contacted almost 1000 DevOps and database professionals around the world, seeking their views on the current state of database DevOps in their organisations.
Integrate Platforms
Respondents were nearly unanimous in their view that the database should be an integral part of DevOps pipelines – 97% of them agreed. However, most organisations are having a hard time turning that vision into reality: only 23% of respondents have managed to fully integrate the database into their DevOps pipelines.
In our experience, different organisations and different kinds of database bring their own challenges when it comes to integrating them into a CI/CD pipeline. Even the most modern database systems require careful management of changes to ensure integrity, security, and performance, and integrating the valuable data held in legacy systems can be a real challenge when there is a skills gap between the system administrators and the DevOps engineers. There is no one-size-fits-all solution: getting it right requires people with DevOps expertise as well as deep technical knowledge of the databases themselves, and it is difficult for organisations to corral all of those skills while developing new systems and keeping the business running.
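By way of illustration, here is a minimal sketch of one common building block for bringing database changes under pipeline control: versioned migration scripts applied automatically and recorded in the database itself, so every environment receives the same changes in the same order. It uses SQLite and a hypothetical migrations/ directory purely to keep the example self-contained – it is not any particular organisation's tooling.

```python
# A minimal sketch (not DBmaestro's or Triton's tooling) of the kind of
# versioned, automated migration step a CI/CD pipeline can run instead of
# relying on manual change scripts. It assumes a hypothetical "migrations/"
# directory of ordered SQL files such as 001_create_customers.sql.
import sqlite3          # stand-in for any production database driver
from pathlib import Path

MIGRATIONS_DIR = Path("migrations")   # hypothetical location of change scripts


def apply_pending_migrations(conn: sqlite3.Connection) -> None:
    """Apply, in order, any migration script not yet recorded in the database."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (filename TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT filename FROM schema_version")}

    for script in sorted(MIGRATIONS_DIR.glob("*.sql")):
        if script.name in applied:
            continue                      # already deployed; skip it
        conn.executescript(script.read_text())
        conn.execute(
            "INSERT INTO schema_version (filename) VALUES (?)", (script.name,)
        )
        conn.commit()
        print(f"applied {script.name}")


if __name__ == "__main__":
    # In a real pipeline this connection would point at the target environment;
    # a local SQLite file keeps the sketch self-contained.
    apply_pending_migrations(sqlite3.connect("example.db"))
```

Dedicated migration tools offer far more – rollback, drift detection, approvals – but even a step this simple removes the need for someone to run change scripts by hand in every environment.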
Accelerate Innovation
Releases are getting quicker. The days of one or two releases a year seem to be behind us – only 5% of respondents are releasing a few times a year. Almost half (44%) now release between once a quarter and once a month, and 42% are releasing even more often than that – between once a month and a few times per week. A handful of respondents – less than 5% – are reaching a frequency of multiple releases every day.
Modernising the testing and deployment processes for systems is paramount to achieving the agility that modern-day businesses need to succeed, and if the database is not part of those processes, it holds the entire organisation back. Organisations that embrace DevOps for Data benefit not only from the increased development velocity that comes from more efficient use of their resources and infrastructure, but also from better collaboration and feedback across teams, so that what is delivered better meets the needs of the business. It’s doing things better… but also doing better things.
Regardless of how fast they’re working now, database pros all want to work faster. Most respondents are aiming to accelerate their release cycles compared to where they are today.
Reduce Costs
When it comes to allocating their time, DBAs still devote much of their day to servicing manual changes. Nearly half of DBAs spend up to half of their working hours on manual database releases, 35% spend more than 75% of their time this way, and 3% spend their entire working day on nothing but manual releases.
Skilled database administrators are not cheap, but without processes in place to automate repetitive, mundane manual tasks, their time and skills are not being used effectively. With the database integrated into the development and delivery pipeline, database administrators are free to concentrate on adding value by enhancing system design and performance – better for the business, and a more rewarding role for them.
Increase Quality
Why do database errors occur? Different teams have different views on this question. Among DBAs themselves, undocumented hotfixes are regarded as the number-one culprit. However, IT leaders point the finger at partial updates, while developers and DevOps professionals argue that conflicts between teams are the root cause of errors.
There is no single cause of the errors that make it into the database; if there were, it would be easier to target and fix. What these findings really highlight is the breadth of benefits DevOps for Data brings to the quality of database-related deliverables. Automated quality analysis tools identify issues earlier in the development process, and defined release processes with quality and approval gates help to prevent poor-quality changes reaching production systems. But DevOps is about people as much as process: increased communication and co-operation between technical and business teams reduces the likelihood of poor-quality changes in the first place, and when changes do need to be made there is far less friction between teams that are already collaborating constantly to deliver the best product possible.
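As a concrete illustration of the kind of quality gate described above, the sketch below shows a pipeline step that scans pending migration scripts for risky patterns and fails the build if any are found. The patterns, the migrations/pending/ location, and the pass/fail behaviour are illustrative assumptions rather than the checks of any specific tool.

```python
# A minimal sketch of an automated quality gate: a pipeline step that scans
# proposed migration scripts for risky patterns and fails the build before
# they reach production. The rules and the "migrations/pending/" path are
# illustrative assumptions, not a real product's checks.
import re
import sys
from pathlib import Path

PENDING_DIR = Path("migrations/pending")   # hypothetical staging area for changes

# Each rule pairs a regular expression with the reason it should block a release.
RISKY_PATTERNS = [
    (re.compile(r"\bDROP\s+TABLE\b", re.IGNORECASE), "drops a table"),
    (re.compile(r"\bTRUNCATE\b", re.IGNORECASE), "truncates data"),
    (re.compile(r"\bDELETE\s+FROM\s+\w+\s*;", re.IGNORECASE),
     "contains a DELETE without a WHERE clause"),
]


def check_scripts() -> int:
    """Return the number of problems found across all pending migration scripts."""
    problems = 0
    for script in sorted(PENDING_DIR.glob("*.sql")):
        sql = script.read_text()
        for pattern, reason in RISKY_PATTERNS:
            if pattern.search(sql):
                print(f"BLOCKED: {script.name} {reason}")
                problems += 1
    return problems


if __name__ == "__main__":
    # A non-zero exit code fails the pipeline stage, so the change cannot
    # proceed to the approval gate until it has been reviewed or reworked.
    sys.exit(1 if check_scripts() else 0)
```

Because the step exits non-zero when it finds a problem, the pipeline stops before the approval gate and prompts a conversation between the teams, rather than letting a risky change slip quietly into production.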
Find out how Triton Consulting can help your organisation apply DevOps practices to data management. Visit DevOps for Data or get in touch.
You can also download the full DBmaestro survey here.