Best Practice in the IT world
I really don’t like the term “best practice”, because it implies that things can’t get any better. That isn’t true: it’s the constant changing, modifying and improving of standards, methods, techniques and processes that makes a best practice. This is especially true in the IT world.
I would like to introduce this topic with a non-IT example. A few months ago I was sick, and the illness caused some internal imbalance that eventually led to a heavy build-up of wax in one of my ears. It was so bad that the ear was completely blocked, and it stayed blocked for days. I had to go to the doctor.
I was expecting the doctor to use some special ear scope to find the wax build-up and then pluck it out with tweezers or some other long, thin medical instrument. It seemed like the obvious solution: quick and effective.
Instead, the doctor poured warm, soapy water into my ear. She poured and poured for about an hour and a half, until the wax started to soften and eventually washed away.
I asked her why she used this approach, and whether there weren’t faster methods available. She said there were, but this was the way she was taught.
I left the doctor’s office realising that doctors learn “best practices” right at the start of their careers; it’s part of their core training. They also don’t compromise on those practices, no matter how tight the deadline or what the client demands. It’s either best practice or nothing at all.
The IT world is not like that. Firstly, we are ruled by deadlines and client demands, but apart from that, we start our careers without knowing any best practices.
Take programmers, for example. Most don’t learn best-practice approaches to programming; they learn how to program, pick up a technology like ASP.NET and SQL, and then apply what they know to achieve the desired objective. For a junior programmer, it’s not about doing it right, it’s about getting it done.
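To make that contrast concrete, here is a small, hypothetical sketch (the table, column and method names are invented for illustration, not taken from any real project). Both snippets “get it done”: the first builds a SQL query by concatenating user input, which works in a demo but breaks on names containing quotes and is wide open to SQL injection; the second does the same thing “right” with a parameterised command, the kind of habit a best-practice culture instils.

using System.Data.SqlClient;

// “Getting it done”: the query is built by gluing user input into the SQL string.
// It runs, but a name like O'Brien breaks it, and malicious input can rewrite the query.
string FindCustomerEmailQuick(SqlConnection conn, string name)
{
    var cmd = new SqlCommand(
        "SELECT Email FROM Customers WHERE Name = '" + name + "'", conn);
    return (string)cmd.ExecuteScalar();
}

// “Doing it right”: the same query as a parameterised command.
// The driver handles quoting, so the input can never change the shape of the query.
string FindCustomerEmailSafe(SqlConnection conn, string name)
{
    var cmd = new SqlCommand(
        "SELECT Email FROM Customers WHERE Name = @name", conn);
    cmd.Parameters.AddWithValue("@name", name);
    return (string)cmd.ExecuteScalar();
}

Both versions satisfy the deadline; only the second survives contact with real users, which is exactly the gap that best practices are meant to close.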
Maybe it’s unfair of me to generalise and say that no programmers learn best practices. You do get would-be programmers who studied at a higher-learning institution for a few years, which gives them a good, all-round skill set: solid standards, an understanding of methodologies (like waterfall) and sound approaches to programming. But in IT, technology is constantly changing, tools are frequently updated and human behaviour keeps shifting. Whatever approach you defined six months ago may already be out of date and should be revisited.
Also, in IT there is no one-size-fits-all when it comes to best practices. Whatever you learnt at your higher-learning institution or previous job may not apply in your current working environment.
So how do we define best practices in the IT world?
Using IT-focused methodologies is a great start. The Rapid methodologies (which replace the waterfall approach) are a good place to begin (although there may be something newer out there by the time you read this article), and they provide a solid starting point for defining good standards and approaches. Understand, though, that any such methodology will most likely need to be refined to fit your environment, and this is where the project managers need to get involved.
Good project managers have received some sort of training (formal or informal) on how to do things right, and have learnt approaches and techniques that drive project success, including communication strategies and risk management (amongst others). Applying that knowledge to a project approach, and combining it with a proven methodology such as Rapid, is a strong move towards best practices.
After all that comes constant improvement. Review sessions during a project require a commitment that does not necessarily contribute to the current project’s success (which is why they are often skipped), but they can lead to future project success. Documenting what the team did right or wrong, and planning ways to avoid the pitfalls identified (for example: building checklists, implementing a governance strategy, forming a QA team, providing training), is really the “best practice” approach to best practices.