With many of the tools commonly used in a Continuous Delivery pipeline, Windows is not the original OS the tool was developed for. Although support and adoption are growing all the time, there can still be some pain points and gotchas in configuring some of them to work as you would expect on a Windows OS.
In this post, we’re going to combine two of the big hitters in this space
Recently, I have been fortunate enough to undertake some Azure training under one of the Solution Architects at Microsoft. We spent a lot of time focusing on Azure Service Fabric, Microsoft’s platform for developing microservice-based solutions. Much of what I heard reminded me of my experiences with Docker Swarm and Kubernetes, two very popular container orchestration platforms, and many of my questions in the training sessions centered on why I might want to use Service Fabric over one of these other platforms that I knew a little better.
One point the trainer made that has stuck with me since is: containers often contain microservices, but microservices are not containers. It was clear to me then
At the day job, one of my team’s current projects is a bespoke “serverless” script execution service for internal use, not unlike AWS Lambda or similar offerings. I’m not the main guy on this, but I’ve been involved in some interesting discussions about how we should control the execution environments. Ideally, they would be sandboxed and completely disposable, possibly only alive for the lifetime of the script they are executing. The obvious solution to this is to use containers.
The dominant scripting language amongst our user base is PowerShell, so we need to try
This past Tuesday, the annual State of DevOps Report for 2017 was released. The report is one of the most respected in the industry and attempts to measure and comment on the direction that a broad range of organizations travel on their DevOps journey. There are also usually some interesting comparisons and trends to analyze against the previous year’s results.
The report is
When I was a junior developer, the sort of books and blogs I read off-the-clock tended to be about design patterns, abstractions, test-driven development (TDD) and so on. These topics haven’t gone away, but these days we tend to focus more on rapid application delivery, continuous integration, feedback loops and continuous improvement in the process of developing software, not necessarily the software itself.
Much of the classic developer reading material has strong craftsman-like overtones, the excellent Pragmatic Programmer being the most obvious example. On the other hand, some of the more recent material, like Continuous Delivery or The DevOps Handbook, describes the process of developing software as a factory production line to be optimized, inspired by the lean manufacturing revolution at the likes of Toyota in the 1980s.
Is the change in imagery from the master artisan to the fine-tuned production line a sign of progress or something to be worried about? It looks like the software industry’s Industrial Revolution might have started.
A match made in heaven
If you are a regular reader, you will know just how much I have fallen for Golang recently. If not, see Fun with WebSockets in Golang for why I think it’s such a great language for writing backend services.
As explained in that blog post, my motivation for learning Golang originated with my experimentation with Docker. Golang programs are (usually) statically compiled to machine code, not bytecode, so no runtime such as the JVM or a Python interpreter is required to run them. This means that you can fit those programs into the smallest Docker containers possible for maximum density and reduced attack surface. Pair that with Golang’s performance (which is comparable to C++) and you have a match made in heaven.
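To make that concrete, here is a minimal multi-stage Dockerfile sketch of the idea, assuming a buildable `package main` in the current directory (the stage names and paths are illustrative): `CGO_ENABLED=0` produces a statically linked binary, which can then run in an empty `scratch` image with no OS layer at all.

```dockerfile
# Stage 1: compile a statically linked Go binary (no libc dependency)
FROM golang:1.20 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /app .

# Stage 2: start from an empty image and add only the binary
FROM scratch
COPY --from=build /app /app
ENTRYPOINT ["/app"]
```

The resulting image contains nothing but the compiled program, which is what keeps the footprint and attack surface so small.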
Today’s post isn’t exactly automation-related, but I’ve been having a lot of fun learning Golang over the last week or two and felt the need to share some of the things that I really like about the language and what I think its strengths are.
Unit testing and the reason for mocking
If you’ve read my definitions of unit testing and system testing in my Ten Tenets of Test Automation, you’ll know that the primary differences between the two are target audience and granularity of coverage. Unit tests seek to validate the intentions of developers at a very fine, narrow scope, often at the function or small class level.
There should be lots of these fine-grained checks to validate your code base. Having them helps you understand the impact of evolving requirements. Having to deal with change in the life of developing a product is as certain as
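The role of mocking in these fine-grained checks can be sketched in Go. This is a hypothetical example, not from the post itself: a `Clock` interface lets the code under test depend on an abstraction, so a test can substitute a fixed value for the real system time.

```go
package main

import "fmt"

// Clock abstracts time lookup so tests can inject a known value.
// (Hypothetical interface, for illustration only.)
type Clock interface {
	Hour() int
}

// Greeting is the unit under test: its behavior depends only on
// the injected Clock, never on the real wall clock.
func Greeting(c Clock) string {
	if c.Hour() < 12 {
		return "Good morning"
	}
	return "Good afternoon"
}

// fakeClock is a trivial mock: it always reports the configured hour.
type fakeClock struct{ hour int }

func (f fakeClock) Hour() int { return f.hour }

func main() {
	fmt.Println(Greeting(fakeClock{hour: 9}))  // Good morning
	fmt.Println(Greeting(fakeClock{hour: 15})) // Good afternoon
}
```

Because the mock is deterministic, the check validates the developer's intent at a narrow scope and keeps passing regardless of when it runs.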
You won’t need me to tell you that Docker has been a dominating force in automated infrastructure for the last couple of years. For the uninitiated, a container is an isolated, lightweight execution context for an app/service (and its dependencies) that shares a kernel with other containers. Because an app can be delivered in its deployed state and run consistently by any Docker host, using containers greatly reduces the scope for environmental issues, e.g. incompatible versions of libraries on the host machine, interfering external processes, etc. Containers are the biggest advancement in application delivery since the birth of server virtualization, so they’re worth learning how to use.
Back in February, I was given the chance to deliver a presentation for the BCS, the chartering body for IT and computing in the UK, on the evolution of the software development lifecycle as we race into the Cloud era.
Well, I say that. I was originally approached to do a talk about test automation, but as I was thinking about what I might be able to add to that arena, it occurred to me that the testing phase of the classical SDLC gets far more coverage from an automation sense than any other. Much of the modern thinking on how to deliver software efficiently automates much more of the process than just the testing. I began researching how the most progressive teams used automation to drive some of the lesser-covered phases, and a talk on how automation technologies are taking these over became much more compelling to me. Hopefully, the audience agreed!