Containers play a vital role in software development by getting software to run reliably across computing environments: from a developer's computer to test, staging, and production. Beyond that, companies today adopt container technology largely to improve developer efficiency.

When it comes to container technology, Docker is still the most popular choice, but Kubernetes isn't far behind. The open source project continues to be a reliable choice for container orchestration and management, and the most recent version of the container cluster manager includes a plethora of useful features that help teams optimize container usage in their organizations.

Brian Gracely, Red Hat's Director of Product Strategy, says the present state of containers has hit a tipping point, prompting businesses to be faster and more responsive to what's going on in the market. Too often, companies end up relying on individuals to keep systems secure, updated, and scaled appropriately. According to Gracely, Kubernetes aims to automate that work.

He also added that, having grown out of the big global web-scale cloud providers, Kubernetes has become applicable to many vertical industries and businesses. Companies can either apply Kubernetes to a new project or find a way to add it to their existing portfolios.

Key features of Kubernetes 1.5

According to the Kubernetes team, version 1.5 will benefit those who want to run a distributed database on Kubernetes. It includes features that help guarantee application disruption service-level objectives (SLOs) for stateful and stateless applications. Kubernetes 1.5 is also notable because PodDisruptionBudget and StatefulSet have moved into beta.

The Kubernetes team mentioned that these features make it easier to deploy and scale stateful applications, and to perform cluster operations like node upgrades without violating application disruption SLOs.

Basically, with StatefulSet, organizations can run stateful applications in containers on Kubernetes while giving each pod a stable identity and persistent storage, so customers get a consistent platform without compromising functionality. In addition, users are not forced to rewrite their applications in order to containerize them.
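To make this concrete, here is a minimal StatefulSet sketch. All names and images are hypothetical, and in Kubernetes 1.5 the beta object lives under the apps/v1beta1 API group; treat this as an illustration rather than a production manifest.

```yaml
# Hypothetical StatefulSet: three web pods, each with a stable identity
# (web-0, web-1, web-2) and its own persistent volume claim.
apiVersion: apps/v1beta1
kind: StatefulSet
metadata:
  name: web
spec:
  serviceName: "web"        # headless service that governs the pod identities
  replicas: 3
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: nginx
        image: nginx:1.11
        ports:
        - containerPort: 80
  volumeClaimTemplates:     # one persistent volume per pod, surviving restarts
  - metadata:
      name: www
    spec:
      accessModes: ["ReadWriteOnce"]
      resources:
        requests:
          storage: 1Gi
```

Because each replica keeps its name and its volume across rescheduling, stateful workloads like databases can run in containers without being rewritten.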

The PodDisruptionBudget beta is an API object that specifies the minimum number (or percentage) of replicas of a collection of pods that must remain up. It allows the application deployer to ensure that cluster operations don't voluntarily evict so many pods simultaneously that they cause data loss or service degradation.
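A PodDisruptionBudget is only a few lines of YAML. The sketch below uses hypothetical names; in Kubernetes 1.5 the beta object sits under the policy/v1beta1 API group.

```yaml
# Hypothetical PodDisruptionBudget: voluntary evictions (e.g. node drains)
# may not reduce the matching pods below two available replicas.
apiVersion: policy/v1beta1
kind: PodDisruptionBudget
metadata:
  name: zookeeper-pdb
spec:
  minAvailable: 2           # may also be a percentage, e.g. "60%"
  selector:
    matchLabels:
      app: zookeeper
```

With this in place, an operator draining nodes for an upgrade is blocked from evicting a pod whenever doing so would drop the application below its disruption budget.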

Another promising feature is Federation, which essentially allows the user to pair one Kubernetes environment with one or more others, making them appear as a single pool of resources. This lets organizations supplement their data centers with additional public cloud resources and explore environments beyond their own data centers, while retaining control and visibility over what those environments should look like.

Limitations of Kubernetes 1.5

Certain features, including StatefulSet, are still in beta, though development is progressing rapidly. The project also doesn't want to trouble users with upgrades every three months, and is currently figuring out a reliable upgrade process.

Written by: Ajeesh Azhakesan

Kentico, an esteemed content management and digital experience platform provider, recently released the latest version of its popular ASP.NET CMS platform, Kentico 10, and there is more to it than meets the eye. Karol Jarkovsky, Director of Product at Kentico, talked about the issues the new Kentico 10 can address, and what the future holds for the product, in a Skype interview with Laura Myers, a tech reporter at CMS-Connected.

The company is known for the research it does prior to developing and releasing products. According to the Director of Product, their research found marketers, content creators, and developers frustrated and overwhelmed by the sheer complexity of the various CMS tools they have to use, while not getting the advertised benefits of those tools.

This was stated as one of the reasons they developed the latest Kentico version as an intuitive tool that solves the aforementioned issue by properly aligning marketers and developers, while allowing both to get their jobs done faster.

Pain points

Karol also explained the three primary issues, or pain points, Kentico noticed in the market during its research, which it set out to solve with Kentico 10.

  • Many organizations want to increase productivity but lack the necessary technology to do just that
  • CMS lacks the capability to deliver enough context for businesses to achieve their goals
  • CMS either doesn’t scale as needed or demands a price to do so


First pain point – Improving productivity

He further explained how the new Kentico addresses each pain point, emphasizing the importance of improving productivity. As an example, content authors spend about 10-15 minutes searching for the right image, and then spend more time optimizing it for various channels and layouts. Kentico 10 offers a solution in the form of responsive image management, which lets users predefine image variants. Users simply tell the system which variants they want created, and each image is automatically optimized accordingly on upload. This improves productivity, addressing the first pain point.

Second pain point – Context

Regarding context, Kentico pointed out that marketers struggle the most with three main issues.

  • Converting prospects into customers
  • Increasing existing customer value
  • Attaining an ROI through marketing

The success of marketing campaigns directly influences the chances of achieving these goals. Marketers learn and improve through failures, but the available tools don't let them tweak campaigns that are already running based on new findings; they have to conclude the campaign and start a new one from scratch. Kentico identified a workaround: let marketers tweak the campaigns themselves.

As a solution, Kentico added a campaign management tool in Kentico 10 that enables marketers to tweak running campaigns on the go, rather than ending them and starting all over again while losing data in the process. Karol also mentioned that they added an analytics-like capability to Kentico 10 in the form of 'campaign journey tracking'. This gives marketers an analysis of a running campaign, enabling them to identify the points in the campaign with the highest visitor drop-off. Those are the points of failure, and the steps in the campaign that can be improved. Though this solution addresses the context pain point, it also aids productivity.

Third pain point – Scalability

Data sets in organizations today are larger than ever. An estimated 53% of the data in these organizations is left unanalyzed, 45% of the organizations surveyed were unable to analyze data across disparate sources, and 42% don't feel they can analyze their data well enough to gain useful insights.

The data will only keep growing. Factor in the Internet of Things (IoT), with over 20 billion connected devices expected in the world by 2020, and there could be a great increase in data volume and data fragmentation. A tool that scales along with growing data can make a significant difference. Keeping scalability in mind, Kentico made sure that the latest release of its CMS is the most scalable version to date.

The Future of Kentico

Karol also spoke about the future of Kentico in the interview, and what they plan to do now that Kentico 10 is out. He revealed that they have already started working on Kentico 11, which will feature an improved ecommerce solution focused on integrations. As organizations generally opt for reputed, standalone ecommerce solutions, Kentico plans to provide integrations with those popular B2B/B2C ecommerce platforms.

This would be particularly beneficial to mid-market organizations, as they would get a stable, effective CMS that pairs well with their ecommerce functionality. The Director of Product also revealed that Kentico plans to work with ecommerce market leaders to develop optimal solutions that deliver the best of both CMS and ecommerce, and added that they will be improving data visualization and the marketing dashboards as well.

As Kentico’s Bronze Partner, Verbat looks forward to more innovative CMS solutions from Kentico that cater to the needs of the market and help organizations meet or exceed the high level of customer expectations.

Written by: Ratheesh V S

The right kind of project management methodology can do wonders in software development. However, when it comes to choosing one, people would want a lot of questions answered.

What is this Agile methodology we’ve been hearing about?

I’ve heard it’s quite similar to Lean methodology. Is it?

Does Agile actually mean Scrum?

Agile and Lean have been around for a very long time, but the choice between the two has always been under debate. There is a relationship between them that is often misunderstood.

This is just a simple analysis of what each means, and how they are connected. But first, their history…


Lean was originally derived from ‘Lean Manufacturing’.

Lean Manufacturing, according to Wikipedia, is a method devised for the elimination of waste within a manufacturing system.

The Lean methodology, from a software project management perspective, is basically a set of principles that will help achieve speed, quality and customer satisfaction.

You may have heard something similar about Agile development methodology as well. Therein lies the source of confusion.


Back in the day, there were some hardcore methodologies in software development. They were popular all right, but they kind of defeated their own purpose. How?

You see, these methodologies started stultifying software projects, preventing them from delivering the results they were meant to. Software projects were meant to create software that helped the customer; with those methodologies, things went sideways. It was then that the Agile Manifesto was formulated as a reaction, or better yet, a solution.

So Agile basically refers to the principles proposed in the Manifesto.

The confusion

Mary Poppendieck, who had worked in a manufacturing plant, teamed up with her husband Tom Poppendieck, a software developer, to figure out how to adopt some manufacturing principles in the software realm. That intellectual concoction, ladies and gentlemen, is what we call Lean software development.

Interestingly, Mary was also a founding board member of the Agile Alliance, which introduced the Agile concept. Strong Lean-influenced ideas obviously had a key role in the origin of the Agile Manifesto.

Lean and Agile

The history lesson is over. Time to focus on the topic at hand.

So the ideas from Lean manufacturing have been applied to software development, and this in part defines the way Agile works. Hence the similarity: both Lean and Agile are based on a combination of a customer-centric approach and adaptive planning. As a matter of fact, Lean manufacturing did influence Agile enthusiasts, or Agilists as they are commonly called.

Lean is about eliminating waste, right? So in the software realm, the idea is to eliminate anything that doesn’t add value and work on only what is absolutely necessary. That ‘anything’ could include documentation, meetings, certain tasks, and so on. Inefficient ways of working are also eliminated, thereby delivering results faster.

Here’s what one can derive from Mary Poppendieck’s words.

Lean = Rapid Response + Quality + Discipline + Speed-to-market

Agile, on its own, focuses on short iterations: working software is delivered at the end of each iteration after thorough testing, scrums, sprints, and a lot of other practices.

The Poppendiecks’ approach was to blend Lean with Agile. This Lean-Agile combo includes the iterations of Agile development and the validation practices of Lean. The two are deeply entwined when applied to a software development environment; one isn’t exactly an alternative to the other. So, in a nutshell, if you are using Agile, you are using Lean as well, and vice versa.

The point of interest is where you decide to use the ideas from Lean manufacturing in your Agile methodology. So there isn’t actually a victor in this face-off.

To conclude, there isn’t a big difference between the two except at the core. Lean software development focuses on eliminating waste so as to improve processes, while Agile software development methodologies adhere to the principles in the Agile Manifesto (e.g. Scrum).

With that said, a good Agile development team will adopt the best technical and management practices (which will include the principles of Lean as well) that work best for them, and leave their customers satisfied in the end.

Written by: Prashant Thomas

Creating, packaging and deploying software are three stages critical to a release management lifecycle. Although there are a number of reliable container technologies enterprise developers can rely on for this purpose, one stands out from the rest for its “Build, Ship and Run” vision, ending up as one of the most popular open source technologies today: Docker.

With Docker, creating, deploying and running applications is considerably easier. This benefits developers and system administrators alike, and is the main reason Docker fits so naturally into a DevOps environment.

DevOps is something that covers the entire delivery pipeline.

According to Wikipedia:

DevOps is a culture, movement or practice that emphasizes the collaboration and communication of both software developers and other information-technology (IT) professionals while automating the process of software delivery and infrastructure changes.

Because of this, it’s obvious that DevOps can involve multiple sets of tools to fulfil its purpose. This article explores how Docker fits into a DevOps ecosystem, and how the ecosystem benefits from it.

DevOps teams can breathe easy

With Docker’s rapid growth into a reliable environment for software development, software companies, having acknowledged its potential, have started using Docker in DevOps environments more and more. DevOps teams seem to benefit the most, as they can now efficiently configure both development and test environments thanks to Docker. This in turn results in successful release management.

Before the inception of Docker…

Back when Docker was just an idea, developers, testers and the operations team had to rely on complex tools for configuration management. The complex integrations, and the issues that may arise from them, complicated things further.

There are a lot of environments involved, and they all have to be aligned for things to work; that in itself takes a lot of effort. With Docker, the team simply starts with a base image that stays the same across environments, from development through testing.
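That shared base image is usually expressed as a Dockerfile. The sketch below is hypothetical (the application name, port, and files are invented for illustration), but it shows the idea: the same image definition is built once and run unchanged in development, test, and production.

```dockerfile
# Hypothetical image definition shared by every environment.
FROM python:3-slim            # the common base image

WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependencies baked into the image

COPY . .
EXPOSE 8000
CMD ["python", "app.py"]      # same entry point everywhere
```

Because the image carries its dependencies with it, "works on my machine" problems largely disappear: whatever a developer tests locally is byte-for-byte what operations deploys.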

After the inception of Docker…

A lot happened. But let’s talk about where it fits in DevOps. Docker in DevOps is both a platform and a tool.

For developers, it’s a platform where they can run the applications.

For operations people, it’s a tool that facilitates integration with the workflow.

In the end, both will be able to work with and deploy the same code. Normally, after the developer is done with development and testing, the operations people are tasked with deploying the application. If an issue then arises that never appeared during the development phase, the operations people lose sleep over it.

With Docker, there would be no friction when the ops team prepares to deploy the application after development and testing. It will be seamless.


Being open source has its perks. For Docker, it means you can have additional out-of-the-box features, and support from a big community. And unlike a virtual machine, there is no hypervisor in Docker, which is why it’s considered lightweight. This is also why launching a Docker container is very fast. Build, test, run and deploy on the go at an impressive pace.

For a DevOps ecosystem, a Docker-based pipeline reduces the risks of software delivery while cutting costs. It ensures timely delivery, and satisfied developers and operations staff.

Written by: Prashant Thomas

There has been a lot of discussion on the internet about Service Oriented Architecture (SOA) and Microservices lately. There have been plenty of debates as to what makes them different from one another, and which of the two is better, with many valid arguments on both sides.

While some consider microservices the architectural style of the future, many others still prefer SOA. Let’s get an idea of our contenders so we can come to a conclusion.


Microservices

Considered the modern-day go-to architectural style for developing highly scalable applications, microservices address quite a lot of the problems associated with large, cumbersome applications. It’s a service-based architecture with independently deployable services as the primary components.

It provides better control throughout the development, testing and implementation cycles, but has a limited service taxonomy when it comes to service type classifications. It also makes use of lightweight inter-service communication protocols (typically REST with JSON).


SOA

SOA can be defined in many different ways because this architectural style has been constantly evolving over the years. It was designed to bring order to sophisticated combinations of enterprise-level software by representing them as collections of services. SOA also uses service communication protocols, and can be considered a superset of microservices.

It relies on a shared data model. The model will have complex relationships between numerous data structures and models, and multiple hierarchies. The tiered organizational structure of SOA facilitates service coordination and messaging functionalities.

Now that you have a basic idea, let’s get ready to rumble. For starters, let’s make this a three-round bout.

Round 1: Services Decoupling – SOA is based on a shared data model, so you can expect tight data coupling between services and other system components. This makes it quite resistant to change; additional re-testing might be necessary in some instances to make sure changes haven’t negatively affected any service.

Microservices architecture runs on a concept referred to as bounded context, which promotes an association between a single service and its data. It isn’t possible to completely eliminate sharing, but it can be considerably minimized: where sharing would otherwise be required, common functions are replicated across services instead of sharing data. This data decoupling both facilitates more frequent deployments and cuts down the scope of testing.

Round 2: Messaging Middleware of SOA vs Microservices’ API Layer – SOA’s multi-tier model features a central messaging middleware layer. As for microservices, there is a non-coordinating API layer over the services that constitute an application.

Messaging Layer key points:

  • Additional capabilities, including message transformation, mediation, and routing
  • Elevated degree of data and functional coupling
  • Increased complexity
  • Increased deployment and maintenance costs

API Layer key points:

  • Simpler than messaging layer of SOA
  • Easy to change granularity of services
  • Easy to change internal data representations
  • No modification required of the requesting application (for both changes)

Round 3: Coordination of Services – With a central hub controller, SOA maintains order in the execution of services. Microservices use inter-service communication protocols for the same.

A microservice can call another microservice whenever necessary to complete its function, and the called service can in turn call other services (a process called service chaining). Too much chaining should be avoided, as it indicates a degree of functional coupling without the benefits of decoupled services.
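Service chaining can be sketched in a few lines. In the toy Python example below, each "service" is modelled as a plain function standing in for a network endpoint (all service names and data are hypothetical): the order service calls the inventory service, which in turn calls the pricing service, forming a two-link chain.

```python
# Toy model of service chaining: each function stands in for a separately
# deployed microservice reachable over the network.

def pricing_service(sku: str) -> dict:
    """Leaf service: returns the price for a SKU."""
    return {"sku": sku, "price": 9.99}

def inventory_service(sku: str) -> dict:
    """Second link of the chain: enriches its response by calling pricing."""
    price = pricing_service(sku)
    return {"sku": sku, "in_stock": True, "price": price["price"]}

def order_service(sku: str) -> dict:
    """Entry point and first link of the chain: calls inventory."""
    item = inventory_service(sku)
    return {"sku": sku, "accepted": item["in_stock"], "total": item["price"]}

result = order_service("ABC-123")
print(result)
```

Notice the coupling the chain creates: if the pricing service changes its response shape, both callers upstream break, and every extra link adds latency and another point of failure. That is exactly why long chains are discouraged.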

Conclusion and Verdict

Both subjects in question held their own across the three rounds. However, one cannot replace the other, as there are many other variables that further distinguish microservices from SOA.

While SOA can address a set of heterogeneous applications in sophisticated enterprise systems, facilitating shared services across applications and functions, microservices are an optimal approach for smaller, less complex, web-based applications that don’t require explicit service coordination.

Because of the granularity and high independence of services, the microservices model fits well with continuous deployment models of software development.

The verdict from the above arguments tilts slightly in favor of microservices, but both can be highly effective depending on the context in which they are used. The microservices approach can deliver agile applications that can, if necessary, evolve into an SOA-styled architecture.

SOA, on the other hand, can apply microservices principles for better maintenance and performance.

Conclusion? Both are effective depending on the working environments. It’s a tie.

Written by: Prashant Thomas