Well, if it isn’t a great time to be a Mac admin…

Especially when there’s a plethora of tools available to manage the operating system. Open source has a strong presence here as well, contributing many tools that make OS management easier for admins. Here are a few worth taking note of.

Application Deployment Tools


Munki

Less a single app than an ecosystem, Munki deploys imported applications to client machines. Most of its logic runs in the client software installed on each machine, and it deploys apps in a very refined manner, surfacing them to users through Managed Software Center.app.

The repository of apps and metadata can be stored on any web server, regardless of platform. Munki also supports selective deployment, letting admins offer packages as ‘optional installs’ or enforce a ‘force install by date’.

Other features include:

  • Acting as a portal for internal apps
  • Side panel links
  • Customizable headers

It can do so much more, but it can’t all be covered in a single article.
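To give a flavor of how deployments are described, here is a minimal Munki client manifest sketch (the item names are illustrative): items under managed_installs are installed unconditionally, while those under optional_installs show up as self-service choices in Managed Software Center.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>catalogs</key>
    <array>
        <string>production</string>
    </array>
    <key>managed_installs</key>
    <array>
        <string>Firefox</string>
    </array>
    <key>optional_installs</key>
    <array>
        <string>VLC</string>
    </array>
</dict>
</plist>
```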


Reposado

A faithful stand-in for the Software Update Service in Apple’s Server products, Reposado is a set of Python tools for caching Apple software updates and serving them internally, saving a lot of bandwidth in the process. It also lets admins release updates to a test group before deploying them to everyone.

Client Management


osquery

osquery, Facebook’s operating-system instrumentation framework, exposes the OS as a relational database, as its GitHub page puts it. This lets users write SQL queries to explore operating system data: tables can represent open network connections, file hashes, loaded kernel modules, and so on.

Google Santa

Still in beta, Google’s Santa is an effective tool for whitelisting and blacklisting binaries on a macOS system. It keeps track of binary executions in the OS, checks each binary against a defined database, and then determines whether to block or allow it.

The tool operates in two modes:

  • Monitor mode – The default mode, which allows all binaries to run except those explicitly blacklisted.
  • Lockdown mode – The mode that allows the execution of only the whitelisted binaries.

Santa is a centrally managed tool, and as such can help prevent malware from spreading across a number of machines.
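The decision logic of the two modes can be sketched in a few lines of Python. This is an illustration only; real Santa hooks binary execution in the kernel and checks rules from a centrally managed database, and the hashes below are of made-up byte strings:

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hash a binary's contents, as Santa-style rules key on binary hashes."""
    return hashlib.sha256(data).hexdigest()

ALLOWLIST = {sha256(b"trusted-binary")}   # explicitly whitelisted binaries
BLOCKLIST = {sha256(b"known-malware")}    # explicitly blacklisted binaries

def decide(binary: bytes, mode: str) -> str:
    digest = sha256(binary)
    if mode == "monitor":
        # Monitor mode: everything runs unless explicitly blocked.
        return "BLOCK" if digest in BLOCKLIST else "ALLOW"
    # Lockdown mode: only whitelisted binaries run.
    return "ALLOW" if digest in ALLOWLIST else "BLOCK"

print(decide(b"some-unknown-binary", "monitor"))   # -> ALLOW
print(decide(b"some-unknown-binary", "lockdown"))  # -> BLOCK
```

An unknown binary is allowed in monitor mode but blocked in lockdown mode, which is exactly the trade-off the two modes represent.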


It really is a great time to be a Mac admin.

Written by: Ajeesh Azhakesan

Many enterprises are so convinced of the benefits of open source technologies that they have already adopted them to run their businesses. Others, however, remain hesitant to use open source.

Though IT workers are happy to deploy the technology wherever possible, it’s still a hard sell with management, partly because management tends to be resistant to change. Why would they want to fix something that isn’t broken? While everything is already functioning properly, they see no reason to bring in open source technologies.

Another reason could be that open source software is, indeed, open – neither bought nor sold. Then there is the general perception that “free” doesn’t necessarily mean “good”.

Ironically, open source software provides a lot of benefits to enterprises that use it properly. Here are a few reasons that explain how an enterprise benefits from using open source technology.

  • Cost – Back in the day, vendors of proprietary software spent a lot of money convincing enterprises that proprietary software was more beneficial, easier to maintain, and less expensive to deploy. The relatively new (at the time) open source technology couldn’t compete with that.

Times have changed, however. It no longer takes much effort to prove that open source is everything it claims to be and more: a better option than expensive proprietary software.

Open source software is generally free; users need only pay for support, and open source vendors generally charge a fraction of what proprietary vendors charge. Present-day open source software is also adaptable enough to overcome the challenges that arise when new applications are deployed.

  • Versatility – Not all proprietary software is versatile, whatever salespeople may say before getting an enterprise to sign the contract. Proprietary products are essentially off-the-shelf solutions, effective only in a limited set of use cases. Yes, most vendors offer a 30-day free trial, but it might take longer than 30 days for an enterprise to realize the software lacks a specific feature needed to tackle a recurring challenge. Contacting the vendor rarely helps; they may not even understand why the enterprise would want such a feature.

With open source software, the enterprise can take as much time as it needs to evaluate the product. And because it gets access to the source code, it can identify where the software lacks a feature the firm needs, and address that gap however it sees fit.

  • Scalability – Open source products for enterprises can scale to large proportions. The release of Kubernetes enhanced this, allowing enterprises to scale up when demand rises and down when it drops. Though this mostly helps large enterprises, even small companies benefit: they won’t have to rely on other platforms for scaling, and can do so without hassle when they hit paydirt.
  • Security – The security of open source has always been a subject of debate. Because everyone gets access to the source code, people with malicious intent can locate and exploit vulnerabilities, which can spell disaster for enterprises. But the open source community begs to differ: because a large number of developers and security experts can read the source code, vulnerabilities tend to be identified sooner.

Once a vulnerability is identified, work on a patch begins immediately. In practice, open source software is most vulnerable on systems that aren’t properly configured or patched.



New proprietary software isn’t released at the rate it once was, which is why experts think open source is the future. Although the trend is still picking up pace, there remains a question of quality. But enterprise open source isn’t code thrown together by students for college projects: Google started the Kubernetes project, and NASA had a role in the development of OpenStack. Big players have already started favoring open source, which makes this the right time for enterprises to adopt the technology and future-proof themselves.

Written by: Ajeesh Azhakesan

The Internet of Things is growing faster than initially expected, transforming businesses along the way. IoT opens the door to many possibilities for businesses: innovative ways to connect with customers, data converted into forms suitable for decision-making, and new avenues for enhancing the customer experience. What’s genuinely surprising is the influence of open source technologies on IoT.

Analyzing the Scope of IoT

IoT isn’t actually new. It has been around for a long time in fields like the military and space research, but it gained huge popularity once it crossed over to the consumer sector. The fundamental goal of IoT is to change the way people interact with the world. And it’s not just the data that keeps increasing; the scale of devices around us is huge. Pretty much everything generates data now, i.e. data has become ambient.

IoT now influences many sectors, including medicine, manufacturing, industrial automation, and security. But considering its scope, it becomes apparent that IoT is still far from achieving its true potential. For enterprises, the main hindrance is the set of challenges they face, and this is where open source comes in: so far, it has driven much of the technology’s progress, helping it become the disruptive trend it is now.

Why Open Source is Right for IoT

Linux is one of the most dominant platforms in IoT, and plenty of IoT devices run open source operating systems. Data centers serving IoT devices also prefer open source, as it ensures interoperability and essentially future-proofs the ecosystem.

Experts predict that enterprises will invest considerably more in IoT this year, though there are still concerns about interoperability. According to a report by the McKinsey Global Institute, interoperability is vital to the progress of enterprise IoT adoption. The report estimates IoT’s total potential economic impact at between $3.9 trillion and $11.1 trillion a year by 2025, and interoperability makes a significant difference within that estimate.

Given the importance of interoperability, we can safely say that open source is a must-have for sustaining the growth of the IoT market. The majority of IoT platforms today run on cloud-based, software-driven architectures, and open source already dominates the cloud, underlining its importance to the IoT ecosystem, even though platforms currently ship with both open and closed standards. To meet the needs of a wide array of consumers and stakeholders, interoperability between the systems involved is unavoidable, which further adds to the value of choosing open standards.

Apart from this, many businesses are questioning the wisdom of building IoT on proprietary platforms that may disappear in the near future or prove too closed to interconnect. Several surveys already show that most respondents are betting on open source platforms for their IoT projects.

What it Means for Developers

The number of IoT developers keeps increasing every year and is estimated to cross 4 million by 2020, with a good number of them favoring open source. For these developers, enterprise IoT proves more challenging and complicated than consumer IoT, since an enterprise application involves far more sensors. Making applications customizable is a challenge as well.

Open source IoT and its advancements let developers overcome most of these challenges and build solutions that promise an immersive experience for customers. Enterprise software developers gain more opportunities to effectively design and develop enterprise IoT applications, contributing to the overall growth of the Internet of Things.

Written by: Ajeesh Azhakesan

Containers play a vital role in software development by getting software to run reliably across computing environments: from a developer’s computer to test, staging, and production. Companies today adopt container technology largely in pursuit of developer efficiency.

When it comes to container technology, Docker is still the most popular choice, but Kubernetes isn’t far behind. The open source project continues to be a reliable choice for container orchestration and management, and the most recent version of the container cluster manager includes a plethora of features that will help teams optimize container usage in their organizations.

Brian Gracely, Red Hat’s Director of Product Strategy, says containers have hit a tipping point, driven by businesses’ need to be faster and more responsive to what’s going on in the market. Companies end up relying on individuals to keep systems secure, updated, and appropriately scaled; Kubernetes, according to Gracely, aims to automate this.

He also added that, having grown out of the big global web-scale cloud providers, Kubernetes has become applicable to many vertical industries and businesses. Companies can apply Kubernetes to a new project or find a way to add it to their existing portfolios.

Key features of Kubernetes 1.5

According to the Kubernetes team, version 1.5 will benefit those who want to run a distributed database on Kubernetes. It includes features that help guarantee application disruption service-level objectives (SLOs) for both stateful and stateless applications. Kubernetes 1.5 is also notable because PodDisruptionBudget and StatefulSet have moved into beta.

The Kubernetes team notes that these features make it easier to deploy and scale stateful applications, and to perform cluster operations such as node upgrades without violating application disruption service-level objectives.

Essentially, StatefulSet lets organizations take existing applications and run them in containers on Kubernetes, giving customers a consistent platform with no compromise in functionality, and without forcing users to rewrite their applications for containers.
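A minimal StatefulSet manifest might look like the following sketch (names and image are illustrative; in the 1.5 era the object lived under the apps/v1beta1 API group, and each replica gets a stable identity, web-0 through web-2):

```yaml
apiVersion: apps/v1beta1   # API group used for StatefulSet in the 1.5 betas
kind: StatefulSet
metadata:
  name: web
spec:
  serviceName: web         # headless service that gives pods stable DNS names
  replicas: 3
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: web
        image: nginx
        ports:
        - containerPort: 80
```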

The PodDisruptionBudget beta is simply an API object that specifies the minimum number (or percentage) of replicas of a collection of pods that must remain up. It lets the application deployer ensure that cluster operations don’t voluntarily evict so many pods at once that they cause data loss or service degradation.
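As a sketch, such a budget might be declared like this (names are illustrative; policy/v1beta1 was the API group at the time). Voluntary evictions that would drop the matched pods below minAvailable are refused:

```yaml
apiVersion: policy/v1beta1   # API group used for PodDisruptionBudget in beta
kind: PodDisruptionBudget
metadata:
  name: db-pdb
spec:
  minAvailable: 2            # an absolute count, or a percentage like "60%"
  selector:
    matchLabels:
      app: db                # pods this budget protects
```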

Another promising feature is ‘Federation’, which lets users pair an individual Kubernetes environment with one or more others so that they appear as a single pool of resources. Organizations can thus extend their data centers with additional public cloud resources, exploring environments beyond their own data centers while retaining control and visibility over what those environments look like.

Limitations of Kubernetes 1.5

Certain features, including StatefulSet, are still in beta, though development is progressing rapidly. The team also doesn’t want to burden users with upgrades every three months, and is currently working out a reliable upgrade process.

Written by: Ajeesh Azhakesan

Creating, packaging, and deploying software are three stages critical to the release management lifecycle. Although enterprise developers can rely on a number of container technologies for this purpose, one stands out for its “Build, Ship and Run” vision, having become one of the most popular open source technologies today: Docker.

With Docker, creating, deploying, and running applications is considerably easier. Developers and system administrators benefit the most, which is the main reason Docker is finding a place in DevOps environments.

DevOps is something that covers the entire delivery pipeline.

According to Wikipedia:

DevOps is a culture, movement or practice that emphasizes the collaboration and communication of both software developers and other information-technology (IT) professionals while automating the process of software delivery and infrastructure changes.

It follows that DevOps can draw on multiple sets of tools to fulfil its purpose. This article explores how Docker fits into a DevOps ecosystem, and how the ecosystem benefits from it.

DevOps teams can breathe easy

As Docker has rapidly matured into a reliable environment for software development, software companies have acknowledged its potential and increasingly use it in DevOps environments. DevOps teams seem to benefit the most: thanks to Docker, they can efficiently configure both development and test environments, which in turn leads to successful release management.

Before the inception of Docker…

Back when Docker was just an idea, developers, testers, and operations teams had to rely on complex tools for configuration management. The complex integrations, and the issues arising from them, complicated things further.

There are a lot of environments involved, and they all have to be aligned for things to work, which in itself takes a lot of effort. With Docker, the team simply starts from a base image that stays the same across environments, from development through testing.
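That shared image is typically defined once in a Dockerfile. A minimal sketch for a hypothetical Python app (the file names and entry point here are illustrative) might look like this:

```dockerfile
# One image definition, identical across dev, test, staging and production.
FROM python:3.9-slim
WORKDIR /app

# Copy and install pinned dependencies first, so this layer is cached
# between builds when only application code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and define how the container starts.
COPY . .
CMD ["python", "app.py"]
```

Every environment then runs the exact same image, rather than re-creating its configuration by hand.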

After the inception of Docker…

A lot happened. But let’s talk about where it fits in DevOps. Docker in DevOps is both a platform and a tool.

For developers, it’s a platform where they can run the applications.

For operations people, it’s a tool that facilitates integration with the workflow.

In the end, both sides work with and deploy the same code. Normally, after the developer is done with development and testing, the operations people are tasked with deploying it; if an issue then arises that never appeared during development, it’s the operations people who lose sleep.

With Docker, there is no friction when the ops team prepares to deploy the application after development and testing. The handoff is seamless.


Being open source has its perks. For Docker, it means additional out-of-the-box features and support from a big community. And unlike a virtual machine, a Docker container involves no hypervisor, which is why it’s considered lightweight and why launching a container is so fast: build, test, run, and deploy on the go at an impressive pace.

For a DevOps ecosystem, a Docker-based pipeline reduces the risks of software delivery while cutting costs. It ensures timely delivery, and satisfied developers and operations staff.

Written by: Prashant Thomas