Microsoft Program Manager Mads Torgersen recently posted about the company's plans for its popular .NET languages: C#, Visual Basic, and F#. And things look uncertain for Visual Basic.

The plans were later elaborated in detail through a series of blog posts. There will be fundamental changes that look favorable to C# and F#. Though there were no negative statements about VB's future, the overall tone suggests that a bumpy road awaits VB.

Where they stand now…

According to Torgersen, C# is the most popular of the three, and will keep evolving as a state-of-the-art programming language. He added that F# is steadily climbing the ladder, and could become the best-tooled functional language on the market. VB is apparently the odd one here, as it is now mostly used for Windows Forms applications and in ASP.NET development by new developers. But many surveys indicate that developers are moving from VB to C# owing to the latter’s richer ecosystem.

Microsoft's reason for changing its development strategy could be the fact that C# and VB have different audiences. They found that new developers in Visual Studio favor VB, as it's a more comfortable way for them to learn the tricks of the trade. VB is still a tool for client-server programming, while C# has evolved into a language for n-tier web-based applications.

F#, on the other hand, takes a different, more independent path. Microsoft will reportedly take measures to make F# more useful in the .NET landscape, hopefully with better error messages and support.

Where they are going

C# and VB will be going their separate ways from now on. VB's popularity is dwindling, and it may eventually disappear from the radar for good. Evidently, C# is diverging from VB. As VB will support the standard base class library set of the .NET Framework in the not-too-distant future, we can expect some amount of cross-platform work in the language. However, not all VB code will be portable: some can be shifted to the smaller set of libraries, while the rest remains in on-premises applications. F# will find its place in financial services and machine learning applications.

What’s in it for enterprise developers

For starters, they will have two important choices.

  • Move VB code to newer platforms
  • Start using C#

The second choice gives them access to a plethora of target frameworks across many devices. C# can also directly address the .NET Core platform via its APIs, not to mention other platforms. This means Unity and other platforms built on C# will be capable of supporting their own APIs.

Choosing one of the two requires developers to think about where they want to be in a few years. Only a subset of .NET APIs will be available to them if they go with VB development. They need to consider the evolution of the .NET platform as a whole, and whether they plan to work on cloud or mobile technologies. C# seems to be the safe bet, though VB will still be around for a while longer.

One other alternative is to stick with both VB and C#, as the .NET Framework makes it easy to mix them. The transition is as easy as that between VB.NET and VBA. But this could be only a temporary advantage. From what we can surmise from Microsoft's new programming language strategy, C# will prove the better path in the long run, and this will likely bring more changes to the software development industry this year.

Written by: Ratheesh V S

When you purchase a product and realize that it doesn’t deliver what it had promised, who would you blame? The manufacturers or the people who were responsible for making sure the product does what it’s supposed to do?

The point is, every product should be tested before release. That goes double for software. Software testing ensures that the software meets expectations and that the product won't fail to deliver. QA is never complete without testing. It also helps developers identify, locate, and fix bugs related to coding, environment, and configurability. In rare cases, testing also opens doors to adding a few bonus features to the product without hassle.

The QA Ecosystem

If there's one thing that's as important as the development phase in a software development life cycle, it is quality assurance, primarily because it helps stakeholders understand the product's selling points, ROI, etc. The results from the QA team also help marketing determine a sound strategy for the product.

The key here is the number of tests performed on the product. Performing more tests helps identify more bugs, and the more bugs eliminated, the higher the product quality. However, in many development companies, the testing period is cut short to meet delivery and deployment deadlines or to gain a competitive edge in the market. This might do more harm than good, but it's also a fact that it's practically impossible to test software end-to-end, along with regression testing, in a short time.

In short, manually performing all those tests takes a lot of time unless the company can afford a large number of testers. And the budget increases with the number of testers, which is not an option for many development companies.

This is where testing automation comes in.

Testing Automation

You can automate every test on the product, from sanity to regression and performance. Automated testing takes far less time, effort, and resources, and with human error out of the picture, accuracy is pretty much guaranteed. These are the main reasons companies started adopting testing automation policies to deliver high-quality products on time.

This practice led to one baffling question…

Will manual QA/Testing get completely replaced by automated testing?

The evidence doesn't point to a solid answer. Automated testing will replace manual testing to a certain extent, but not completely, as manual testing remains the best approach in certain situations:

  • If the product is unstable or riddled with issues that might hinder automated testing in some way
  • When the automation framework and its scripts aren’t stable
  • If the product is at an early stage of development, and might likely get design changes
  • When the budget doesn’t cover automation, and experienced testers (or ones who are familiar with the type of product) are available

Mind you, automation isn't easy to implement, and takes considerable effort. But that's only an initial hurdle. Once automation is in place, you will see rapid results and many additional benefits, which brings us to the advantages of automating QA.

Advantages of Automation in QA


  • Role in regression testing – There are instances where testers need to run the same test, or test the same thing, over and over again. This takes up too much time that testers could have spent testing other areas. Automation makes a significant difference here, saving a lot of time.
  • Stress/Load/Performance testing – Manually performing these tests would compromise accuracy. Automation is a hero here as well.
  • CLI and GUI Testing – Automation can overcome the impractical limitations associated with testing various aspects of CLI and GUI.
  • Reusability – It is possible to ‘reuse’ previously conducted tests on different versions of the software without having to write scripts every time.
  • Reliability – Automation scripts perform tests systematically based on pre-set standards. They perform the same operations every run, eliminating human error.
  • Comprehensiveness – Every feature of the product can be tested after building feature-specific test suites.
  • Quick and scalable – Automation can run tests considerably faster. Large data volume is of no concern.
  • 24×7 testing – 24×7 testing can be quite tedious for humans. Sometimes the product needs to be tested continuously for more than 24 hours to verify its functionalities and figure out inconsistencies. Automation is ideal in such a situation, and enables testers to monitor the results on the go.
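Several of these advantages (repeatability, reusability, reliability) can be seen in even a tiny script. Below is a minimal sketch; the `discounted_price` function and its pricing rules are invented purely for illustration, not taken from any product discussed here.

```python
# Minimal sketch of an automated regression check.
# The discounted_price function and its rules are hypothetical.

def discounted_price(price: float, customer_years: int) -> float:
    """Apply a loyalty discount: 5% per full year, capped at 25%."""
    discount = min(0.05 * customer_years, 0.25)
    return round(price * (1 - discount), 2)

# The same case table is rerun unchanged on every build (regression)
# and can be extended without rewriting the test logic (reusability).
REGRESSION_CASES = [
    # (price, years, expected)
    (100.0, 0, 100.0),
    (100.0, 1, 95.0),
    (100.0, 3, 85.0),
    (100.0, 10, 75.0),  # discount capped at 25%
]

def run_regression() -> int:
    """Run every case; return the number of failures."""
    failures = 0
    for price, years, expected in REGRESSION_CASES:
        actual = discounted_price(price, years)
        if actual != expected:
            failures += 1
            print(f"FAIL: discounted_price({price}, {years}) = {actual}, "
                  f"expected {expected}")
    print(f"{len(REGRESSION_CASES) - failures}/{len(REGRESSION_CASES)} checks passed")
    return failures

run_regression()
```

The same case table can be rerun unchanged on every build, and new cases can be appended without touching the test logic.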


Disadvantages of automation in QA


  • It's programmed – Automated tests only check what they are programmed to test. They may not detect certain flaws they weren't programmed to look for, and will still give the tests a 'pass'. Without carefully prepared test cases, automation is not a good idea.
  • Can never replace human intelligence – Test automation is not exactly testing. Testing is an exercise that requires deep domain knowledge, experience, and the ability to think outside the box. If the box is a program, automated testing can only think inside the box: it merely executes a set of pre-defined test cases and compares the results with expected results. Humans, on the other hand, can devise proper test cases when they spot anomalies in the product's behavior.
  • High maintenance – To get the best out of automated testing, the test packs need constant updating. Unless they are kept up-to-date and relevant, the tests will eventually start failing. Hence, automation requires a lot of time and resources for maintenance.



By now, it should be clear that automated testing has both merits and demerits. Still, it can significantly enhance QA efficiency. For the best results, though, automated testing should be implemented alongside a solid manual testing strategy.

Combining both automated and manual testing will yield better results and ensure a great experience for the end user.

Written by: Suraj Jayaram

Developing a technology solution is quite complicated nowadays. For starters, it requires thorough research on technology trends. Once you begin, you will hear the term 'Cloud' a lot, a service that's gradually becoming the norm in the business world. There's a lot to learn about cloud computing, and obviously not everything can be covered in a single article.

This blog will serve as a guide to using the three main categories of cloud computing:

  • Software as a Service (SaaS)
  • Platform as a Service (PaaS)
  • Infrastructure as a Service (IaaS)

So what exactly is cloud computing?

To answer this, you need to know what the cloud is. In a nutshell, the cloud is a sophisticated infrastructure technology whose main constituents include interconnected servers, databases, and computers. Multiple users can use the cloud, but only according to their individual access permissions.

The main traits of cloud technology include:

  • A simple sign-up is all that's required to avail the cloud service, unlike traditional IT services.
  • Accessible across multiple platforms including mobile, laptops and desktops.
  • Billed only for the usage (pay-as-you-go model)
  • Scalable and flexible to meet your demands
  • Multiple users can share resources uninterrupted
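The pay-as-you-go trait above is easiest to see as a small calculation. The sketch below uses invented rates purely for illustration; they are not any provider's real pricing.

```python
# Sketch of the pay-as-you-go model: you are billed only for what you
# actually consume, not a fixed fee for provisioned capacity.
# Rates here are hypothetical, invented for illustration.
RATE_PER_INSTANCE_HOUR = 0.10  # hypothetical compute rate, in dollars
RATE_PER_GB_MONTH = 0.02       # hypothetical storage rate, in dollars

def monthly_bill(instance_hours: float, gb_stored: float) -> float:
    """Usage-based bill: compute hours consumed plus storage held."""
    return round(instance_hours * RATE_PER_INSTANCE_HOUR
                 + gb_stored * RATE_PER_GB_MONTH, 2)

# One instance running 300 hours plus 50 GB of storage:
print(monthly_bill(300, 50))  # → 31.0
```

The same shape applies to scaling: double the usage, double the bill; consume nothing, pay nothing.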

Let’s get into the details.


Software as a Service (SaaS)

The most popular of all cloud services, SaaS might be familiar to you in many forms, including Google Apps, Netflix, Dropbox, etc.


Main characteristics

  • Accessible through web browsers
  • Generally hosted on remote servers, so users need not be concerned about hardware upgrades, software updates, or patches
  • Integration with third-party applications via APIs
  • Applications managed from a central location

Where it’s suitable

  • If demand for your application rises or falls considerably
  • If your applications are meant to be accessible through web or mobile
  • For short term projects where you will be billed only for your usage
  • For startups that want to launch their websites without hassle



Platform as a Service (PaaS)

Like the name suggests, PaaS or 'Platform as a Service' provides a platform for creating software that is later delivered over the web. PaaS and SaaS share a few common traits, though the former offers a platform for developers to work on. The benefit is that developers can focus on building the software without being concerned about storage, load balancing, the operating system, software updates, etc.


Main characteristics

  • Based on virtualization technology, which means you can scale resources up or down as per your requirements
  • Integrated databases and web services
  • Tools to manage billing and subscription
  • Services that facilitate software development, testing, and deployment
  • Multiple users can work on a single development application

Where it’s suitable

  • If the development process demands speed and flexibility, and if multiple developers and third parties are involved
  • If agile methodology is practiced in software development, PaaS makes it easier for developers to overcome the challenges associated with the methodology
  • Large organizations can use PaaS if they intend to personalize applications
  • If the organization wants to reduce overhead costs by utilizing PaaS’s infrastructure



Infrastructure as a Service (IaaS)

IaaS or 'Infrastructure as a Service', like the name suggests, provides cloud-based infrastructure that includes storage, servers, networking, and operating systems, delivered on demand. This unique service allows organizations to procure only the resources they need (as a service) rather than purchase the whole infrastructure.


Main characteristics

  • Resources can be purchased as a service
  • Dynamically scalable, with pricing based on the infrastructure consumed
  • Multiple users on a single piece of hardware
  • Great control over the infrastructure
  • One of the most flexible cloud computing models

Where it’s suitable

  • If the organization needs complete control over the applications they use
  • For startups that want to go live faster without having to invest time in procuring hardware and software
  • For applications that need to be scaled up based on traffic spikes
  • For organizations that are uncertain about an application and how its evolution will benefit them in the future



Everything mentioned above should have made one thing clear: each of the three cloud computing models offers unique features and functionality. The choice ultimately depends on the requirements of your business. The benefits are obvious, and as mentioned before, cloud will eventually be the norm.

Written by: Prashant Thomas

This is another one of those popular online tech debates, one that started over a year ago when PHP7 launched. PHP didn't take long to become one of the biggest assets of open source development. But PHP5 eventually got a rival in HHVM (HipHop Virtual Machine), a virtual machine developed by Facebook that can execute PHP code far more efficiently than other tools.

The contest took a turn when the latest version of PHP, PHP7, arrived and outclassed its predecessor in many ways.

And so the showdown began. To come to a conclusion, we need to see what the contenders brought to the table.


HHVM

Developed by tech giant Facebook in 2010, and released in 2011, HHVM's main purpose was to reduce server resource usage and thereby deliver a better experience to Facebook users. It does just that by converting PHP code into machine code quickly and efficiently. However, there are merits and demerits we need to consider.


Pros

  • Just in Time (JIT) compiler – the main reason HHVM surpassed PHP5 in performance
  • Any developer can install and work with it
  • Doesn't require much memory to execute individual requests
  • Hack – an internally developed programming language that lets programmers use static as well as dynamic typing, and considerably improves compilation speed
  • Reliable community support for developers to keep improving the tool
  • Websites can run faster in most situations due to dynamic translation


Cons

  • Custom extensions must be converted before they can be added
  • Does not support all WordPress plugins and themes
  • Designed for high-end machines, and will not work on 32-bit operating systems
  • Requires a lot of memory overall – the machine running it needs at least 2 GB of RAM
  • If Facebook ceases HHVM support, the large development community could soon disappear



PHP7

PHP6's development took too long and was riddled with issues, giving it a bad reputation even before release. PHP6 eventually ended up in the trash and was replaced by PHP7, which delivered a massive performance boost over PHP5.


Pros

  • Many new language features, including new operators, return type hinting, uniform variable syntax, etc.
  • Streamlined internal data handling makes it almost 4 times more memory efficient than PHP5.6
  • Established communities that offer technical support and troubleshooting assistance
  • Quicker response to requests, with double the request handling capacity of PHP5.6


Cons

  • Lacks support for many PHP4 features
  • HHVM is better on certain performance metrics
  • Does not support some application plugins and themes
  • Not recommended for high-traffic websites

Now, let’s get ready to rumble.

You can find performance comparisons of the two all over the internet, but they probably won't get you to a conclusion. Both contenders definitely outperform the older PHP versions, and in certain instances one is better than the other. Methodology should be the critical factor when you assess benchmarks.
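To make the methodology point concrete, here is a minimal sketch of how a requests-per-second figure can be measured. It is written in Python purely for illustration; the handler, port, and one-second window are assumptions of this sketch, not the setup used by the published benchmarks, which ran real PHP stacks.

```python
# Minimal requests-per-second measurement: a throwaway local HTTP
# server plus a timed request loop. Everything here is illustrative.
import http.server
import threading
import time
import urllib.request

class OkHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, fmt, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), OkHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

deadline = time.monotonic() + 1.0  # measure for one second
count = 0
while time.monotonic() < deadline:
    with urllib.request.urlopen(url) as resp:
        resp.read()
    count += 1
server.shutdown()
print(f"~{count} requests/second")
```

Even this tiny harness has several knobs (payload size, concurrency, measurement window), and real benchmarks differ on exactly such choices, which is why their numbers diverge.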

The way they execute PHP code is fundamentally different. PHP7 uses the PHP interpreter to translate and execute PHP code, while HHVM converts PHP code to HipHop bytecode, which is then optimized into machine code and executed.

Benchmark tests by Kinsta in WordPress showed HHVM processing and executing 624 requests per second while PHP7 managed to execute only 604 requests. However, PHP7 had a more distinct advantage in Drupal 8, evident from the benchmark test. PHP7 managed 37% more server requests per second than HHVM. [Source: Kinsta]

Tests on various CMSs including Magento 2 community edition, Pyro CMS etc. indicated that HHVM is a better performer than PHP7. PHP7 was faster in Laravel 5.1 though.

When it comes to overall memory usage, PHP7 takes the prize; HHVM demands comparatively more memory, in turn constraining server selection. Nevertheless, HHVM-run WordPress websites still consumed fewer CPU resources. However, PHP7 and HHVM showed evenly matched performance in site load time tests.

Conclusion and Verdict

If you have used PHP5, you will easily notice the performance improvements in PHP7. However, your choice will ultimately depend on your requirements and your capacity. For high traffic websites running on multi-core servers, HHVM is the right choice. If you are just looking for something significantly better than PHP5, PHP7 is the way to go.

Software development is at its best when it helps a lot of people, or a huge community. By that measure, HHVM is the more closed alternative; it isn't shared openly enough for the community to use it the way they want. PHP, on the other hand, is constantly improved for everyone to use, making it the more generous alternative. Experts claim that the support PHP is getting now, and its steady stream of incremental improvements, should eventually help PHP7 emerge the victor of this bout in terms of performance.

Verdict: HHVM wins for now. But a future rematch will most likely have different results.

Written by: Ajeesh Azhakesan

DevOps gained a lot of traction last year, with many companies deciding to adopt the increasingly popular software development methodology.

So what is it exactly?

As mentioned before, it's a software development methodology that, as its name suggests, combines Development and Operations. What it does is integrate every software development operation, from development through deployment and delivery.

The principle behind it is a close and effective collaboration between all the stakeholders who are part of the software development process. It’s just that the emphasis is on combining development and IT operations, which brings us to our subject.

DevOps is all about continuous development, continuous integration, and continuous delivery. This often gives people the wrong notion that a DevOps culture reduces the need for testers. But that's not the case. DevOps assures that the delivered product meets customer requirements and is of the best quality, which means software testing will contribute a lot more.

DevOps without Integrated Testing

Organizations adopted DevOps to achieve the smooth and seamless operation of a continuous delivery model. A DevOps environment that doesn't count on what QA can do will ultimately fail to achieve that 'smooth and seamless' operation. If developers only run unit tests to check functions, and then pass responsibility over to QA, it creates a rift between development and operations, which in turn hinders progress toward continuous delivery goals.

Role of Testers in a DevOps Culture

A successful DevOps environment will have testers involved in every phase of the development process. The QA and development operations should be closely knit, i.e. they should operate in parallel. Testers should work with developers during coding itself instead of waiting for the developers to provide a coded product. The developers, on the other hand, will have to share early testing responsibilities with the tester while the QA team determines and acquires the most effective tools and technologies to ensure that early testing is performed with the least disruption.

This approach pays off when the team faces challenges and failures: developers and testers can work together to overcome the challenges and rectify the issues, while ensuring that customer requirements are met.

The testers basically:

  • Use automation to make their job a bit easier
  • Integrate development and QA operations to increase stability in the continuous delivery phase
  • Use automated tools for development infrastructure acquisition and provisioning
  • Discover defects and evaluate customer requirements


Continuous Testing

One of the major benefits of involving testers early is that they can implement test scripts and determine test cases while the coding is being done. Experienced testers can give developers immediate feedback on defects identified during the development phase. Developers, in turn, get an idea of how their implementation choices affect later stages of the delivery pipeline.
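As a sketch of what that early, continuous feedback loop can look like (the `slugify` helper and its contract are hypothetical, invented for illustration), a tester can commit tests the moment the function's signature is agreed, and every build then reruns them:

```python
# Hypothetical example of continuous testing: the slugify helper is
# still being written, but its contract is agreed, so the tests below
# run on every commit and give the developer immediate feedback.
import re
import unittest

def slugify(title: str) -> str:
    """Turn an article title into a URL slug (the feature under test)."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

class TestSlugify(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("DevOps and QA"), "devops-and-qa")

    def test_punctuation_is_dropped(self):
        self.assertEqual(slugify("Testing: Why It Matters!"),
                         "testing-why-it-matters")

if __name__ == "__main__":
    unittest.main(exit=False)
```

If the developer's next commit breaks either case, the failure surfaces in minutes rather than at a hand-off to QA weeks later.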

So, to conclude, testing should be ubiquitous: present and integrated in all the central processes of a DevOps environment.

Written by: Suraj Jayaram