Friday, 28 January 2011


Before Java was invented, one of the key industry trends was increasing the productivity of both developers and end users. For example, fourth-generation programming languages (4GLs) such as PowerBuilder, Progress, and Uniface gave professional developers faster ways to build business applications than COBOL, Pascal, C, or C++. For end users, tools such as dBase, Lotus Notes, and VisiCalc offered the unprecedented ability to create mini-apps without the need for professional developers. In the early '90s, this productivity trend was thrown into a tizzy by the Internet. Suddenly, software vendors and enterprise application developers had to rush to write a whole new generation of applications for the Web or risk becoming irrelevant. The Internet forced developer productivity and 4GLs to take a back seat.

Java Was In The Right Place At The Right Time For Web Applications

Java was designed in the early 1990s as an easier, more portable alternative to C++ for developing embedded systems. The World Wide Web, invented in 1989 and popularized by the Mosaic browser in 1993, set off a meteoric shift in IT application development. Sun Microsystems moved quickly to take advantage, selling "network" servers like hotcakes and offering Java as the platform for Web development. Most other software vendors were caught off guard, and Java became the de facto standard for enterprise Web application development.

Fast-Forward 20 Years
