
Web Services Architecture – When to Use SOAP vs REST

SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) are popular with developers working on system integration projects. Software architects design the application from various perspectives and decide, based on various factors, which approach to take to expose a new API to third-party applications. As a software architect, it is good practice to involve your development team lead during the system architecture process.
This article, based on my experience, will discuss when to use SOAP or REST web services to expose your API to third-party clients.

Web Services Demystified

Web services are part of Service-Oriented Architecture (SOA), where they are used as the model for process decomposition and assembly. I have been involved in discussions where there was some confusion between web services and web APIs.
The W3C defines a Web Service generally as:

 

A software system designed to support interoperable machine-to-machine interaction over a network.

 

Web API also known as Server-Side Web API is a programmatic interface to a defined request-response message system, typically expressed in JSON or XML, which is exposed via the web – most commonly by means of an HTTP-based web server. (extracted from Wikipedia)

Based on the above definitions, one might infer when SOAP should be used instead of REST and vice versa, but it is not as simple as it looks. We can agree that web services are not the same as web APIs. Accessing an image over the web is not calling a web service but retrieving a web resource using its Uniform Resource Identifier (URI). HTML has a well-defined, standard approach to serving resources to clients and does not require a web service in order to fulfil such requests.

 

Why Use REST over SOAP

Developers are passionate people. Let's briefly analyze some of the reasons they mention when considering REST over SOAP:

 

REST is easier than SOAP

I'm not sure what developers refer to when they argue that REST is easier than SOAP. Based on my experience, and depending on the requirements, developing REST services can quickly become just as complex as any other SOA project. What is your service abstracting from the client? What level of security is required? Is your service a long-running asynchronous process? These and many other requirements will increase the level of complexity.

Testability is another claim: apparently it is easier to test RESTful web services than their SOAP counterparts. This is only partially true. For simple REST services, developers only have to point their browser at the service endpoint and a result is returned in the response. But what happens once you need to add HTTP headers, pass tokens, validate parameters and so on? This is still testable, but chances are you will require a browser plugin in order to test those features. If a plugin is required, then the ease of testing is much the same as using SoapUI to test SOAP-based services.
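
To illustrate the testing point, here is a minimal sketch of the kind of call a plain browser cannot easily make on its own; the endpoint URL and the X-Auth-Token header are invented purely for illustration.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class RestSmokeTest {

    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint and token header, for illustration only.
        URL url = new URL("http://localhost:8080/api/orders/42");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Accept", "application/xml");
        conn.setRequestProperty("X-Auth-Token", "dummy-token");

        // Anything other than 200 means this little test has failed.
        System.out.println("HTTP status: " + conn.getResponseCode());

        BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
        reader.close();
        conn.disconnect();
    }
}

This is roughly what a browser plugin or SoapUI does for you under the hood.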

 

RESTful Web Services serve JSON, which is faster to parse than XML

This so-called "benefit" relates to consuming web services in a browser. RESTful web services can serve XML, or any other MIME type that you desire. This article is not focused on discussing JSON vs XML, and I will not write a separate article on the topic. JSON relates to JavaScript, and because JavaScript is so close to the web, providing interaction alongside HTML and CSS, most developers automatically assume that it is also tied to interacting with RESTful web services. If you didn't know before, I'm sure you can guess that RESTful web services are language agnostic.
Regarding the speed of processing XML markup as opposed to JSON, a performance test conducted by David Lee, Lead Engineer at MarkLogic Inc, found this largely to be a myth.

 

REST is built for the Web

Well, this is true according to Roy Fielding's dissertation; after all, he is credited with the creation of the REST architectural style. REST, unlike SOAP, uses the web's underlying technology for transport and communication between clients and servers. The architectural style is optimized for the modern web architecture. The web has outgrown its initial requirements, and this can be seen through HTML5 and the standardization of WebSockets. The web has become a platform in its own right, maybe a WebOS. Some applications, from financial applications to e-commerce, will still require server-side state.

 

Caching

When REST is used over HTTP, it can utilize the features available in HTTP such as caching, security in terms of TLS, and authentication. Architects know that dynamic resources should not be cached. Let's discuss this with an example: we have a RESTful web service that serves stock quotes when provided with a stock ticker. Stock quotes change every millisecond; if we make a request for BARC (Barclays Bank), there is a good chance that the quote we received a minute ago will be different two minutes later. This shows that we cannot always use the caching features implemented in the protocol. HTTP caching can be useful for client requests of static content, but if the caching features of HTTP are not enough for your requirements, then you should also evaluate SOAP, as you will be building your own cache either way rather than relying on the protocol.
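
To make the point concrete, here is a hedged JAX-RS sketch of how such a quote service might explicitly opt out of HTTP caching; the resource path and the hard-coded quote are invented for the example.

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.CacheControl;
import javax.ws.rs.core.Response;

@Path("/quotes")
public class StockQuoteResource {

    @GET
    @Path("{ticker}")
    public Response getQuote(@PathParam("ticker") String ticker) {
        // Hypothetical lookup; a real service would call a market data feed.
        String quote = ticker + ":240.15";

        // Tell clients and intermediaries not to cache this dynamic response.
        CacheControl cc = new CacheControl();
        cc.setNoCache(true);
        cc.setNoStore(true);

        return Response.ok(quote).cacheControl(cc).build();
    }
}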

 

HTTP Verb Binding

HTTP verb binding is supposedly a feature worth discussing when comparing REST vs SOAP. Most public-facing APIs referred to as RESTful are really only REST-like and do not implement all the HTTP verbs in the manner they are supposed to. For example, when creating new resources, most developers use POST instead of PUT. Even deleting resources is often done through a POST request instead of DELETE.
SOAP also defines a binding to the HTTP protocol; when binding to HTTP, all SOAP requests are sent as POST requests.
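
As a reminder of what full verb binding looks like, here is a minimal JAX-RS sketch that maps CRUD operations onto the intended verbs; the in-memory map is just a placeholder for a real repository.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.PUT;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;

@Path("/customers")
public class CustomerResource {

    // Placeholder in-memory store standing in for a real repository.
    private static final Map<String, String> STORE = new ConcurrentHashMap<String, String>();

    @GET
    @Path("{id}")
    public String read(@PathParam("id") String id) {
        return STORE.get(id);                                 // fetch an existing resource
    }

    @POST
    public void create(String body) {
        STORE.put(String.valueOf(STORE.size() + 1), body);    // create where the server picks the URI
    }

    @PUT
    @Path("{id}")
    public void update(@PathParam("id") String id, String body) {
        STORE.put(id, body);                                  // create or replace at a known URI
    }

    @DELETE
    @Path("{id}")
    public void delete(@PathParam("id") String id) {
        STORE.remove(id);                                     // remove the resource
    }
}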

 

Security

Security is never mentioned when discussing the benefits of REST over SOAP. Two simple security mechanisms are provided at the HTTP protocol layer: basic authentication and communication encryption through TLS. SOAP security is well standardized through WS-Security. HTTP itself is not secure, as seen in the news all the time, so web services relying on the protocol need to implement their own rigorous security. Security goes beyond simple authentication and confidentiality; it also includes authorization and integrity. When it comes to ease of implementing all of this, I believe that SOAP is at the forefront.

 

Conclusion

This was meant to be a short blog post, but it seems we got too passionate about the subject.
I accept that there are many other factors to consider when choosing between SOAP and REST, but I will oversimplify it here. For machine-to-machine communications, such as business processing with BPEL, or where transaction security and integrity are required, I suggest using SOAP. SOAP binding to HTTP is possible, and XML parsing is not noticeably slower than JSON in the browser. For building a public-facing API, REST is not the undisputed champion; consider the actual application requirements and evaluate the benefits. The argument that REST is protocol agnostic and works on anything that has a URI is beside the point: according to its creator, REST was conceived for the evolution of the web. Most so-called RESTful web services available on the internet are really only REST-like, as they do not follow the principles of the architectural style. One good thing about working with REST is that applications do not need a service contract a la SOAP (WSDL). WADL was never standardized, and I do not believe developers would implement it; I remember looking for Twitter's WADL in order to integrate with it.
I will leave you to draw your own conclusions. There is only so much I can write in one blog post. Feel free to leave any comments to keep the discussion going.


Primefaces Mobile – Weather App Example

As a Java developer, I usually get requests to build mobile apps. I like building Java applications: enterprise, web and mobile. The latest projects that I have been involved with make heavy use of JSF, and Primefaces in particular.

I am quite comfortable with JSF, and therefore I decided to build a mobile application using JSF and Primefaces Mobile. Primefaces Mobile wraps jQuery as JSF components so that you do not have to write any JavaScript. This approach has huge benefits: jQuery is a well-tested framework used by large companies such as Google and Microsoft.

I will make this post quite brief. I wanted to know how easy it would be to recreate the demo from the Primefaces Mobile labs page.

Here is a screenshot of the final application running in Firefox 11 (Windows 7 64 Bit).

This was a simple example taken directly from the website, so I was expecting it to be a short exercise to build it and make it run. Well, not so fast.

I created a Maven based Java EE 6 project using Netbeans IDE 7.1.1. After browsing Google for a bit, I spent a few minutes getting the right repository and dependency in place.

Once the dependencies were in place, I had to create the beans required for the actual JSF page to work. I found the Primefaces Mobile backing beans on Google Code, so now I had everything set up and running. At first glance, the application seemed to be running fine. Then I tried the application on my iPad and the Android emulator, and nothing was working: the user interface was displayed, but the "get forecast" button was not making any Ajax calls.

So I started to debug the application whenever I had some spare time. I also noticed that, while running in a desktop browser, the application would make one Ajax call and update the screen with the values (see screenshot above), but if you wanted to make another call to find out the temperature of, let's say, London, then nothing would actually happen unless you refreshed the page and tried again. OK, so it's not working as expected, but the example on the Primefaces Mobile labs site worked fine on my iPad, the emulator and desktop browsers (IE 9 excluded).

So I ran the application under the NetBeans debugger and looked at it through Firebug. The first call goes through and stops at the breakpoint, but subsequent calls do not even reach the managed bean. Firebug shows that the other calls are being retrieved from the cache. I manually set all the HTTP headers so that no content is cached, but the result is still the same: NOT WORKING!
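
For anyone fighting the same symptom, this is roughly what I mean by setting the headers manually: a servlet filter, sketched from memory, that stamps every response with no-cache directives (map it to /* in web.xml).

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class NoCacheFilter implements Filter {

    public void init(FilterConfig filterConfig) throws ServletException {
    }

    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {
        HttpServletResponse resp = (HttpServletResponse) response;
        // Ask the browser (and any proxy) not to cache the Ajax responses.
        resp.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
        resp.setHeader("Pragma", "no-cache");
        resp.setDateHeader("Expires", 0);
        chain.doFilter(request, response);
    }

    public void destroy() {
    }
}

In my case the result was still the same, which is what makes this one so puzzling.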

I have uploaded the code to GitHub; click on the link to download it.

In conclusion, it's not plug-and-play to make the examples from the Primefaces labs site work. The documentation for the examples is quite poor, and I hope that the good folks from Primefaces can take a look at my code and tell us what I am doing wrong.

Apart from that, Primefaces is a good JSF framework that I use on a daily basis on multiple projects, so I cannot really put them down, but I do wish the mobile examples worked and explained what is needed to make them work.

Please share your experience in the comments below or advise on how to fix it. Feel free to download the code from GitHub and have a look too.

Happy coding 🙂


Happy New Year Techies!!!

I just wanted to say happy new year to the community; without you guys, we would still be living in the dark ages. Here are a few things that I am looking forward to in the new year:

  1. The Oracle-Sun merger: the untold future of NetBeans, MySQL, Swing and every open source project that Sun has been working on. What will happen to the Sun Open Source (SOS!!!) movement?
  2. Looking forward to the JavaFX 1.3 release, the Authoring tool and an improved Composer plugin.
  3. Android's uprising against the iPhone (not just because I can easily write Android-based applications), hopefully with a visual XML builder for Android UIs. JavaFX running on Android, anybody????
  4. Looking forward to seeing real-world JavaFX applications and possibly a showcase linked to the JavaFX.com site.
  5. Java EE 6 support in Google App Engine.
  6. Google Wave opening to the public, and how well it does against Twitter and Facebook.
  7. The Java Store opening to Europe (well, this is where I am based and I want to sell applications, not just provide them for free).
  8. My new video blog (well, that's me trying to be more Armel 2.0 – the sequel) to better engage with the community.
  9. Looking forward to networking with fellow developers and techies.
  10. Looking forward to the new buzzwords (old technology, new name).

Well, 2009 was a good year for Java and the tech scene. I hope that you all enjoyed the year as much as I did. Why not share your wish list with me (comment box)? Subscribe to follow my blog as I will try to bring more interesting articles in the new year.

Happy 2010! Wishing you all success, prosperity, fame (yup) and fortune. Don't forget, if you need a server-side, UI, Android or JavaFX developer, just mail me.


Develop High-Transaction Web Applications with Java, MySQL & NetBeans

This entry is a brief tutorial on how to develop a high-transaction web application while maintaining data integrity. For the purpose of keeping this entry simple, I will be using some RAD tools, namely the NetBeans IDE, to generate most of the code.

In order to follow the tutorial, you will need the following:

  1. NetBeans IDE 6.7+
  2. Java JDK 1.6+ (my version is 1.6.0_17)
  3. MySQL 5.1+
  4. MySQL Sakila database
  5. Apache Tomcat 6.x
  6. An understanding of JPA transactions and ReSTful web services

This is my definition of High Transaction Applications:

A high-transaction application is one that can serve multiple simultaneous requests from clients and keep those requests isolated from each other. The application has only two purposes: read or write data from/to a repository, either JMS or a database. The transactions have to meet the ACID criteria in order to be deployed in the real world.

OK, the above is my definition and you are free to redefine it. I am going to build an n-tier web application:

  • Database back-end (MySQL)
  • Entity classes
  • ReSTful Web Services to allow other developers to integrate the application with theirs
  • A web based front-end

For the simplicity of the article, I will not implement any security such as user or application level security (authentication and database table privileges). This tutorial is mostly geared toward newbies, but I am sure that more advanced developers will benefit too.

Let’s get coding.

  1. Make sure that you have loaded the MySQL Sakila database into your MySQL database. You can download the Sakila database from the NetBeans plugin centre (see here).
  2. Create a new web application and name it whatever you like. I have named mine "WebApplication" and will refer to it as WebApplication from now on. Make sure to choose Tomcat as your deployment server.
  3. Add the MySQL driver "mysql-connector-java-5.0.7-bin" to the WebApplication libraries. NetBeans will work and connect to the DB even without the driver, but once you deploy your application to a server, the application will not be able to connect to the DB and will throw a ClassNotFoundException for com.mysql.jdbc.Driver.
  4. We are going to develop the back-end first. NetBeans makes it very easy for us to create entity classes from the database. I would still recommend that newbies learn how to manually create entity classes and configure the persistence.xml file. This tutorial makes use of JPA, and for simplicity we will ignore the drawbacks of JPA/ORM frameworks here.
  5. Right click on the project name and choose "Entity Classes from Database…". On the next screen choose the "filmActor" table and click on the Add button. Make sure that the "Include Related Tables" box is checked underneath the Selected Tables panel. The screenshot does not show the "filmActor" table as I had previously generated the entity class, but I am sure you get the idea.

  6. If you are required to create a Persistence Unit, click on the persistence button -> accept the default name -> choose your persistence library -> choose your database connection -> choose "None" for the table generation strategy -> click Finish.
  7. After choosing the table to generate the entity from, click Next -> fill in the package name -> tick "Generate Named Query…" -> click Finish.
  8. NetBeans generates all the entity classes based on the database tables that you have chosen. The next thing that we want to do is generate a set of ReSTful web services from the generated entity classes. Again, NetBeans facilitates the work for us (it is important that you also know how to create the classes manually, or you will not know how to debug them if there is any problem in the future).

  9. Right click on the project name “WebApplication” -> RESTful Web Services from Entity Classes… -> Choose the Entity Classes that you would like to generate the WS for and click add or add all -> click next -> on the following screen, accept the default values and click Finish

  10. You can go ahead and test your ReSTFul Web Services by right clicking on the name of the application and click on Test RESTful Web Services

  11. The previous step will launch your web browser within which you can test your web services (click on the node on the left and see the queries on the right)

  12. Back in the NetBeans IDE, right click on the project name "WebApplication" -> click on "JSF Pages from Entity Classes…" (the JSF pages will not use the web services as they are packaged together with the entity classes; this improves performance and still allows external applications to integrate). Choose the entity classes that you would like to generate the pages for -> click Next. On the final screen, fill in the package names for the JPA controllers and JSF classes -> click Finish.

    NetBeans will generate the necessary files to create a CRUD application with a user interface. I suggest that you familiarize yourself with the generated code.

  13. Expand the Configuration Files folder under your project name "WebApplication" and open the web.xml file. At around line 38, change the content of the welcome-file-list to look as follows:
    <welcome-file-list>
            <welcome-file>faces/welcomeJSF.jsp</welcome-file>
    </welcome-file-list>

    This will make the generated JSF page the landing page for the application. Make sure that you do have a "welcomeJSF.jsp" file before making the change.

  14. Right click on the application name -> Run. The application should load in your web browser. Now go on, play around with the application. And why not create a client to send requests to the web services? (Not today.)

You can load test your application using Apache JMeter; it is easy to run and configure. If you want to see how JPA implements the ACID features, browse to your controller classes. Here is a short introduction to JPA transactions.
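
For reference, the ACID-relevant part of the generated JPA controllers boils down to a pattern like the one below; the Actor entity and the "WebApplicationPU" persistence unit name are assumptions for the sketch.

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.EntityTransaction;
import javax.persistence.Persistence;

public class ActorJpaController {

    private final EntityManagerFactory emf =
            Persistence.createEntityManagerFactory("WebApplicationPU");

    public void create(Object actor) {
        EntityManager em = emf.createEntityManager();
        EntityTransaction tx = em.getTransaction();
        try {
            tx.begin();            // start the unit of work
            em.persist(actor);     // queue the insert
            tx.commit();           // all or nothing: atomicity and durability
        } catch (RuntimeException e) {
            if (tx.isActive()) {
                tx.rollback();     // undo on failure to keep the data consistent
            }
            throw e;
        } finally {
            em.close();
        }
    }
}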

I hope you enjoyed it, and if you need any clarification, just leave me a comment and I will try to get back to you ASAP (if time permits).

Develop Your Own Google with Apache Lucene (Java Nutch Solr)

Apache Lucene is an open source API that allows a Java developer (.NET libraries are available) to write indexing and full-text search capable applications. I have been writing applications based on Lucene for the last 3 years, and some of them have been deployed at large corporations. I know there are other libraries available to developers who wish to write an indexing engine, but this blog will focus solely on Apache Lucene; I will not compare it to other APIs.

Lucene is a very mature API and can be found in the NetBeans IDE, Liferay and JackRabbit, among others. IBM has written a very good document about the Lucene architecture, therefore I will not delve into it here.

Lucene alone is pretty much useless, as is any other API on its own. Let's now introduce Nutch. Nutch is a web crawler built on top of Lucene to provide crawling capabilities. Nutch was designed to handle large amounts of data from the internet (HTTP). Due to its plugin architecture, it was later extended to provide local network crawling over FTP, databases and Microsoft Windows shares (I am the author of the protocol-smb plugin and co-author of the index-extra plugin found on the Nutch site). We had extended Nutch and turned it into an enterprise search application, but most of the source code was locked behind closed doors (company politics). Anyway, Nutch has evolved a lot but is still very complex in its inner workings. The initial Nutch was developed to process data in batches, but there are ways to make it near real-time; that's for another day. OK, so Nutch is good for crawling and indexing data, but it does not handle search directly. There is a web application that ships with Nutch, but it is quite poor, so let's now introduce Solr.

Solr is a powerful web-based search server built on top of Lucene. The application was developed by CNET Networks and donated to the Apache Foundation. I believe (not too sure, so I might need some references here) Solr was powering the search feature on their site, but it is definitely used internally by the company. In late 2009, Lucid Imagination received $7.5 million in funding to provide commercial services built around Solr (and possibly Lucene?). Here is a very good presentation about Solr. Solr is a very good indexing engine; the keyword here is "indexing engine". It does not have any support for crawling data, so it requires the developer to create applications that feed it the data to index. I do believe that this is a good feature, as it gives the ability to integrate with various systems as long as they can post data over HTTP.
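
To illustrate "posting data over HTTP", here is a rough sketch of feeding a single document to Solr's XML update handler; the Solr URL and the field names are assumptions based on a default example schema.

import java.io.OutputStreamWriter;
import java.net.HttpURLConnection;
import java.net.URL;

public class SolrPoster {

    public static void main(String[] args) throws Exception {
        // Assumes a default Solr instance listening on localhost.
        URL url = new URL("http://localhost:8983/solr/update?commit=true");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");

        // One document in Solr's XML update format.
        String doc = "<add><doc>"
                + "<field name=\"id\">doc-1</field>"
                + "<field name=\"title\">Hello Solr</field>"
                + "</doc></add>";

        OutputStreamWriter writer = new OutputStreamWriter(conn.getOutputStream(), "UTF-8");
        writer.write(doc);
        writer.close();

        System.out.println("Solr responded with HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}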

Nutch is a good crawler, but it does not provide an enterprise-grade search interface to its data. Solr, on the other hand, is a powerful indexer and has an enterprise-grade search interface, but it does not know how to gather data on its own. I am sure by now it has become obvious how we can integrate the two.

We want Nutch to gather the data, bypass its indexing cycle and feed the data directly to Solr. Lucid Imagination has a good tutorial about it here.

After reading the tutorial from Lucid Imagination, you will notice that Nutch is run by executing some bash scripts. This is something I strongly disagree with: if Nutch is based on Java (an OS-independent language), why do we need to execute UNIX/Linux shell scripts? Also, the fact that we need to install Cygwin on the MS Windows platform to be able to run it is a big negative for me. I wrote a simple Java application that launches Nutch and sends the indexing to Solr, but as you can see in the source code, you still need a UNIX-like environment to run it successfully. You could write a platform-independent version by looking up the Nutch API and calling its methods directly.

Well, I hope that this entry helps you understand how to use Nutch and Solr, built on top of Apache Lucene. If you need any clarification, leave a comment and I will try to get back to you ASAP if time permits. Here is the simple launcher application mentioned above:

package com.etapix.nutchsolr;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileFilter;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.logging.Level;
import java.util.logging.Logger;

/**
 *
 * @author Armel Nene
 */
public class Indexer {

    public static void main(String args[]) {
        if (args.length < 3) {
            System.out.println("Usage:" +
                    "\ncrawlName        -   This will be used to store crawler files in the crawlName directory" +
                    "\nurlFolder        -   The path to the folder containing the URLs to crawl" +
                    "\nsolrUrl          -   The URL of the Solr server");
            return;
        }
        String crawlerName = args[0];
        String urlFolder = args[1];
        String solrUrl = args[2];
        String inject = "bash bin/nutch2.sh inject " + crawlerName + "/crawldb " + urlFolder;
        String generate = "bash bin/nutch2.sh generate " + crawlerName + "/crawldb " + crawlerName + "/segments -topN 10 -numFetchers 5";
        String export = crawlerName + "/segments/";

        String invertLinks = "bash bin/nutch2.sh invertlinks " + crawlerName + "/linkdb -dir " + crawlerName + "/segments";
        String indexSolr = "bash bin/nutch2.sh solrindex " + solrUrl + " " + crawlerName + "/crawldb " + crawlerName + "/linkdb " + crawlerName + "/segments/*";
        try {
            System.out.println("Injecting URLs in crawldb");
//            int state = 0;
            InputStream in = Runtime.getRuntime().exec(inject).getInputStream();
            System.out.println(convertStreamToString(in));


//            state = Runtime.getRuntime().exec(inject).waitFor();
//            System.out.println("process completed: " + state);

            for (int i = 0; i < 3; i++) {
                System.out.println("Generating segments");
                in = Runtime.getRuntime().exec(generate).getInputStream();
                System.out.println(convertStreamToString(in));

                System.out.println("Setting environment variable $SEGMENT");
//            String segs = convertStreamToString(Runtime.getRuntime().exec("ls -tr " + crawlerName + "/segments|tail -1").getInputStream());

                // Pick the most recently created segment directory for this pass.
                String segments = export + lastFileModified(export).getName();
                System.out.println("$segments: " + segments);
//            in = Runtime.getRuntime().exec(export + segs).getInputStream();
//            System.out.println(convertStreamToString(in));

                String fetch = "bash bin/nutch2.sh fetch " + segments + " -noParsing";
                String parse = "bash bin/nutch2.sh parse " + segments;
                String update = "bash bin/nutch2.sh updatedb " + crawlerName + "/crawldb " + segments + " -filter -normalize";

                System.out.println("fetch segments");
                in = Runtime.getRuntime().exec(fetch).getInputStream();
                System.out.println(convertStreamToString(in));

                System.out.println("Parse segments");
                in = Runtime.getRuntime().exec(parse).getInputStream();
                System.out.println(convertStreamToString(in));

                System.out.println("Update crawldb");
                in = Runtime.getRuntime().exec(update).getInputStream();
                System.out.println(convertStreamToString(in));
            }
            System.out.println("Inverting links");
            in = Runtime.getRuntime().exec(invertLinks).getInputStream();
            System.out.println(convertStreamToString(in));

            System.out.println("Indexing contents to Solr " + solrUrl);
            in = Runtime.getRuntime().exec(indexSolr).getInputStream();
            System.out.println(convertStreamToString(in));

        } catch (Exception ex) {
            Logger.getLogger(Indexer.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    public static String convertStreamToString(InputStream is) {
        /*
         * To convert the InputStream to String we use the BufferedReader.readLine()
         * method. We iterate until the BufferedReader returns null, which means
         * there's no more data to read. Each line will be appended to a StringBuilder
         * and returned as String.
         */
        BufferedReader reader = new BufferedReader(new InputStreamReader(is));
        StringBuilder sb = new StringBuilder();

        String line = null;
        System.out.println("Now converting inputstream to text");
        try {
            while ((line = reader.readLine()) != null) {
                sb.append(line + "\n");
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                is.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        System.out.println("Finish converting to text");
        return sb.toString();
    }

    public static File lastFileModified(String dir) {
        File fl = new File(dir);
        File[] files = fl.listFiles(new FileFilter() {

            public boolean accept(File file) {
                return file.isDirectory();
            }
        });
        long lastMod = Long.MIN_VALUE;
        File choice = null;
        for (File file : files) {
            if (file.lastModified() > lastMod) {
                choice = file;
                lastMod = file.lastModified();
            }
        }
        return choice;
    }
}


10 Things all Java developers should know

Since Java (I know it's not an acronym, but it stands out like that) was officially introduced in 1995, it has changed the way most of us look at the operating system. Bill Gates (how ironic) once said that it was not the hardware but the software that would be the future. A decade or more later, the fifth employee of Sun, John Gage, said "The Network is the Computer". Fast-forward to the 21st century and John seems to be right. Anyway, Java was built not to depend on an operating system and to be deployed through the network. Java, through its applet technology, gave birth to the Rich Network Application, aka Rich Internet Application (RIA). Java is not perfect, or we would not have various releases and more on the way, but Java has given birth to a wide range of programming languages (just Google it to find out more).

Without further ado, I am going to get back to the subject. This is a brief article on what I believe every Java developer should know regardless of their experience. I do not personally believe that someone with 5 years of experience is necessarily less good than someone with 10 years of experience. We all develop our own ways of working, but as a developer you need to stay abreast of your technology. So, here are my top 10, not in order of importance (or are they?):

  1. Remember the basics of the Java language and the OOP paradigm.
    Most experienced developers seem to forget the theory behind the language. I am not saying that they are not good at their jobs, but can they explain to junior developers why they used interfaces instead of abstract classes, or why they implemented one pattern over another? As a programmer, you can become very arrogant and believe that you write the best code, but in the real world people work in teams with different skill sets and experience. It is important that you can back up your actions and your code. Take a very simple question such as: when should I use a String object instead of a StringBuilder/StringBuffer? You might take this question lightly, but can you actually explain the difference to someone else? (See the short example after this list.)
  2. Know your technology stack
    All developers have to know their technology stack. What does that mean? Java is not like other languages; Java has subsets such as J2ME and supersets such as Java EE. We all have our own areas of expertise, but it is important to know the differences between the various sets of Java. Some basic questions, such as the differences between Swing, applets, servlets, EJBs and JavaFX, will boost your confidence. Most developers do not know how to tweak the JVM, or the differences between the JRE and the SDK environment. Do you know why you need the SDK installed to run Tomcat, but only the JRE to run an application?
  3. Experiment with various Java EE frameworks
    I am not asking you to be an expert in every single Java EE framework, but it will make a difference if you are familiar with Spring and EJB. Those should be the de facto frameworks on every developer's CV. Developers should know the difference between Java EE 5 (soon 6) and Spring. Hibernate is also brilliant and is used for data access, but all developers should have moved to JPA by now. Hibernate also complies with JPA, therefore there are no more excuses.
  4. Know a scripting language
    Java can be heavyweight for some simple tasks which can be implemented using a simple dynamic language such as Python, Perl(?) and others. I would also recommend that developers learn shell scripting on their target OS.
  5. Know how to develop web services
    The network is the computer, therefore it is important to know the different web service frameworks available. Data is integrated through web services, opening your services to the "cloud". Swing developers will probably not develop web services, but I am sure that they will be connecting to data through web service clients. Understanding the difference between the standardised SOAP and the non-standardised ReST will help you choose which is best for implementing your services.
  6. Know how and when to multithread your application
    I have to put that in there. Developers should know when and why to multithread an application, and understand thread inter-communication and monitoring. All developers, junior or not, should know how to write a multi-threaded application.
  7. Database development using JDBC and JPA
    This should be a development law. All developers should know how to write SQL queries and how to create databases. All enterprise applications store data in some sort of relational database system, so it is imperative that this knowledge is second nature. Java EE 5 introduced JPA (JDO was there before), but it is not applicable to every situation. Hence, knowing the difference and when to use one instead of the other is important.
  8. Know a client-side scripting language and what AJAX is
    The network is the computer and the Internet is the deployment platform. Java EE and its various frameworks execute on the server side, which can put extra "load" on the server. If you are looking to move to a cloud-based system where the provider charges you per resource used, it might be wise to move some of the execution to the client side. AJAX has been buzzing around the scene for the last 3 years and more. It is not a technology but a new way of doing something that already existed. There are numerous Java AJAX frameworks, such as GWT and DWR, which make it easy to develop AJAX-based applications without writing the JavaScript by hand. Developers should also know the theory behind AJAX.
  9. Know your competitors and do not take part in "what is the best IDE" discussions
    Java is not the only language that can do what it does. I think that Java is more mature and complete than other languages. Knowing the differences between Java and .NET, or Java and Ruby, is a good asset to have. You also need to know when and why to use one instead of the other. Please, please, please, do not get into "my IDE is better than X because…" discussions, as it is good for the Java community to have multiple IDEs and frameworks available. Every tool has its place; for example, JDeveloper is better than X if you are going to develop solely on an Oracle stack, and so on.
  10. Know ANT (Maven?), Tomcat and any other mainstream application server
    ANT is the de facto build tool for Java and its IDE-based development. Maven is becoming popular and may soon be as popular as ANT (I am not sure of its popularity in the financial sector). Tomcat should be immortalised as the base servlet container that all developers should be familiar with.
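
As promised in point 1, here is the classic illustration of the String vs StringBuilder question: strings are immutable, so concatenating in a loop creates a new object on every iteration, whereas a StringBuilder appends into one buffer.

public class ConcatDemo {

    public static void main(String[] args) {
        // Creates a brand new String object on every iteration.
        String s = "";
        for (int i = 0; i < 10000; i++) {
            s = s + i;
        }

        // Mutates a single internal buffer; far fewer allocations.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 10000; i++) {
            sb.append(i);
        }
        String result = sb.toString();

        System.out.println(s.length() + " == " + result.length());
    }
}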

There is a lot more to add, but these are just some of the basics that I think all developers should have in their repertoire. Feel free to add to this list in the comments box. If I could add one more, it would be: all developers, not just Java developers, should know how to search the web, and Google is your best friend (now support my advertisers by clicking on the links on the right 😉). I hope you enjoyed the entry and feel free to comment; good or bad, all are welcome.


ReST Web Services on Google App Engine using NetBeans 6.7

When Google announced earlier this year the availability of the Java language on their cloud system "App Engine", it was a turning point for Java on the web. The reason I say that comes down to a simple question: how many hosting providers support Java, let alone provide free hosting for that purpose? This was a good move towards the Java community and the vast list of languages that run on the JVM. But App Engine does not support the full range of the Java EE API. If you are looking to build EJBs, SOAP services or anything that accesses the file system, then App Engine is not for you, as it does not support them. To make matters worse, some URL connections are not supported by the Google cloud services. I am looking forward to the day when JBoss Seam will be supported entirely by their services without having to hack the code. All right, it is not that bad; Google App Engine supports a number of Java EE frameworks, such as Spring.

All right people, that was a long introduction, but I believe it was worth it. You cannot create SOAP web services on Google App Engine, so in this short article I will show you how to develop REST-based web services that work on App Engine using NetBeans and the Jersey API. Actually, NetBeans comes with Jersey support out of the box. Now, let's get started.
In order to follow the instructions, you will need the following:

NetBeans 6.7+

Google App Engine SDK

Google App Engine Plugin for NetBeans

I would expect you to be familiar with Java EE development and Google App Engine development. Once you have all the software and components installed, we can start.

1.    Create a database structure to store your data. I used MySQL Workbench to design my DB structure (Google App Engine does not use MySQL and has no plans to support it in the future). This schema is to help you understand the relationships between your entities; you can use any UML tool to design your objects' relationships. There is a reason I chose to design the DB in MySQL: the application generates an SQL script which I will load into a MySQL DB. I will show you the main benefit in the next step.
 

2.    Based on your DB schema, create a database. You can use any database you want; I used MySQL to initially store my entities.

3.    Launch your NetBeans application and create a new Web Application.
 

3.1    Choose Google App Engine as the deployment server; click here to see how to register the App Engine in NetBeans.

4.    Now click on File -> New File. In the popup window, choose Web Services in the Categories panel and "RESTful Web Services from Database", then click Next.
 

5.    Choose the database that you want to generate your entity classes from and click Next. (In the screenshot I am using the sample DB which comes with NetBeans and the Java DB server. This is for illustration purposes only, as I had previously generated my entity classes from the DB schema created in step 1.)

6.    Check your class names and how they relate to the database. Make any changes that you require on this screen, and then click Next.
 

7.    On the next screen, just accept the default values and click Finish.
 

Well, we have done the hard part. There is another step that I missed out, because I had already written the application: when generating entities from the database, if you do not have a persistence unit available, NetBeans will ask to generate one. Here is more information on how to create a persistence unit with NetBeans. Make sure to choose "Create" as the table generation strategy.
 

By now, you should be aware that we have created a back-end application store which we can call using normal HTTP POST, GET, PUT and DELETE. The NetBeans-generated RESTful methods allow you to use XML or JSON to send data to the services. The response MIME type can be anything you like as long as the application server supports it.
WARNING: Jersey XML processing is not supported by Google App Engine because it relies on JAXB, and JAXB accesses APIs which are forbidden by the App Engine stack. If testing the application on App Engine, use the JSON MIME type for your data.
OK, so we have generated the classes and methods required to expose our back-end to other applications. As it is, this will not work on Google App Engine, so we need to make one final change; this time we need to edit the persistence.xml file manually. In the Projects window, under the name of your project, click on Configuration Files -> persistence.xml and open the file in the editor. Once the file is open in the editor, click on the XML tab and make the necessary changes so it looks like the screenshot.
 

Here is the content of my persistence.xml; modify it to reflect your entities, and make sure the name of the provider and the properties are exactly the same as mine.

    <?xml version="1.0" encoding="UTF-8"?>
    <persistence version="1.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd">
        <persistence-unit name="ReSTWebApplicationPU" transaction-type="RESOURCE_LOCAL">
            <provider>org.datanucleus.store.appengine.jpa.DatastorePersistenceProvider</provider>
            <class>com.test.api.Userlogs</class>
            <class>com.test.api.Customer</class>
            <class>com.test.api.User</class>
            <class>com.test.api.Business</class>
            <class>com.test.api.Reviews</class>
            <exclude-unlisted-classes>false</exclude-unlisted-classes>
            <properties>
                <property name="datanucleus.NontransactionalRead" value="true"/>
                <property name="datanucleus.NontransactionalWrite" value="true"/>
                <property name="datanucleus.ConnectionURL" value="appengine"/>
            </properties>
        </persistence-unit>
    </persistence>

Now we are ready to run the application and test it in our local environment. Right click your project name in the Projects window and click "Test RESTful Web Services"; your favourite web browser will open with a page that shows your resources. You can test your services by clicking on the nodes and drilling through them.
 

You can also use the Firefox RESTClient plugin to test your app. Also, Google App Engine does not fully support JPA and might throw an exception about not supporting an "integer type object for primary key". If you do experience this issue, change the type of your key.
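
If you do hit that exception, the usual fix is to give the entity a key type the datastore understands; here is a hedged sketch with an invented Customer entity using a Long key.

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Customer {

    // A datastore-generated Long key; Integer keys are the ones that
    // tend to trigger the exception mentioned above.
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;

    public Long getId() {
        return id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
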
That is it for now; I am looking forward to your feedback, good or bad, it is all welcome. Also, if you need Java EE help, contact me directly, and join me on Facebook. I hope the guide, or tutorial, or whatever you call it, was good enough to help you with something.

Please support my blog and its advertisers by clicking on the interesting products/ services on the right (Google ads).  Cheers.


Android Mobile Application Development – Budget My Day

For the last few weeks, I have been contemplating building a mobile application. I wanted to use a technology that I am familiar with and that is also currently available on the market. So let me describe the factors that shaped this weekend project:

  1. The user interface of the application has to be nice, a la iPhone.
  2. It must provide RAD tools; I do not have time to code in Notepad.
  3. Good documentation and online support (through forums, etc.).
  4. I need to be able to learn and develop a nice UI within hours.
  5. The ability to connect to and display real-time maps such as Google Maps or Yahoo! Maps.

OK, first of all, I was not trying to develop a mobile web page, the type of thing you can do with Yahoo! Blueprint. My aim is to later develop this weekend project into a full-blown mobile application with other content such as video, telephone dialing and messaging features. Based on that, I realized that there are a bunch of technologies out there which can be separated into two categories:

  1. Vendor neutral: J2ME (JavaFX???) and Android
  2. Vendor (or manufacturer) specific: iPhone, Symbian, and the list is very long.

I therefore decided to take a vendor-neutral approach. I have some experience developing J2ME-based applications, and I can tell you that the user interface is not as great as on the iPhone or Android-based phones. So what about JavaFX? Let me know the next time you see a JavaFX-based phone in a shop window or in public.

So I decided to take the Android route. I knew the APIs were based on Java, making the learning curve an easy ride. One thing I don't understand is that Android has been out for a while and still does not provide a supported version of its development plugin for NetBeans or any IDE other than Eclipse. Nevertheless, I decided to use NetBeans to develop the "weekend project". The Android emulator is nice to work with, but sometimes I just had to run the application multiple times in order to see the app in the emulator.

I believe that in order to be a good Android developer, you need to familiarize yourself with building the UI through the XML layouts and to know your widgets from your layouts. Anyway, this is just the start of the project, but next time I blog about it, I will discuss how to connect to web services to perform user authentication and persist data. For now, here is a screenshot of the search tab taken from the emulator.

P.S. The application will be a recommendation engine that will recommend to users how to go out and enjoy themselves on any budget. For example, a person with a £100 budget can see what to do, where and when: go on a date to the cinema, then to a restaurant, before heading out to a night club. Users will be able to review and rate recommendations.

Please support my blog and its advertisers by clicking on the interesting products/ services on the right (Google ads).  Cheers.


London Start-ups TechWorld Event April 09

ETAPIX Global Ltd is sponsoring the London Start-ups TechWorld Event on 9 April 2009.

London Start-ups 2.0 TechWorld Event aims to build a platform where technology and Internet start-ups can gather to exchange knowledge, network and reward innovation through funding partners.

If you are a start-up company based anywhere on this globe and looking for partners, funding or just the right guys to transform your ideas into the real thing, then this event is for you. It is a chance to showcase your ideas, projects and/or company in front of a well-selected panel. There is only one catch: your business or idea has to be tech or Internet related.

Our panellists have several years of investment experience and knowledge of tech start-ups, and can provide support through financing and workshops for your next BIG thing.

The videos will be broadcast live on USTREAM.com and all exhibitors will have their details forwarded to our network of partners for further promotion.

London Startup 2.0 TechWorld Event

Thursday 9th April 2009 at 18:00-22:30

EXPRESS BY HOLIDAY INN
LONDON – LIMEHOUSE see map

469-475 The Highway
London
E1W 3HN

We have booked a conference room in Docklands, London. We only have limited places available, so hurry and get your ticket now @ http://www.urgoing.to/londonstartups
You can also join our hundreds of members on Facebook, LinkedIn and Smarta,
or follow me on Twitter @armelnene for updates.

Ticket : £15.90 (Exhibitors)
Ticket: £10.73 (Networkers and Guests)

TICKETS HAVE TO BE PURCHASED PRIOR TO THE EVENT.

NO TICKETS WILL BE SOLD ON THE DOOR.

Click here to see a list of the panellists including a short biography.

See you there and have your business cards ready.


PHP vs Ruby vs JAVA for web applications, part 1

This is not a thorough comparison of the languages per se. I was recently contracted as a web application consultant for a start-up company. My role was to recommend which of PHP, Ruby or Java would be more suitable for web application development. For contractual reasons, the web application and its requirements will not be discussed.
When comparing PHP and Ruby against Java, their technology evangelists usually make the following point:

PHP and Ruby are dynamic languages, compared to Java's static typing. So our first debate here will be about which one is better: a static or a dynamic language?

Static typing

A programming language is said to use static typing when type checking is performed during compile-time as opposed to run-time. Examples of languages that use static typing include Ada, C, C++, C#, Java, Fortran, ML, Pascal, and Haskell. Static typing is a limited form of program verification (see type safety): accordingly, it allows many errors to be caught early in the development cycle. Program execution may also be made more efficient (i.e. faster or taking reduced memory).

Because they evaluate type information during compilation, and therefore lack type information that is only available at run-time, static type checkers are conservative. They will reject some programs that may be well-behaved at run-time, but that cannot be statically determined to be well-typed. For example, even if an expression <complex test> always evaluates to true at run-time, a program containing the code

if <complex test> then 42 else <type error>

will be rejected as ill-typed, because a static analysis cannot determine that the else branch won’t be taken.

Some statically typed languages have a “loophole” in the programming language specification that enables programmers to write pieces of code that circumvent the default verification performed by a static type checker. For example, Java and most C-style languages have type conversion; such operations may be unsafe at runtime, in that they can cause unwanted behavior due to incorrect typing of values when the program runs.
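
A small Java example makes both points at once: the first assignment is rejected at compile time, while the cast "loophole" compiles happily and only fails when the program runs.

public class TypeCheckDemo {

    public static void main(String[] args) {
        // Rejected at compile time: incompatible types.
        // String s = 42;

        // The "loophole": an explicit cast compiles fine...
        Object boxed = Integer.valueOf(42);
        String s = (String) boxed;   // ...but throws ClassCastException at run-time.

        System.out.println(s);
    }
}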

Dynamic typing

A programming language is said to be dynamically typed, or just ‘dynamic’, when the majority of its type checking is performed at run-time as opposed to at compile-time. Examples of languages that are dynamically typed include Clojure, Groovy, JavaScript, Lisp, Objective-C, Perl, PHP, Prolog, Python, Ruby, and Smalltalk. Dynamic typing can be more flexible than static typing (for example by allowing programs to generate types based on run-time data), since static type checkers may conservatively reject programs that actually have acceptable run-time behaviour. The cost of this additional flexibility is fewer a priori guarantees, since static type checking ensures that type errors will not occur in any possible execution of a program.

Dynamically-typed language systems, compared to their statically-typed cousins, make fewer “compile-time” checks on the source code (but will check, for example, that the program is syntactically correct). Run-time checks can potentially be more sophisticated, since they can use dynamic information as well as any information that was present during compilation. On the other hand, runtime checks only assert that conditions hold in a particular execution of the program, and these checks are repeated for every execution of the program. Static type checkers evaluate only the type information that can be determined at compile time, but are able to verify that the checked conditions hold for all possible executions of the program, which eliminates the need to repeat type checks every time the program is executed.

Development in dynamically-typed languages is often supported by programming practices such as unit testing. Testing is a key practice in professional software development, and is particularly important in dynamically-typed languages. In practice, the testing done to ensure correct program operation can detect a much wider range of errors than static type-checking, but conversely cannot search as comprehensively for the errors that both testing and static type checking are able to detect. Testing can be incorporated into the software build cycle, in which case it can be thought of as a “compile-time” check, in that the program user will not have to manually run such tests.

Static and dynamic type checking in practice

The choice between static and dynamic typing requires trade-offs.

Static typing can find type errors reliably at compile time. This should increase the reliability of the delivered program. However, programmers disagree over how commonly type errors occur, and thus what proportion of those bugs which are written would be caught by static typing. Static typing advocates believe programs are more reliable when they have been well type-checked, while dynamic typing advocates point to distributed code that has proven reliable and to small bug databases. The value of static typing, then, presumably increases as the strength of the type system is increased. Advocates of dependently-typed languages such as Dependent ML and Epigram have suggested that almost all bugs can be considered type errors, if the types used in a program are properly declared by the programmer or correctly inferred by the compiler.[2]

Static typing usually results in compiled code that executes more quickly. When the compiler knows the exact data types that are in use, it can produce optimized machine code. Further, compilers for statically typed languages can find assembler shortcuts more easily. Some dynamically-typed languages such as Common Lisp allow optional type declarations for optimization for this very reason. Static typing makes this pervasive. See optimization.

By contrast, dynamic typing may allow compilers to run more quickly and allow interpreters to dynamically load new code, since changes to source code in dynamically-typed languages may result in less checking to perform and less code to revisit. This too may reduce the edit-compile-test-debug cycle.

Statically-typed languages which lack type inference (such as Java and C) require that programmers declare the types they intend a method or function to use. This can serve as additional documentation for the program, which the compiler will not permit the programmer to ignore or permit to drift out of synchronization. However, a language can be statically typed without requiring type declarations (examples include Haskell, Scala and C#3.0), so this is not a necessary consequence of static typing.

Dynamic typing allows constructs that some static type checking would reject as illegal. For example, eval functions, which execute arbitrary data as code, become possible (however, the typing within that evaluated code might remain static). Furthermore, dynamic typing better accommodates transitional code and prototyping, such as allowing a placeholder data structure (mock object) to be transparently used in place of a full-fledged data structure (usually for the purposes of experimentation and testing). Recent enhancements to statically typed languages (e.g. Haskell Generalized algebraic data types) have allowed eval functions to be written in a type-safe way.[3]

Dynamic typing typically makes metaprogramming more effective and easier to use. For example, C++ templates are typically more cumbersome to write than the equivalent Ruby or Python code. More advanced run-time constructs such as metaclasses and introspection are often more difficult to use in statically-typed languages.

Static typing expands the possibilities for programmatic refactoring. For example, in a statically typed language, analysis of the source code can reveal all callers of a method, and hence a tool can consistently rename the method throughout all of its uses. In dynamic languages, this kind of analysis is either impossible or more difficult because the reference of a name (e.g. a method name) cannot surely be determined until runtime.

It is hard to say which of the two approaches scores higher without considering the business context in which they will be applied. Much of my development experience is on large-scale enterprise applications, mainly for the finance and telecoms industries. I understand that languages such as Java will not end up with fewer bugs simply because they are statically typed; that is part of the reason behind frameworks such as JUnit and Test Driven Development methodologies. When developing critical applications such as an online banking system, however, it is valuable to get rid of minor errors at compile time rather than having to write a unit test for that purpose. Unit tests should be focused on testing application and business logic. An example of such a test would be to make sure that a customer cannot transfer funds if his account balance is less than the amount being transferred.
Dynamic typing has its advantages. Let's take JavaScript as an example: if all you needed was to develop a mortgage calculator for the web application, then writing it in a dynamic language would make perfect sense. This type of application would take less time to write (in terms of lines of code) for an experienced Ruby/PHP developer than, let's say, in Java. Also, the fact that Java can integrate various scripting languages makes it a perfect fit for enterprise development. With JRuby, a Java developer can integrate Ruby with Java while leveraging their current technology investment.
I will give a thumbs-up to Java against Ruby. The compiler's check on object types saves me the time of writing my own type checks. Compilation improves application performance by avoiding type checks at runtime, although the possibility of type exceptions, such as downcasting errors, still exists. Overall, Java has matured much faster and has become a standard in its own right. The language specification is constantly being improved, and the features implemented are almost second to none compared to Ruby and PHP.
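
As an aside on the JRuby point, embedding Ruby in a Java application can be as small as the sketch below, assuming the JRuby jar is on the classpath so that its "jruby" script engine is registered.

import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;

public class RubyFromJava {

    public static void main(String[] args) throws Exception {
        // Requires the JRuby (complete) jar on the classpath.
        ScriptEngine ruby = new ScriptEngineManager().getEngineByName("jruby");
        if (ruby == null) {
            System.out.println("JRuby script engine not found on the classpath");
            return;
        }

        // A trivial Ruby snippet evaluated from Java: monthly interest on a mortgage.
        Object result = ruby.eval("rate = 0.045; 250_000 * rate / 12");
        System.out.println("Monthly interest: " + result);
    }
}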

In part 2, I will discuss developer productivity and application scalability (plus security).