
JavaOne day 1

Woke up early (7.00 am), ate something quickly and headed off to the Moscone Center for the JavaOne registration. Good thing: the Moscone Center is very close to where I am staying. Bad thing: most of the rooms are not theatrical, all seats are on the same level (although I have to admit that you have a clear view wherever you sit). Anyway, I got my pass, got my J1 bag and waited in the lobby until the CommunityOne general session started (around 9.30 am). So while I was waiting I thought I'd blog about it.

The Moscone Center is huge, and when I say huge I really mean huge; there are three wings (north, south and west) and all of them are in different blocks. The J1 event takes place in the north and south wings. So far it's been very well organised, the staff are very polite and always eager to help, and there are signs and guides everywhere in the centre so at any given time you know where to find your room. It's full of security people who are always checking your pass, so I'd guess it's very tough to get in here if you don't hold a valid one.

CommunityOne started at 9.45 am. Ian Murdock opened by giving us some information about J1. This is the second year that CommunityOne takes place, and the overall J1 conference has 50% more attendees (a total of 15,000) than last year. The computing world has evolved a lot in the past decade and the next natural stop is open source.

Jonathan Schwartz was called on stage and talked about the role of the community and how it has changed the face of computing forever. He went on to say that the purpose of the community is to create markets and opportunities.

Ian started talking again, mentioning that developers and managers should try to learn new things and make connections. The best and most important thing the community has to offer is innovation. A community, the Java community, is about people, about us. And when people are involved and get passionate there are disagreements. We should take the best out of these disagreements and turn them into something positive. Sun has evolved a lot in the past years; it has progressed, made mistakes and learned from them.

Another integral part of the community is a common set of interests and how we progress from one application to the other: how do we go from Eclipse to NetBeans or from Linux to Solaris? The answer is open standards. Open standards have enabled us to move from closed to open, from proprietary to free.

Ian said that back in 1993 he saw people from all around the world, with different cultures and languages, getting together and talking about the same things. This was truly remarkable. But you had to take all the different pieces and assemble them yourself: the Linux kernel, Linux drivers, Linux applications and so on. The idea, therefore, was simple: get all these things, put them together and deliver them to the world. Have open source standards and trends; move from monolithic to fine-grained applications. This increases flexibility and competition, and increased competition lowers prices.

Linux distributions changed everything. The biggest innovation of Debian was the way development took place. It showed the community how to maintain and distribute technology with a package installation system. Smaller independent developers could deliver their innovations to the market by using the Debian package installer.

The "wad of stuff" (speaking about Solaris) is a move from a monolithic to a modular architecture in OpenSolaris. Sun embraces the same model through the full product line: open source. They provide free and open source, tried and tested, production-ready solutions. You do not have to pay Sun anything; all of it is free. But if you need help to upgrade, scale or get support, that's how they make money. It's a win-win situation.

And how does open source relate to the new computing world? What does it mean? Ian mentioned that it hides complexity so developers can focus on the actual application and produce ready-to-market applications. OpenSolaris is a platform that enables developers to assemble the small pieces (IDE, compilers, drivers, tools etc.) they need in order to develop the application they want.

After that Marten Mickos (MySQL, now part of Sun), Jim Zemlin (Linux Foundation), Stormy Peters, Ted Leung, Jeremy Allison and Mike Evans went on stage and there was a conversation about the products they represent. Marten declared explicitly that MySQL is and will be open source software forever, with no exception to that rule.

Other interesting thoughts: the open source community needs all the developers, technical writers, testers etc. it can get. Originally open source asked people for either code contributions or cash. Now it asks for documentation and blogging, to make the community known. The only enemy of the community is obscurity. Committing code benefits everybody, and companies realised that early enough, which is why several of them assign people to work on open source projects full time.

A point made clear was about Samba. Samba is "just a bunch of guys"; the existence of a commercial Samba company does not break the open source model, and they refuse to have corporate contributions in their code base.

To a question about what a community needs in order to be successful, the answer was straightforward: you need a strong leader, you need a model for the community, and you have to evaluate the community. Open source developers are a different thing from the users. This takes an extra special step because you need someone to articulate the vision. Having a charismatic leader who can absorb all the feedback and make the right decisions helps a lot, because many of these projects are going to change the world.

At some point certain community members will be valued more than others. These decisions are based on the installed base there is, on their contribution to the community, and on their dedication.

Sun expects excitement and participation from the community. That's all they ask for. You want the enterprises to use your software, and community participation can help you do that.

Somebody asked whether Google is a good or bad example in the open source community. The answer is that Google is doing open source on its own terms and delivers software through the web. The good thing about Google is that it does not try to own the open source projects it works on. And this is the right way to engage with the open source world.

To a question about the top three things Sun can learn from Apache and Python, the answer wasn't clear (or I wasn't able to understand it). Reading between the lines, I understood that having the right person in the right place makes a big difference.

The panel closed by saying that there is tension between corporations and organic communities.

Richard Green was next on stage. He said that at Sun everything they do is about the rock stars, and the rock stars are the community. They made bits of Solaris available to the community and that was a great start. But open source is not about the bits; it's about the whole picture. Before you go on you have to think about how the model works and what amendments and changes you need to make to the model in order for the whole thing to work. They made a lot of progress with the help of many people around the world. They added new features and made the network a computer in order to support the whole ecosystem, so more people can contribute.

They announced the first fully supported release of OpenSolaris. It's the centre of gravity of a whole ecosystem and includes features such as iSCSI, ZFS, containers, FMA, virtualisation, DTrace, CIFS, Clearview, a hypervisor, a device detection tool, D-Light, IPS, live USB, MySQL, Ruby, PHP, Apache, GNOME and other open source projects.

Then they demonstrated OpenSolaris from a live CD, as well as DTrace (which was first introduced in Solaris 10 and has now spread to other projects such as Java, Ruby and Firefox 3).

Jim (CTO of the Solaris organisation) did a demo of a system with several hard disks plugged into it, running OpenSolaris. They literally smashed one of the hard drives (using an anvil and a hammer) and the system kept on as if nothing had happened. They destroyed a second hard drive and, again, the system kept on as if nothing had changed. OpenSolaris was able to identify the failed disks and continued by using the rest of them. The failed disks were replaced on the spot with brand new ones; they were just plugged in, the operating system recognised them, used ZFS to replicate the missing data from the remaining disks onto the new ones, and continued as normal. Seamless integration, and very useful if you depend on critical data.

Next, David Stewart (engineering manager, Intel Corporation) talked about how they make sure that Intel chips are well suited for OpenSolaris.

Open source tools for optimising your development process

Many developers think that the goal of a software development team is to build software within time, scope and budget. But the real goal is to build the best possible application within those time and budget constraints. We need to build higher quality, more flexible and more useful software that corresponds to what the users want.

The traditional approach results in poorly tested and inflexible (difficult to maintain and extend) code, as well as difficult integration phases, bad coding standards and programming habits. In most cases the documentation is also out of date, as developers tend to forget to update it when they update the project.

An improved approach is to use newer techniques such as better build scripts, better dependency management and good testing practices. Automating the build process and doing continuous integration always helps: code quality is checked automatically, we end up with a tighter issue tracking system and, to some extent, we can have automatic technical documentation.

Build scripts are the cornerstone of good software development, since they make builds reproducible and portable and they automate the building process. The two best-known tools for this job are Ant and Maven 2.

Ant has several advantages: it is well known and widely used, powerful and flexible. On the other hand, it needs loads of low-level code.

Maven 2 uses a declarative build script framework which lets the developer describe the application (what we want to do) while Maven figures out how to do it. It offers higher-level scripting, strong use of standards and conventions, loads of plugins, "convention over configuration" and good reporting features. But Maven can be more rigid than Ant (if a project does weird or complicated things, Ant will be better).

Maven has a standard directory structure and a standard life cycle (you start by declaring resources, then compile, then test-compile etc.), declarative transitive dependency management and good support for multi-module projects.

Maven also has better dependency management. It's very common for a Java application to need jar files and libraries in order to work. These jar files in turn need jar files themselves: for every library you use, it's more than likely that you need a whole stack of other libraries.

The traditional approach is to store these jar files locally. If each project has its own jar files, it's very hard to keep track of which version each application is using. Duplication of jar files/libraries is also very likely, you might get errors due to incompatible jars, and you might overload the source code repository.

All these issues are solved by declarative dependency management: versioned jar files are stored on a central server, each project declares the versions of the jars it needs, and it gets the relevant jars from the central server.

Maven 2 has built-in declarative dependency management, rich public repositories and, if you want better performance, you can install a local enterprise-level repository. Of course, what you can do with Maven you can also do with Ivy for Ant. Ivy provides Maven-style dependencies for Ant; it's a bit more powerful than Maven but also a bit more complicated to set up.

The cornerstone of development is unit testing. It ensures that the code behaves as expected and makes the code more flexible and easier to maintain. It also helps detect regressions early in the development life cycle and documents the code. The drawback is that you have more code to write and maintain, but in exchange you get more reliable code with fewer bugs which is also easier to maintain.

The latest version of JUnit, JUnit 4.4, provides many features that make writing tests easier and more productive, like annotations, annotations for testing timeouts and exceptions, parameterised tests and theories for better test coverage. There are a few differences between JUnit 3 and JUnit 4.

JUnit 3

  • you needed to extend the TestCase class
  • you used the setUp() and tearDown() methods
  • your test methods had to start with “test”

JUnit 4

  • any class can contain tests
  • you have annotations like @Before, @After etc.
  • you mark tests with the @Test annotation
  • you can test timeouts and exceptions, and have parameterised tests

With JUnit 4.4 you also get Hamcrest asserts, which are a more expressive way of writing assertions. Traditionally with JUnit you use the assertTrue or assertEquals methods, but with Hamcrest you use assertThat, making the code more readable. You also get more readable and informative error messages, and you can combine constraints by using methods like not().
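The differences above are easier to see in code. Here is a minimal sketch, assuming JUnit 4.4 on the classpath (which bundles the Hamcrest core matchers); the Account class and all the method names are made up for illustration.

```java
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.CoreMatchers.not;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertThat;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

// Hypothetical class under test.
class Account {
    private int balance;
    void deposit(int amount) {
        if (amount <= 0) throw new IllegalArgumentException("amount must be positive");
        balance += amount;
    }
    int getBalance() { return balance; }
}

// Note: no TestCase superclass and no "test" name prefix needed.
public class AccountTest {
    private Account account;

    @Before  // replaces JUnit 3's setUp()
    public void createAccount() { account = new Account(); }

    @After   // replaces JUnit 3's tearDown()
    public void cleanUp() { account = null; }

    @Test
    public void depositIncreasesBalance() {
        account.deposit(100);
        assertEquals(100, account.getBalance());
        // Hamcrest style, with a combinable constraint:
        assertThat(account.getBalance(), is(not(0)));
    }

    @Test(expected = IllegalArgumentException.class)  // exception testing via annotation
    public void negativeDepositRejected() {
        account.deposit(-5);
    }

    @Test(timeout = 1000)  // fails if the test runs longer than a second
    public void depositIsFast() {
        account.deposit(1);
    }
}
```

In JUnit 3 the last two tests would have needed a hand-written try/catch and a manual timer; here the annotation parameters do the work.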

There are also several test coverage tools that help you write better unit tests and show how much of your code is actually being executed. These coverage tools can be integrated into the build process (like Cobertura, which can run with every build) or into the IDE (like EclEmma for Eclipse; NetBeans 6.1 already has a test coverage plugin integrated, and there are Crap4J and many more). These tools are far more convenient for the developer than reading an HTML report.

Another technique developers can use is continuous integration, which needs to be done alongside testing. Continuous integration integrates and compiles code from different developers on a central build server. Having several developers commit code to a central server is the most common and widely used practice in modern programming, which makes continuous integration a core best practice of modern software development.

In order to do continuous integration we need an automated build process (Ant, Maven, make), an automated test process (JUnit, TestNG), a source code repository (CVS, SVN, StarTeam etc.) and a continuous integration build tool (CruiseControl, Continuum, Hudson, LuntBuild etc.). If there are any issues, the continuous integration server will notify the developers (mainly via e-mail) of the problem. With continuous integration we get better and more flexible code because

  • we do regular commits (at least once a day)
  • builds and reporting are automatic
  • releases happen regularly and testing happens more often
  • there are fewer bugs
  • bug fixes are faster

Another way to get better code quality is to enforce coding standards. By enforcing coding standards we get better quality code that is easier to maintain, and potential bugs are easier to detect. It's also easier to train new staff, since everyone follows the same guidelines.

Code quality can also be enforced by doing manual code reviews, although these are rarely done systematically and can be slow and time consuming. Automatic code audits, on the other hand, are easier to perform and can run on a regular basis. There are several tools that help:

  • Checkstyle (coding standards: naming conventions, indentation, Javadocs)
  • PMD (best practices: empty try/catch/finally blocks, null pointer checks, overly complex methods etc.; a bit harder to use than Checkstyle but very informative)
  • FindBugs (potential defects: potential null pointer exceptions, fields that could be modified where they shouldn't be, infinite loops etc.)
  • Crap4J (overly complex and poorly tested classes; combines complexity metrics with test coverage)
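To make the tool list above concrete, here is a made-up pair of snippets showing the kind of code these audits flag; the class and method names are invented for illustration, and the exact rule names are the ones PMD and FindBugs commonly report.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class AuditExamples {

    // PMD would flag this ("EmptyCatchBlock"): the exception is
    // silently swallowed, so a missing file goes completely unnoticed.
    static String firstLineBad(String path) {
        try (BufferedReader in = new BufferedReader(new FileReader(path))) {
            return in.readLine();
        } catch (IOException e) {
            // empty catch block -- the error disappears here
        }
        return null;
    }

    // FindBugs would flag callers of firstLineBad(): the null return
    // invites a NullPointerException. Logging the failure and returning
    // a safe default keeps the contract explicit.
    static String firstLineGood(String path) {
        try (BufferedReader in = new BufferedReader(new FileReader(path))) {
            String line = in.readLine();
            return line != null ? line : "";
        } catch (IOException e) {
            System.err.println("Could not read " + path + ": " + e.getMessage());
            return "";
        }
    }
}
```

Running the audits as part of every build means such patterns get caught long before a code review would.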

Automated documentation can also help, as opposed to manual documentation which is written once and then forgotten. Automatic documentation is complete, always up to date and cheap to produce, but it lacks a "higher vision" (it tends to be a bit dry and not very usable). The simplest way to enforce automatic documentation is to get developers to write Javadoc comments.
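As a small illustration of the Javadoc habit, here is an invented utility class with the kind of comments that the javadoc tool turns into browsable API documentation automatically.

```java
/**
 * A made-up utility class for temperature conversions, used here only
 * to show standard Javadoc comments.
 */
public class Temperature {

    /**
     * Converts a temperature from Celsius to Fahrenheit.
     *
     * @param celsius the temperature in degrees Celsius
     * @return the equivalent temperature in degrees Fahrenheit
     */
    public static double toFahrenheit(double celsius) {
        return celsius * 9.0 / 5.0 + 32.0;
    }
}
```

Because the comments live next to the code, regenerating the docs on every build keeps them up to date for free; the "higher vision" still has to be written by hand.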

Q&A

Are there any open source tools for testing web applications? There are Selenium, Canoo WebTest and HttpUnit. Selenium is very useful since it drives the actual web browser.

Will JUnit 4 run JUnit 3 tests? Yes

What about EJB testing? The problem is that you have to deploy the EJBs before you can run the tests. Write the EJBs as POJOs first in order to test the functionality. Alternatively, use tools like MockEJB.

How do you get new people to write unit tests? New people don't like unit testing. The best way to get them into it is to pair them with experienced programmers.

JavaFX – Mezzanine room 236

This was a session I didn’t have in my schedule. We got into Mezzanine room 236 and there were announcements and discussions about where JavaFX is going.

The session started by explaining what exactly FX is: a RIA (Rich Internet Application) platform, a way to enhance the Java ecosystem and to expand the universe of people who work with Java. FX's aim is to create new things that Java is and can do. FX is Java everywhere. It's happening: you can actually build Java applications that run anywhere and everywhere, be it desktop, browser or mobile devices. And all of this is still part of Java, built on Java and running on Java. The whole FX investment is in Java.

With FX one gets better browser and desktop support. To the question "how do we meet the needs of what people want", the answer is that all systems will eventually be built in JavaFX using different modules, and therefore they can run literally anywhere there is a JVM installed, since JavaFX can be compiled into bytecode.

What about JavaFX on the Mac? Apple is very focused on not saying or doing anything until a product is released. Sun is having conversations with Apple at multiple levels; they are conversations that are happening. Support for Java on the Mac is of the highest priority for Sun, but we have to bear in mind that although Sun understands developers, this is not entirely in Sun's hands.

To the question "What are the biggest risks with JavaFX right now?" the answer was "delivery". Sun has to hit the timeline and the deadlines it has for FX delivery, and they are working hard on it since they want to deliver a JavaFX implementation that can live up to the expectations.

With JavaFX Sun is trying to create a common solution for the billions of devices running Java. Across every place you have Java, they are trying to find a common and unified solution that fits all.

In a question “What developer problem are you solving with JavaFX?” the answer is that JavaFX can save time developing applications. It is easy to expand it and due to being a Java solution you can actually use existing multimedia libraries and applications.

An interesting thing mentioned is that Sun will choose to deliver different layers of JavaFX on mobile phones. This means that there will be different profiles (a profile can be thought of as a different level of capability). This should not affect JavaFX applications, at least not most of them. But it will affect devices with weaker processing power: if you have, for example, a cheap phone that does not fully support the JavaFX capabilities, you won't be able to get the full potential of JavaFX.

JavaFX can be thought of as an engine that drives all the capabilities exposed to the Java runtime. It is different from Flex: with Flex you have to create a Flash component (so you need a Flash designer) and then call it from Flex, whereas with JavaFX you create and call the objects directly.

JUG Leaders – Think Globally, Act Locally

Due to the JavaFX session and some misleading data on the Java SunSPOT site I was a bit late for the JUG leaders session, so I got there half an hour after it had started. Sorry I couldn't make it on time guys. As a result I didn't even keep a log of what was said during the session.

NetBeans platform success stories

This was a session about the NetBeans platform and how it can be used to build RIAs.

The NetBeans platform is a modular system. It's modular because modules are easier to understand than a monolithic system with spaghetti code. It's a platform for Java applications which is open source, written in pure Java (so you can reuse all the code you had written before), stable and mature (it's been around for a long time, longer than seven years), and you can also call it an RCP (Rich Client Platform).

The platform offers several advantages, including a windowing system, built-in build scripts, declarative configuration (an XML file where you can include and exclude features), auto-update, the ability to reuse any IDE feature, and modules that make common tasks easy (dialogs, file I/O, threading and progress notifications, support for custom projects/files etc.).

Then a nice demo of creating a simple platform application using NetBeans followed.

Another demo, of an RCP application, followed.

Another demo followed; this time it was the Blue Marine application which, I have to admit, was quite impressive. The overall look and feel reminded me of the Azureus Vuze layer (same colours, same effects etc.), but this one was written using the NetBeans platform.

Blue Marine started in 2003 as a Swing application, but in 2005 it was completely rewritten from scratch using the NetBeans platform. In this demo application one could choose several photos from a library and manipulate them (make them brighter, scale them etc.). Then these photos could be added as pins on a global map, so you could actually visit places and pin down the photos you had taken there. As I said, this was a very impressive demo and shows the full potential and capabilities of the NetBeans platform. It also reminded me a lot of the Parleys demo I saw last year at Javoxx.

If you want to learn more about the platform you can check the following resources:

  • platform.netbeans.org
  • Rich client programming book
  • Fabrizio’s JavaOne presentation (TS-5483)
  • Tom’s JavaOne presentation (TS-5541)

If you are looking for training, there is professional training by the NetBeans experts, which covers

  • developing on the NetBeans platform from the ground up
  • several levels of NetBeans Platform Certification
  • community training sessions around the world
  • a customised version of the course, also available through the Sun Learning Centre

Visit http://edu.netbeans.org, or send an e-mail to users@edu.netbeans.org if you want to become a certified NetBeans engineer.

Q&A

JGoodies with the NetBeans platform: the platform integrates well with JGoodies, although the speakers have not used all of the JGoodies features.

NetBeans IDE, lightning talks. Cool stuff and more with James Gosling

This was a session with James Gosling and guests, each talking for approximately ten minutes about the NetBeans platform.

Adam Myatt is the author of the Pro NetBeans IDE 6 book. He said that his favourite NetBeans features are the out-of-the-box functionality, the NetBeans profiler and how easy it is to measure and profile an application.

Dr B.V. Kumar is the author of Delivering SOA using the Java Enterprise Edition Platform (with co-authors Prakash Narayan and Tony Ng). He likes the ease with which NetBeans lets you create SOA services and clients and hook them up to different, disparate services.

Bruno Souza and Tom (sorry, didn't catch the full name) gave a speech about Sun's NetBeans development programme. Sun gives money to six communities to do open source projects. The total amount given is $175,000, divided among twenty different projects (ten big and ten small). They have hundreds of submissions to go through in order to choose the best; only twenty make it in the end. To get the money, the teams have to finish and deliver their projects. You can visit netbeans.org/grant for the winners.

Chris Palmer from Oracle developed Learning 360, described as "ERP for education", which is essentially a NetBeans learning application that uses the Visual Library. It can support more than 100,000 concurrent users, mainly tutors, students and parents. Learning 360 provides the same sort of environment for students and teachers alike. After five minutes Chris showed us a demo of the application, and he said he chose the Visual Library because they were running out of time and it gave them a stable container for all the things they needed to do (dragging, drawing etc.).

Mark from dotFX Inc talked about secure rich internet applications based on Java. Their software enables the deployment of secure RIAs using ordinary Java. By using transparent runtime services they managed to solve long-standing software problems like changes in the software life cycle (versioning/update problems, vendor lock-in etc.). dotFX comes as a free NetBeans plugin.

When you run an application using dotFX it actually runs in a very functional sandbox, and the user doesn't have to do anything.

