Book Review: Enterprise Java Microservices by Ken Finnigan

19 December 2018

Enterprise Java Microservices cover

General information

  • Pages: 245
  • Published by: Manning
  • Release date: nov 2018

Disclaimer: I received this book as a collaboration with Manning and the author.

Some context on learning Microservices

Let's clarify one thing: switching from Monoliths to Microservices is not like switching from traditional Java EE to Spring (and vice versa), it's more like learning to develop software all over again, because you have to learn:

  • New architectural patterns
  • How to work with distributed transactions
  • How to guarantee consistency in distributed data (or not)
  • New plumbing/management tools like Docker, Kubernetes, Pipelines
  • New frameworks and/or old frameworks with new Cloud Native capabilities

Despite really good publications like the twelve-factor methodology for cloud native applications, from my perspective the real challenge in learning Microservices is the lack of a central documentation/tutorial/from-0-to-wow guide.

In the old days, if you wanted to learn enterprise software development in Java, you simply took the Java EE tutorial from 0 to 100, maybe a second tutorial like Ticket Monster, a couple of reference books, and that was it. However, learning Microservices is more like:

  • Learning the new cloud native capabilities of old frameworks from documentation that takes your knowledge of previous versions of the framework for granted
  • Or learning new frameworks described with words like reactive, functional, non-blocking, bazinga
  • Learning new design patterns without knowing the real motivation behind each pattern
  • Surfing through a lot of Reddit and StackOverflow discussions with not-so-clear explanations
  • Learning Docker and Kubernetes because everybody is doing it
  • Hitting dead ends because you are accustomed to things like JNDI, JTA, Stateful sessions

About the book

Do you see the problem? The lack of a cohesive guide that explains what the zillion new tools are, how they work and why you need them makes the path of learning Microservices an uphill battle, and that's where Enterprise Java Microservices shines.

Enterprise Java Microservices is one of the first attempts to create such a cohesive guide, with special focus on Java EE developers and MicroProfile. The book is divided into two main sections: the first focused on Microservice basics (my favorite) and the second focused on Microservices implementations with specific libraries (being a little bit biased toward Red Hat solutions).

First section

For those who already have some Enterprise Java knowledge but don't have much idea about the new challenges in the Microservices world, this will be the most useful section. With your current knowledge you should be able to understand:

  • The main differences between the monolith and microservices architectural styles and patterns; this is by far the most important section for me, it gave me a couple of new ideas and debunked some misconceptions that I had
  • Just Enough and Micro application servers/frameworks
  • New jargon from the cloud native world (yup, you have to learn a lot of new acronyms)
  • General knowledge on testing Microservices
  • Why you need new tools like service discovery, service registry, gateways, etc.
  • Principles of reactive applications

This section also includes a brief introduction to API/REST web services creation with Java EE and Spring; however, I'm not sure how this section will feel to a newcomer.

Second section

Once you have taken in the basics from the first section (or from bad implementations and dead ends in real life, like me), this section covers three of the most complex and important topics in Microservices:

  1. Service discovery and load balancing
  2. Fault tolerance (circuit breakers, bulkheads, etc.)
  3. Strategies for management and monitoring microservices

The chapters of the second section are structured in a way that makes it easier to understand why you need the "topic" you are about to learn. However, despite the early focus on MicroProfile, it uses specific libraries like Hystrix, Ribbon and Feign (yes, like in Spring Boot) for the implementation of these patterns, adding also some specific integrations with Thorntail.

Later on, the book evolves around a sample application, covering topics like security, architectures with Docker/Kubernetes and data stream processing, where, as I said, a particular bias is present, implementing the solutions by using Minishift, Keycloak and other Red Hat specifics.

To clarify, I don't think this bias is bad; considering that there are (as I said) a zillion alternatives for those factors not covered by Java, I think it was the right decision.

Things that could be better

As with any review this is the most difficult section to write, but I think that a second edition should include:

  • MicroProfile fault tolerance
  • MicroProfile type safe clients

Since Ken is one of the guys behind the MicroProfile implementation at Red Hat, I'm guessing that the main motivation for not including them is simple: MicroProfile is evolving so fast that these standards were presented after the book went to print.

People looking for data consistency patterns (CQRS, Event Sourcing, eventual consistency, etc.) could find the stream processing section a little smaller than it should be.

Who should read this book?

  • Java developers with strong foundations, or experience in Spring or Java EE

Yes, despite its coverage of basic APIs, I think you need foundations like those covered in the OCP certification; otherwise you will be confused by the implementation of annotations, callbacks, and declarative and imperative code.

Eclipse MicroProfile Metrics, practical use cases

31 October 2018


At the end of the 2000s, one of the main motivations for the creation of DevOps was the (relatively speaking) neglected interaction between development, QA and operations teams.

In this line, the "DevOps promise" could be summarized as the union of culture, practices and tools to improve software quality and, at the same time, to speed up software development teams in terms of time to market, gaining in many successful implementations collateral effects like scalability, stability, security and development speed.

In line with diverse opinions, to implement a successful DevOps culture IT teams need to implement practices like:

  • Continuous integration (CI)
  • Continuous delivery (CD)
  • MicroServices
  • Infrastructure as code
  • Communication and collaboration
  • Monitoring and metrics

In this world of buzzwords it's indeed difficult to assess the DevOps maturity state without losing sight of the final goal: to create applications that generate value for the customer in short periods of time.

In this post we will discuss monitoring and metrics in the enterprise Java world, especially how the new DevOps practices impact the architectural decisions at the technology selection phase.

Metrics in Java monoliths

If traditional architectures are considered, monolithic applications hold common characteristics like:

  1. Execution over servlet containers and application servers, with the JVM being a long running process.
  2. Ideally, these containers are never rebooted, or reboots are only allowed in planned maintenance windows.
  3. An application could compromise the integrity of the entire monolith under several conditions like bad code, bad deployments and server issues.

Without considering any particular Java framework, the deployment structure will be similar to the figure. Here we observe that applications are distributed by using .war or .jar files, collected in .ear files for management purposes. In these architectures applications are created as modules and separated according to their business objectives.

Monolith deployment

If a need to scale appears, there are basically two options: 1- scale server resources (vertical) or 2- add more servers with a copy of the application and distribute clients using a load balancer. It is worth noticing that new nodes tend to also be long running processes, since any application server reboot implies a considerable amount of time that is directly proportional to the number of applications deployed.

Hence, the decision to provision (or not) a new application server is often a combined decision between development and operations teams, and with this you have the following options to monitor the state of your applications and application server:

  1. Vendor or vendor-neutral telemetric APIs -e.g. Jolokia, Glassfish REST Metrics-
  2. JMX monitoring through specific tools and ports -e.g. VisualVM, Mission Control-
  3. "Shell wrangling" with tail, sed, cat, top and htop

It should also be noticed that in this kind of deployment it's necessary to obtain metrics from both the application and the server, creating a complex scenario for status monitoring.

The rule of thumb for these scenarios is often to choose telemetric/own APIs to monitor application state and JMX/logs for a deeper analysis of the runtime situation, which again presents more questions:

  • Which telemetric API should I choose? Do I need a specific format?
  • Would the server's telemetry be sufficient? How will metrics be processed?
  • How do I get access to JMX in my deployments if I'm a container/PaaS user?
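Much of what tools like VisualVM expose over JMX comes from the JDK's platform MXBeans, which you can also read programmatically; a minimal sketch using only the standard library:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.OperatingSystemMXBean;

public class JvmTelemetry {
    public static void main(String[] args) {
        // Heap usage: the same data VisualVM plots over JMX
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        long usedHeap = memory.getHeapMemoryUsage().getUsed();
        long maxHeap = memory.getHeapMemoryUsage().getMax();

        // OS-level information
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();

        System.out.println("Used heap (bytes): " + usedHeap);
        System.out.println("Max heap (bytes): " + maxHeap);
        System.out.println("Available processors: " + os.getAvailableProcessors());
    }
}
```

The same beans are what a remote JMX client reads through a JMX port, which is exactly what becomes awkward to reach in container/PaaS deployments.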

Reactive applications

One of the (not so) recent approaches to improve user experience is to implement reactive systems following the principles defined in the Reactive Manifesto.

Reactive manifesto

In short, a reactive system is a system capable of:

  • Being directed/activated by messages, often processed asynchronously
  • Presenting resilience by failing partially, without compromising the whole system
  • Being elastic, provisioning and halting modules and resources on demand, with a direct impact on resource billing
  • The final result is a responsive system for the user

Despite not being discussed so often, reactive architectures have a direct impact on metrics. With dynamic provisioning you won't have long running processes to attach to and save metrics from, and additionally the services switch IP addresses depending on client demand.

Metrics in Java Microservices architectures

Without considering any particular framework or library, the reactive architectural style strongly suggests the usage of containers and/or Microservices, especially for resilience and elasticity:

Microservices deployment

In a traditional Microservices architecture we observe that services are basically short-lived "workers", which could have clones reacting to changes in client demand. These environments are often orchestrated with tools like Docker Swarm or Kubernetes, hence each service is responsible for registering itself with a service registry acting as a directory. The registry then becomes the ideal source for any metrics tool to read service locations, pulling and saving the corresponding metrics.

Metrics with Java EE and Eclipse MicroProfile

Despite early efforts in the EE space, like an administrative API with metrics, the need for a formal metrics standard became mandatory with the popularization of Microservices, Dropwizard Metrics being one of the pioneers covering the need for a telemetric toolkit with specific CDI extensions.

In this line, the MicroProfile project has included among its recent versions (1.4 and 2.0) support for Healthcheck and state Metrics. Many current DropWizard users will notice that the annotations are similar if not the same; in fact MicroProfile annotations are based directly on DropWizard's API 3.2.3.

To differentiate between the concepts, the Healthcheck API is in charge of answering a simple question -"Is the service running, and how well is it doing?"- and is targeted at orchestrators. On the other hand, Metrics presents instant or periodic metrics on how services are reacting to consumer requests.

The latest version of MicroProfile (2.0) includes support for Metrics 1.1, including:

  • Counted
  • Gauge
  • Metered
  • Timed
  • Histogram

So, it is worth giving them a try with practical use cases.

Metrics with Payara and Eclipse MicroProfile

For the test we will use an application composed of two microservices, as described in the diagram:

Arquitectura tests

Our scenario includes two microservices: OmdbService, focused on retrieving up-to-date movie information from OMDb, and MovieService, aimed at obtaining movie information from a relational database and mixing it with the OMDb plot. The projects' code is available at GitHub.

To activate support for MicroProfile 2.0 two things are needed: 1- the right dependency in pom.xml and 2- deploying/running our application over a MicroProfile compatible implementation, like Payara Micro.
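For reference, such a dependency usually looks like this in pom.xml (a sketch; verify the exact version against your Payara Micro release):

```xml
<dependency>
    <groupId>org.eclipse.microprofile</groupId>
    <artifactId>microprofile</artifactId>
    <version>2.0</version>
    <type>pom</type>
    <scope>provided</scope>
</dependency>
```

The provided scope matters here: the implementation classes ship with the runtime, so the application only compiles against the API.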


MicroProfile uses a basic convention in regard to metrics, presenting three levels:

  1. Base: Mandatory Metrics for all MicroProfile implementations, located at /metrics/base
  2. Application: Custom metrics exposed by the developer, located at /metrics/application
  3. Vendor: Any MicroProfile implementation could implement its own variant, located at /metrics/vendor

Depending on the request's Accept header, metrics will be available in JSON or OpenMetrics format, the latter popularized by Prometheus, a Cloud Native Computing Foundation project.

Practical use cases

So far, we've established that:

  1. You can monitor your Java application by using telemetric APIs and JMX
  2. Reactive applications present new challenges, especially due to microservices' dynamic and short-running nature
  3. JMX is sometimes difficult to implement in container/PaaS based deployments
  4. MicroProfile Metrics is a new proposal for Java (Jakarta) EE environments, working equally well for monolith and microservices architectures

In this post we present a couple of cases also discussed at Oracle Code One:

Case 0: Telemetry from JVM

To give a quick look at MicroProfile Metrics, it is enough to boot a MicroProfile compliant app server/microservice framework with any deployment. Since Payara Micro is compatible with MicroProfile, metrics will be available from the beginning at http://localhost:8080/metrics/base.

Base metrics

You can switch the Accept request header in order to obtain JSON format, with curl for instance:

curl -H "Accept: application/json" http://localhost:8080/metrics/base

JSON metrics

By themselves these metrics are just an up-to-date snapshot of the platform state. If you want to compare these snapshots over time, metrics should be retrieved at constant intervals . . . or you could integrate Prometheus, which already does it for you. Here I demonstrate some useful queries for JVM state, including heap state, CPU utilization and GC executions:
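If you let Prometheus do the periodic retrieval, a minimal scrape configuration could look like this (a prometheus.yml sketch; the job name and interval are assumptions):

```yaml
scrape_configs:
  - job_name: 'payara-micro'
    # MicroProfile exposes OpenMetrics-formatted data here by default
    metrics_path: /metrics
    scrape_interval: 15s
    static_configs:
      - targets: ['localhost:8080']
```

Prometheus will then pull and store the samples on every interval, so the queries below run against its local time series database.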


Heap metrics


CPU metrics


GC metrics

Case 1: Metrics for Microservices

In a regular and "fault tolerant" request from one microservice to another, your communication flow will go through the following decisions:

Metrics fallback patterns

  1. To use or not a cache to attend the request
  2. To cut the communication (circuit breaker) and execute a fallback method if a metric threshold has been reached
  3. To reject the request if a bulkhead has been exhausted and execute a fallback method instead
  4. To execute a fallback method if the execution reached a failed state

Many of the caveats of developing Microservices come from the fact that you are dealing with distributed computing, hence you should include new patterns that already depend on metrics. If metrics are being generated and exposed, you gain data for improvements, diagnosis and issue management.
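To make the circuit breaker decision above concrete, here is a conceptual plain-Java sketch (not Hystrix nor the MicroProfile API): after a failure threshold is reached, the circuit opens and calls go straight to the fallback.

```java
import java.util.function.Supplier;

// Minimal illustrative circuit breaker: counts consecutive failures and,
// once a threshold is reached, short-circuits straight to the fallback.
public class SimpleCircuitBreaker<T> {
    private final int failureThreshold;
    private int consecutiveFailures = 0;

    public SimpleCircuitBreaker(int failureThreshold) {
        this.failureThreshold = failureThreshold;
    }

    public T call(Supplier<T> service, Supplier<T> fallback) {
        if (consecutiveFailures >= failureThreshold) {
            return fallback.get(); // circuit open: don't even try the service
        }
        try {
            T result = service.get();
            consecutiveFailures = 0; // a success closes the circuit again
            return result;
        } catch (RuntimeException ex) {
            consecutiveFailures++;
            return fallback.get();
        }
    }
}
```

Real implementations add half-open states, timeouts and sliding windows, but the metric-driven threshold is the core idea.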

Case 1.1: Counted to retrieve failed hits

The first metric to implement will be Counted, a pretty simple one actually. Its main objective is to increment/decrement a value over time. In this use case the metric counts how many times the service reached the fallback alternative, by injecting the counter directly into a JAX-RS service:

@Inject
Counter failedQueries;

@Fallback(fallbackMethod = "findByIdFallBack")
public Response findById(@PathParam("id") final String imdbId) {
    ...
}

public Response findByIdFallBack(@PathParam("id") final String imdbId) {
    failedQueries.inc(); //each fallback execution increments the counter
    ...
}

After simulating a couple of failed queries over the OMDb database (no internet :) ), the metric application:com_nabenik_omdb_rest_omdb_endpoint_failed_queries shows how many times my service has invoked the fallback method:

Counted metrics

Case 1.2: Gauge to create your own metric

Although you could depend on simple counters to describe the state of any given service, with Gauge you can create your own metric . . . like a dummy metric that displays 100 or 50 depending on an odd/even random number:

@Gauge(unit = "ExternalDatabases", name = "movieDatabases", absolute = true)
public long getDatabases() {
	int number = (int)(Math.random() * 100);
	int criteria = number % 2;
	if(criteria == 0) {
		return 100;
	}else {
		return 50;
	}
}

Again, you can search for the metric in Prometheus, specifically application:movie_databases_ExternalDatabases:

Gauge metrics

Case 1.3: Metered to analyze request totals

Are you charging your API per request? Don't worry, you can measure the usage rate with @Metered.

@Metered(name = "moviesRetrieved",
	unit = MetricUnits.MINUTES,
	description = "Metrics to monitor movies",
	absolute = true)
public Response findExpandedById(@PathParam("id") final Long id) 

In this practical use case around 500 requests were simulated over a one minute period. As you can observe from the metric application:movies_retrieved_total, the stress test from JMeter and Prometheus show the same information:

Stress metrics
Metered metrics

Case 1.4: Timed to analyze your response performance

If used properly, @Timed will give you information about request performance over time units.

@Timed(name = "moviesDelay",
	description = "Metrics to monitor the time for movies retrieval",
	unit = MetricUnits.MINUTES,
	absolute = true)
public Response findExpandedById(@PathParam("id") final Long id)

By retrieving the metric application:movies_delay_rate_per_second it's observable that requests take more time to complete at the end of the stress test (as expected: with more traffic, less bandwidth and more time to answer):

Timed metrics

Case 1.5: Histogram to accumulate useful information

As described in Wikipedia, a histogram is an accurate representation of the distribution of numerical data. Hence we could create our own distribution by manipulating the Metrics API directly with any given data, like global attendees:

@Inject
MetricRegistry registry;

public Response addAttendees(@PathParam("attendees") Long attendees) {
	Metadata metadata = new Metadata("matrix attendees", MetricType.HISTOGRAM);
	Histogram histogram = registry.histogram(metadata);
	histogram.update(attendees);
	return Response.ok().build();
}

And as with any distribution, we can get the min, max, average and values per quantile:

Histogram metrics
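For intuition on what those snapshot values mean, here is a plain-Java sketch (illustration only, not the MicroProfile API) computing min, max, mean and a nearest-rank quantile over recorded values:

```java
import java.util.Arrays;

// Illustrative snapshot over a fixed set of recorded values,
// mimicking what a metrics histogram reports.
public class DistributionSnapshot {
    private final long[] sorted;

    public DistributionSnapshot(long[] values) {
        this.sorted = values.clone();
        Arrays.sort(this.sorted);
    }

    public long min() { return sorted[0]; }
    public long max() { return sorted[sorted.length - 1]; }

    public double mean() {
        return Arrays.stream(sorted).average().orElse(0);
    }

    // Nearest-rank quantile, e.g. q = 0.5 for the median
    public long quantile(double q) {
        int index = (int) Math.ceil(q * sorted.length) - 1;
        return sorted[Math.max(index, 0)];
    }
}
```

Production histograms use reservoir sampling instead of keeping every value, but the reported figures are the same kind of summary.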

Book Review: Best Developer Job Ever! by Bruno Souza

22 August 2018

Best Developer Job Ever! cover

General information

  • Pages: 95
  • Published by: Amazon Digital Services LLC
  • Release date: july 2018

About the author

I should start this review with a personal comment about the author. Bruno, as many Java developers know, is one of the most influential leaders in the Java community, being in my case one of the people who advised me to create a Duke's Choice Award winning project. Hence when he announced the release of his developer advisory book, it went directly to my reading list.

About the book

Different from some motivational books that I've read, this book is focused on just one topic, with a practical approach:

How do I improve my professional career as a software developer?

For a regular book reader it will be a very short book; it took me 3 hours to read it from start to end.

The book is divided into five main sections:

  1. Get clarity on your strengths
  2. Define your objectives
  3. (How to) Expand your network
  4. Promote yourself (in the right way)
  5. Get quality interviews

From reading just the section titles, readers could be tempted to guess that this book focuses on boosting any professional career; however, each section and piece of advice is tailored to IT, following an evolutionary approach:

  • Each piece of advice starts with an argumentative step, presenting diverse paths to reach the same professional objective in IT and especially in software development
  • After the argumentation, and based on the author's real world experience, advice is presented on how to choose the best path and take advantage of it in a productive way
  • Finally, some tools, tips and techniques are presented and complemented with various success stories to validate the tips (which, I must say, I know to be true)

Room for improvement

The main caveat of this book is that it isn't available in other languages. I also noticed that despite presenting much useful advice, it lacks diagrams to help re-read the tips and to act as a developer's career manual.

For non-casual readers, the writing style could be interpreted as too informal or not very literary. As is, the book is written as an informal conversation between peers and it's a little bit repetitive while trying to emphasize some important points. It depends on the reader's background.

Who should read this book?

  • Any IT professional, especially software developers
  • IT recruiters, as it gives serious advice on how the software development world works

To finish the review, I think that the following excerpt captures the spirit of the book:

People can be givers or takers. The givers come in and bring things others can benefit from while the takers take more than they give and ultimately drag the network down by trying to benefit themselves. That’s why givers tend to grow more than takers or matchers.

Notes on Java EE support for NetBeans 9

30 July 2018

Today one of my favourite open source projects got a major release, now under the Apache Foundation. Welcome back, NetBeans!

In this line, I think that the most frequent question since the beta release is:

What about Java EE/C++/PHP/JavaME . . .? You name it

Quick response:

The first source code donation to Apache includes only the base NetBeans platform modules plus Java SE support

Long response:

Please see the Apache Foundation official statement.

Does it mean that I won't be able to develop my Java EE application on NetBeans 9?

Short answer: No

Long answer: Oracle has already made a second donation that includes most of the NetBeans modules considered external; as the Apache statement suggests, we could expect these modules in future NetBeans releases.

Is it possible to enable Java EE support in NetBeans 9?

Considering that NetBeans has been modular since . . . ever, we can expect support for old modules in the new NetBeans version. As a matter of fact, this is the official approach to enable Java EE support on NetBeans 9: by using kits.

Hence I've prepared a small tutorial to achieve this. The tutorial is focused on macOS but the steps should be exactly the same for Linux and Windows. To show some caveats, I've tested two app servers over Java 8 and Java 10.

Downloading NetBeans 9.0

First, you should download the NetBeans package from the official Apache mirrors; at this time distributions are only available as .zip files.

NetBeans 9 Download

After download, just uncompress the .zip file


You should find a netbeans executable in the bin/ directory; for Unix:

cd netbeans
bin/netbeans

With this you should be able to run NetBeans 9. By default, NetBeans will run on the most up-to-date JVM available on the system.

NetBeans 9

Enabling Java EE support

To install Java EE support you should also enable the NetBeans 8.2 update center repository.

First go to Tools > Plugins > Settings.

Second, add a new update repository:

NetBeans 8.2 update center

NetBeans 8.2 update center

Third, search for new plugins with the keyword "Kit"; as the name suggests, these are plugin collections for specific purposes.

NetBeans 8.2 update center

From experience I do recommend the following plugins:

  • HTML5 Kit
  • JSF
  • SOAP Web Services
  • EJB and EAR
  • RESTful Web Services
  • Java EE Base

Restart the IDE and you're ready to develop apps with Java EE :).

Test 1: Wildfly 13

To test the NetBeans setup, I added a new application server and ran a recent Java EE 8 REST-CRUD application, from a recent jEspañol presentation (in Spanish).

You have to select WildFly Application Server

WildFly 13

As the WildFly release notes suggest, if you want Java EE 8 support you should choose standalone-ee8.xml as the domain configuration.
WildFly 13

Domain configuration will be detected by NetBeans 9

WildFly 13

The WildFly team has been working on Java 9 and 10 compatibility, hence the application ran as expected, delivering new records from the in-memory database.

WildFly 13

Test 2: Glassfish 5 and Payara 5 on Java 10 (NetBeans) and Java 8 (App server platform)

To test the vanilla experience, I tried to connect the Payara and Glassfish 5 app servers; as in the case of WildFly, configuration is pretty straightforward:

You have to select Payara Application Server
Payara 5

The domain1 default configuration should be OK
Payara 5

Since Payara and Glassfish only support Java 8 (Java 11 support is on the roadmap), you have to create a new platform with Java 8. Go to Tools -> Java Platforms and click on Add Platform
Payara 5

Select a new Java SE Platform

Payara 5

Pick the home directory for Java 8

Payara 5

Finally, go to server properties and change Java Platform
Payara 5

At this time, it seems that NetBeans should be running on Java 8 too; otherwise you won't be able to retrieve the server's configuration and logs. There is a similar report for the Eclipse plugin.

Payara 5

Test 3: Glassfish 5 and Payara 5 on Java 8 (NetBeans) and Java 8 (App server platform)

Finally, I configured NetBeans to use JDK 8 as the NetBeans JDK. For this, you should edit the etc/netbeans.conf file and point the netbeans_jdkhome variable to JDK 8; since I'm using jenv to manage JVM environments the right value is netbeans_jdkhome="/Users/tuxtor/.jenv/versions/1.8"

With this, NetBeans 9 is able to run Payara 5 and Glassfish 5 as expected:

Payara 5

I'm still not sure about TomEE, OpenLiberty, WebSphere and WebLogic, but it seems like it would be a matter of hacking a little bit on JDK versions.

Long live NetBeans and Jakarta EE!

Testing JavaEE backward and forward compatibility with Servlet, JAX-RS, Batch and Eclipse MicroProfile

18 June 2018

One of the most interesting concepts that made Java EE (and Java) appealing for the enterprise is its great backward compatibility, ensuring that years of investment in R&D could be reused in future developments.

Nevertheless, one of the least understood facts is that Java EE in the end is a set of curated APIs that can be extended and improved with additional EE-based APIs -e.g. Eclipse MicroProfile, DeltaSpike- and vendor-specific improvements -e.g. Hazelcast on Payara, Infinispan on WildFly-.

In this article I'll try to elaborate a response to a recent question from my development team:

Is it possible to implement a new artifact that uses MicroProfile API within Java EE 7? May I use this artifact also in a Java EE 8 Server?

To answer this question, I prepared a POC to demonstrate Java EE capabilities.

Is Java EE backward compatible? Is it safe to assume a clean migration from EE 7 to EE 8?

One of the most pervasive rules in IT is "if it ain't broke, don't fix it"; however, "broke" is pretty relative in regard to security, bugs and features.

Commonly, security vulnerabilities and regular bugs are patched through vendor specific updates in Java EE, retaining feature compatibility at the EE API level; hence this kind of update is considered safe and should be applied proactively.

However, once a new EE version is on the streets, each vendor publishes its product calendar, being responsible for future updates, and it's expected that any Java EE user will update their stack (or perish :) ).

In this line Java EE has a complete set of requirements and backward compatibility instructions for vendors, spec leads and contributors. This is especially important considering what we receive with every version of Java EE:

  • New APIs (like Batch in EE 7 or JSON-B in EE 8)
  • APIs that simply don't change and are included in the next EE version (like Batch in EE 8)
  • APIs with minor updates (Bean Validation in EE 8)
  • APIs with new features and interfaces (the reactive client in JAX-RS in EE 8)

According to the compatibility requirements, if your code retains and implements only EE standard code you get source-code compatibility, binary compatibility and behaviour compatibility for any application that uses a previous version of the specification; at least that's the idea.

Creating a "complex" implementation

To test this assumption I've prepared a POC that implements:

  • Servlets (updated in EE 8)
  • JAX-RS (updated in EE 8)
  • JPA (minor update in EE 8)
  • Batch (does not change in EE 8)
  • MicroProfile Config (extension)
  • DeltaSpike Data (extension)

Batch Structure

This application just loads a bunch of IMDB records from a CSV file in the background, saving the records in Derby (Payara 4) and H2 (Payara 5) using the jdbc/__default JTA datasource.
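The persistence unit behind it is just standard JPA pointing at the container's default JTA datasource; a sketch of persistence.xml (the unit name and schema-generation property are assumptions, check the GitHub project for the real ones):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence xmlns="http://xmlns.jcp.org/xml/ns/persistence" version="2.1">
    <persistence-unit name="moviePU" transaction-type="JTA">
        <jta-data-source>jdbc/__default</jta-data-source>
        <properties>
            <property name="javax.persistence.schema-generation.database.action" value="create"/>
        </properties>
    </persistence-unit>
</persistence>
```

Since jdbc/__default is preconfigured on both Payara 4 and Payara 5, the same unit maps to Derby on one server and H2 on the other without code changes.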

For reference, the complete Maven project of this POC is available at GitHub.

Part 1: File upload

The POC a) implements a multipart servlet that receives files from a plain HTML form, b) saves the file, using MicroProfile Config to retrieve the final destination URL, and c) calls a Batch job named csvJob:

@WebServlet(name = "FileUploadServlet", urlPatterns = "/upload")
@MultipartConfig
public class FileUploadServlet extends HttpServlet {

    @Inject
    @ConfigProperty(name = "file.destination", defaultValue = "/tmp/")
    private String destinationPath;

    @Inject
    private Logger logger;

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        String description = request.getParameter("description");
        Part filePart = request.getPart("file");
        String fileName = Paths.get(filePart.getSubmittedFileName()).getFileName().toString();

        //Save using buffered streams to avoid memory consumption
        try(InputStream fin = new BufferedInputStream(filePart.getInputStream());
                OutputStream fout = new BufferedOutputStream(new FileOutputStream(destinationPath.concat(fileName)))){

            byte[] buffer = new byte[1024*100];//100kb per chunk
            int lengthRead;
            while ((lengthRead = > 0) {
                fout.write(buffer, 0, lengthRead);
            }

            response.getWriter().write("File written: " + fileName);

            //Fire batch job after file upload
            JobOperator jobOperator = BatchRuntime.getJobOperator();
            Properties props = new Properties();
            props.setProperty("csvFileName", destinationPath.concat(fileName));
            response.getWriter().write("Batch job " + jobOperator.start("csvJob", props));
            logger.log(Level.WARNING, "Firing csv bulk load job - " + description);

        }catch (IOException ex){
            logger.log(Level.SEVERE, ex.toString());
            response.getWriter().write("The error");
        }
    }
}



You also need a plain HTML form:

<h1>CSV Batchee Demo</h1>
<form action="upload" method="post" enctype="multipart/form-data">
    <div class="form-group">
        <label for="description">Description</label>
        <input type="text" id="description" name="description" />
    </div>
    <div class="form-group">
        <label for="file">File</label>
        <input type="file" name="file" id="file"/>
    </div>
    <button type="submit" class="btn btn-default">Submit</button>
</form>

Part 2: Batch Job, JTA and JPA

As described in the Java EE tutorial, typical Batch jobs are composed of steps; these steps implement a three phase process involving a reader, a processor and a writer that work in chunks.

The Batch job is defined by using an XML file located at resources/META-INF/batch-jobs/csvJob.xml; the reader-processor-writer triad will be implemented through named CDI beans.

<?xml version="1.0" encoding="UTF-8"?>
<job id="csvJob" xmlns="http://xmlns.jcp.org/xml/ns/javaee" version="1.0">
    <step id="loadAndSave">
        <chunk item-count="5">
            <reader ref="movieItemReader"/>
            <processor ref="movieItemProcessor"/>
            <writer ref="movieItemWriter"/>
        </chunk>
    </step>
</job>

MovieItemReader reads the CSV file line by line and wraps each result in a Movie object for the next step. Note that the open, readItem and checkpointInfo methods are overridden to ensure that the task restarts properly if needed.

@Named
public class MovieItemReader extends AbstractItemReader {

	@Inject
	private JobContext jobContext;

	@Inject
	private Logger logger;

	private FileInputStream is;
	private BufferedReader br;
	private Long recordNumber;

	@Override
	public void open(Serializable prevCheckpointInfo) throws Exception {
		recordNumber = 1L;
		JobOperator jobOperator = BatchRuntime.getJobOperator();
		Properties jobParameters = jobOperator.getParameters(jobContext.getExecutionId());
		String resourceName = (String) jobParameters.get("csvFileName");
		is = new FileInputStream(resourceName);
		br = new BufferedReader(new InputStreamReader(is));

		if (prevCheckpointInfo != null)
			recordNumber = (Long) prevCheckpointInfo;
		for (int i = 1; i < recordNumber; i++) { // Skip until recordNumber
			br.readLine();
		}
		logger.log(Level.WARNING, "Reading started on record " + recordNumber);
	}

	@Override
	public Object readItem() throws Exception {
		String line = br.readLine();

		if (line != null) {
			String[] movieValues = line.split(",");
			Movie movie = new Movie();
			movie.setName(movieValues[0]);
			movie.setReleaseYear(movieValues[1]);
			// Now that we could successfully read, increment the record number
			recordNumber++;
			return movie;
		}
		return null;
	}

	@Override
	public Serializable checkpointInfo() throws Exception {
		return recordNumber;
	}
}

Since this is a POC, my "processing" step just converts the movie title to uppercase and pauses the thread half a second on each row:

@Named
public class MovieItemProcessor implements ItemProcessor {

  @Inject
  private JobContext jobContext;

  @Override
  public Object processItem(Object obj) throws Exception {
      Movie inputRecord = (Movie) obj;
      //"Complex processing": uppercase the title and slow down each row
      inputRecord.setName(inputRecord.getName().toUpperCase());
      Thread.sleep(500);
      return inputRecord;
  }
}

Finally, each chunk is written by MovieItemWriter using a DeltaSpike repository:

@Named
public class MovieItemWriter extends AbstractItemWriter {

    @Inject
    MovieRepository movieService;

    @Inject
    Logger logger;

    @Override
    public void writeItems(List list) throws Exception {
        for (Object obj : list) {
            logger.log(Level.INFO, "Writing " + obj);
            movieService.save((Movie) obj);
        }
    }
}

For reference, this is the Movie entity:

@Entity
public class Movie implements Serializable {

	private static final long serialVersionUID = 1L;

	@Id
	@GeneratedValue(strategy = GenerationType.AUTO)
	private int id;
	private String name;
	private String releaseYear;

	@Override
	public String toString() {
		return "Movie [name=" + name + ", releaseYear=" + releaseYear + "]";
	}

	//Getters and setters
}

The default datasource is configured in resources/META-INF/persistence.xml; note that I'm using a JTA data source:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<persistence version="2.1" xmlns="http://xmlns.jcp.org/xml/ns/persistence">

    <persistence-unit name="batchee-persistence-unit" transaction-type="JTA">
        <description>BatchEE Persistence Unit</description>
        <properties>
            <property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
            <property name="javax.persistence.schema-generation.scripts.action" value="drop-and-create"/>
            <property name="javax.persistence.schema-generation.scripts.create-target" value="sampleCreate.ddl"/>
            <property name="javax.persistence.schema-generation.scripts.drop-target" value="sampleDrop.ddl"/>
        </properties>
    </persistence-unit>
</persistence>

To test JSON marshalling through JAX-RS, I also implemented a Movie endpoint with a GET method. The repository (AKA DAO) is defined using DeltaSpike.

@Path("movies") // resource path assumed for this excerpt
@Produces({ "application/xml", "application/json" })
@Consumes({ "application/xml", "application/json" })
public class MovieEndpoint {

	@Inject
	MovieRepository movieService;

	@GET
	public List<Movie> listAll(@QueryParam("start") final Integer startPosition,
			@QueryParam("max") final Integer maxResult) {
		final List<Movie> movies = movieService.findAll();
		return movies;
	}
}


The repository

@Repository(forEntity = Movie.class)
public abstract class MovieRepository extends AbstractEntityRepository<Movie, Long> {

    @Inject
    public EntityManager em;
}

Test 1: Java EE 7 server with Java EE 7 pom

Since the objective is to test real backward (lower EE level than the server) and forward (MicroProfile and DeltaSpike extensions) compatibility, I first built and deployed this project with the following dependencies in pom.xml. The EE 7 pom vs EE 7 server test is executed only to verify that the project works properly:
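For a Java EE 7 project of this kind, the dependency block boils down to the provided umbrella API (shown here as a sketch using the classic javax coordinates; the DeltaSpike and MicroProfile artifacts come on top of it):

```xml
<dependency>
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
    <version>7.0</version>
    <scope>provided</scope>
</dependency>
```

The provided scope matters: Payara already ships the implementations, so the API jar is only needed at compile time.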


As expected, the application loads the data properly, here two screenshots taken during batch Job Execution:

Payara 4 Demo 1

Payara 4 Demo 2

Test 2: Java EE 8 server with Java EE 7 pom

To test real binary compatibility, the application was deployed without changes on Payara 5 (Java EE 8); this Payara release also replaces Apache Derby with the H2 database.

As expected, and according to the Java EE compatibility guidelines, the application works flawlessly.

Payara 5 Demo 1

Payara 5 Demo 2

To verify assumptions, this is a query launched through SQuirrel SQL:
SQuirrel SQL Demo

Test 3: Java EE 8 server with Java EE 8 pom

Finally, to enable the new EE APIs, a little bit of tweaking is needed in pom.xml, specifically the Java EE dependency.
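The tweak amounts to bumping the umbrella API to EE 8 (again a sketch with the javax coordinates; verify the exact version available on Maven Central):

```xml
<dependency>
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
    <version>8.0</version>
    <scope>provided</scope>
</dependency>
```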


Again, the application just works:

Payara 5 Java EE 8

This is why standards matter :).

Getting started with Java EE 8, Payara 5 and Eclipse Oxygen

04 June 2018

A few days ago I had the opportunity/obligation to set up a brand new Linux (Gentoo) development box, hence to make it "enjoyable" I prepared a back-to-basics tutorial on how to set up a working environment.


In order to set up a complete Java EE development box, you need at least:

1- Working JDK installation and environment
2- IDE/text editor
3- Standalone application server if your focus is "monolithic"

Due to personal preference, I chose:

1- OpenJDK on Gentoo Linux (Icedtea bin build)
2- Eclipse for Java EE developers
3- Payara 5

Installing OpenJDK

Since this is a distribution-dependent step, you can follow tutorials for Ubuntu, CentOS, Debian and many other distributions if you need to. At this time most application servers target Java 8 due to the new Java LTS version scheme, as is the case with Payara.

For Gentoo Linux you can get a new OpenJDK setup by installing dev-java/icedtea for the source code version or dev-java/icedtea-bin for the precompiled version.

emerge dev-java/icedtea-bin

Is OpenJDK a good choice for my needs?

Currently Oracle plans to open up all enterprise-commercial JDK features. In the near future the differences between OracleJDK and OpenJDK should be zero.

In this line, Red Hat and other big players have been offering OpenJDK as the standard JDK in Linux distributions, working flawlessly for many enterprise grade applications.

Eclipse for Java EE Developers

After a complete revamp of the website's GUI, you can go directly to the Eclipse website and download the Eclipse IDE.

Eclipse offers collections of plugins denominated Packages; each package is a collection of common plugins aimed at a particular development need. Hence, to simplify the process, you can directly download Eclipse IDE for Java EE Developers.

On Linux you will download a .tar.gz file, which you should uncompress into your preferred directory.

tar xzvf eclipse-jee-oxygen-3a-linux-gtk-x86_64.tar.gz

Finally, you can execute the IDE by entering the eclipse directory and launching the eclipse binary.

cd eclipse
./eclipse

The result should be a brand new Eclipse IDE.


You can grab a fresh copy of Payara by visiting Payara's website.

From Payara's website you will get a zip file that, again, you should uncompress into your preferred directory.


Finally, you can add Payara's bin directory to the PATH variable in order to use the asadmin command from any CLI. You can achieve this via the ~/.bashrc file. For example, if you installed Payara at ~/opt/, the complete instruction is:

echo "PATH=$PATH:~/opt/payara5/bin" >> ~/.bashrc

Integration between Eclipse and Payara

After unzipping Payara you are ready to integrate the app server in your Eclipse IDE.

Recently, due to the Java/Jakarta EE transition, the Payara team has prepared a new integration plugin compatible with Payara 5. In the past you could also use GlassFish Developer Tools with Payara, but this is not possible anymore.

To install it, simply drag the following button into your Eclipse window and follow the wizard steps.


In the final step you will be asked to restart Eclipse; after that you still need to add the application server. Go to the Servers tab and click create a new server:

Select Payara application server:

Find Payara's install location and JDK location (corresponding to ~/opt/payara5 and /opt/icedtea-bin on my system):

Configure Payara's domain, user and password.

In the end, you will have Payara server available for deployment:

Test the demo environment

It's time to give it a try. We can start a new application with a Java EE 8 archetype; one of my favorites is Adam Bien's javaee8-essentials-archetype, which provides an opinionated essentials setup.

First, create a new project and select a new Maven Project:

In Maven's window you can search by name for any archetype in Maven Central; however, you should wait a little bit for the synchronization between Eclipse and Maven.

If waiting is not your thing, you can also add the archetype directly:
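In the archetype dialog you enter the coordinates by hand; for Adam Bien's archetype they are roughly the following (a sketch from memory, pick the latest version from Maven Central):

```
Archetype Group Id:    com.airhacks
Archetype Artifact Id: javaee8-essentials-archetype
Archetype Version:     (latest on Maven Central)
```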

This archetype also creates a new JAX-RS application and endpoint; after some minor modifications just deploy it to Payara 5 and see the results:

Fixing missing data on Jasper Reports with community Linux distros

14 March 2018

. . . or why my report is half empty in Centos 7.

One of the most common and least glamorous tasks in any enterprise software project is producing reports. However, after many years, today I got my first "serious bug" in Jasper Reports.

The problem

My development team is composed of a mix of Ubuntu and macOS workstations, hence we could consider that we use user-friendly environments. Among many applications, we have in maintenance mode a not-so-small accounting module which produces a considerable amount of reports. This application runs (most of the time) on OpenShift (Red Hat) platforms or on-premise (also Red Hat).

A recent deployment was carried out on a headless (pure CLI) CentOS 7 fresh install, and after deploying the application on the app server, all reports presented the following issue:

Good report, Red Hat, Mac Os, Ubuntu

Bad report, Centos 7

At first sight both reports work and look equal; however, in the CentOS 7 version all quantities disappeared, and the only "meaningful" log message related to fonts was:

[org.springframework.beans.factory.xml.XmlBeanDefinitionReader] (default task-1) Loading XML bean definitions from URL [vfs:/content/erp-ear.ear/core-web.war/WEB-INF/lib/DynamicJasper-core-fonts-1.0.jar/fonts

Fonts in Java

After many unsuccessful tests like deploying a new application server version, I learnt a little bit about fonts in the Java virtual machine and Jasper Reports.

According to the official Oracle documentation, you basically have four rules while using fonts in Java:

  1. Java supports only TTF and PostScript Type 1 fonts
  2. The only font family included in the JDK is Lucida
  3. For anything but Lucida, the JDK depends on operating system fonts
  4. Java applications will fall back to the default sans/serif font if the required font is not present on the system
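Rule 3 is easy to verify on a suspect box: ask the JVM which font families it can actually see. If the family your report uses (Verdana in my case) is not in the list, Java silently substitutes it. A minimal sketch:

```java
import java.awt.GraphicsEnvironment;

public class ListFonts {
    public static void main(String[] args) {
        // Font families visible to this JVM on this operating system
        String[] families = GraphicsEnvironment.getLocalGraphicsEnvironment()
                .getAvailableFontFamilyNames();
        for (String family : families) {
            System.out.println(family);
        }
    }
}
```

Run it with the same JVM and user as the application server; a bare headless install typically reports little more than the logical families until the real fonts are installed.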

And if you are a Jaspersoft Studio user, it makes it easy for you to pick Microsoft's TrueType fonts.

My CentOS 7 solution

Of course Jasper Reports has support for embedding fonts; however, the report's font was not Lato, Roboto or Sage, it was the omnipresent Verdana, part of Microsoft's "Core fonts for the web", not included in most Unix variants due to license restrictions.

Let's assume that nowadays MS Core Fonts are a gray area and you can actually install them by using repackaged versions, like mscorefonts2 at SourceForge.

In CentOS it is as easy as 1) installing the dependencies,

yum install curl cabextract xorg-x11-font-utils fontconfig

2) downloading the repackaged fonts


3) and installing the already available rpm file

yum install msttcore-fonts-2.1-1.noarch.rpm

With this, all reports were fixed. I really hope that you were able to find this post before trying anything else.

2017 My Java Community Year

30 December 2017

Yup, it's that time of the year when everybody writes about their yearly adventures.

This year was one of the best for my wanderlust spirit, basically due to opportunities that I received as a member of the global Java community, hence I wanna do a little recap of my Java year in my academic and professional journey. So far Java has been the strongest community I've ever participated in; this is probably the year when I made the most Java-related friends.

As always the Java community is awesome, and I wanna start this post by saying thank you to all the people that made this possible. I really hope that my presentations, tutorials and workshops were at least at the level of your expectations; I'm always trying to improve the quality, and any feedback is more than welcome.

Let's start with the highlights recap . . .


January was an interesting month: my first MOOC collaboration -Java Fundamentals for Android Development- was published in partnership with Galileo University. Hence I became an official edX instructor, with edX profile and everything.

I really think that the course quality could be improved; however, feedback was better than I expected. I hope to collaborate on another MOOC in the near future; the simple act of preparing a course in a foreign language for an edX MicroMasters was a true learning experience that I wanna repeat.

Finally, the MOOC will run again this year, motivated by the good reception . . . I guess/hope :).


In February I discovered my second favorite Java conference. After being accepted I traveled to Atlanta to speak at DevNexus.

With the same quality of speakers as Java One but in a more "friendly" atmosphere, DevNexus has a good balance between speakers, price and quality. I had the opportunity to give a talk on functional programming and functional Java libraries.

I did an interview with NightHacking :).

I also met Coca Cola bear at World of Coca Cola, life achievement unlocked.

Finally I'll be speaking again in 2018, hope to see you there.


Not a conference, but my company -Nabenik- became a Payara reseller and we helped one unit of the "Ministerio Publico" (aka the Guatemalan FBI) in the migration process from GlassFish to Payara.

This was a combination of training, analysis and, of course, partnership. Cheers to everyone involved in the process :).

I also repeated my DevNexus presentation for my Peru JUG friends (in Spanish).


After being invited by Jorge Vargas, I joined JEspañol as a speaker and collaborator. In short, JEspañol aims to work as the integration point for the Java community in Spanish-speaking countries, and also as the first Spanish virtual JUG.

This year we created a two-leg conference, the first leg being the "Primera conferencia virtual JEspañol", with speakers from Guatemala, Mexico, Peru, Colombia and Panama.


In collaboration with another company we trained one unit of the Nicaraguan tax administration; my collaboration focused on secure software development for Java EE and Android.

A pretty good experience since this was my third visit to Nicaragua.


Taking advantage of the Nicaraguan trip, I did a joint presentation with the Managua Google Developer Group, talking about the Java community, the Duke's Choice Award, consultancy and many Java-related technologies. Sharing pizza, beer and, of course, code.

I also had the opportunity to say hi to old friends from free software community, remembering the good old times of unix hacktivism.


Busy month, not too much hacktivism :).


The second leg of the JEspañol conference took place under the name "Java Cloud Day Mexico". It was one of my favorite trips of the year since it was held at the historic "Universidad Nacional Autonoma de Mexico".

With 15 speakers, this was my biggest Spanish-speaking conference. In the near future we intend to take the JEspañol conference to other LATAM countries.

On this occasion I did a presentation on functional microservices with Payara Micro.

Finally we closed this event with peers from Oracle Developers LA.

I also got a ticket for Wacken Open Air, not Java but still amazing :).


I did a joint presentation with Edson Yanaga at Java One 2017; with a full room (mostly Yanaga's merit, I must say) we discussed the issues of distributed data with microservices and some technologies to attain better data distribution, microservices patterns and some tips and tricks.

Not my best presentation, but anyways feedback is appreciated.

JEspañol got a Duke's Choice Award :).


I received an invitation from the Peruvian Universidad Privada Antenor Orrego to participate in their engineering congress . . . yup, in Latam software development is considered engineering.

This was my first time in South America since 2014, so the excitement about the trip was pretty obvious. I had the opportunity to do three presentations:

  • Getting started with Java community and development career
  • The state of the art with JavaEE
  • Creating functional microservices with Payara Micro

I also did a little trip to the Chan Chan ruins, a UNESCO-protected site; the Mochicas were a non-Inca culture, later dominated by the Incas, that I didn't know about.

On the trip's last day I had a long layover in Lima, so I met with Jose Diaz, one of the PeruJUG leaders, to do a joint presentation at the "Universidad Nacional Mayor de San Marcos", the oldest university in Latin America, presenting at their systems engineering congress.

We did a panel about microservices on JavaEE vs Spring Boot.


After six years I returned to my alma mater to do a presentation on "Getting started with Java 8 and Java EE" for the Guatemalan systems and computer science students congress. I felt like I was at home.

I also took the opportunity to officially launch the partnership with CertificaTIC to introduce the Oracle Workforce Development program to Guatemala. The program aims to train the new generation of Java architects in Guatemala with official material, at a fraction of the cost.

Tip: We'll start classes in January, do not miss the opportunity :).

Finally, I was an organizer, committee member, waterboy, etc. at Java Day Guatemala 2017; surprisingly, this has been the best edition in attendance. Being a conference organizer is a huge amount of work; I know there is huge room to make this conference better and I'm working on it.


December is probably the busiest month for any software contractor, since it's the month when everyone goes on vacation and you have wider "maintenance windows" to update, replace and implement software systems.

However, I took a day to do a presentation with MitoCode, one of the most famous Java YouTubers in Latin America. My presentation focused on the state of the art in Java EE and MicroProfile.

I also got the opportunity to write a Java EE article for my alma mater's software magazine; the pre-print is available on my Spanish blog.

Again thanks to God, friends, family and everyone who made this possible.

[Quicktip] How to install Oracle Java 9 on Fedora 26

28 September 2017

Yey! It's time to celebrate the general availability of Java 9.

Since I'm the guy in charge of breaking things before everyone else in the company, I've been experimenting with JDK 9 and Fedora 26 Workstation, hence this quick guide on installing Oracle JDK 9 the "Fedora way".

Getting the JDK

As always, when a JDK reaches general availability you can download the Java Development Kit from Oracle's website.


Conveniently, Oracle offers a .rpm package that is supposed to work with any "Hat" distribution. This guide is focused on that installer.

Installing the new JDK

After downloading the rpm, you can install it like any other rpm. At the time of writing this tutorial, the rpm didn't require any other dependency (or any dependency not available in Fedora):

sudo rpm -ivh jdk-9_linux-x64_bin.rpm

This command was executed as super user.

Configuring the JDK

"Hat" distributions come with a handy tool called alternatives, as the name suggests it handles the alternatives for the system, in this case the default JVM and compiler.

First, set the alternative for the java command

sudo alternatives --config java

It will list the "Red Hat packaged" JVMs installed on the system; for instance, this is the output on my system (Oracle JDK 8, Oracle JDK 9, OpenJDK 8):

Later, you should also pick a compiler alternative

sudo alternatives --config javac

Configuring JShell REPL

JShell is one of the coolest features in Java 9, being the first official REPL included with the JDK. However, since it's the first time the binary is available on the system, it cannot be selected as an alternative unless you create the alternative manually.

First, locate the JDK install directory. Oracle JDK is regularly located at /usr/java, being /usr/java/jdk-9 on my system.

As with any other JVM binary, JShell is located in the bin directory; hence, to create an alternative (and consequently be prepared for other Java 9 options . . . and include the executable in the path):

sudo alternatives --install /usr/bin/jshell jshell /usr/java/jdk-9/bin/jshell 1

From now on you can use jshell in any regular shell; just see my first Java 9 hello world, it looks beautiful :-).

Java Cloud Day México 2017 trip report

04 September 2017

I'm not accustomed to doing "trip reports" for conferences, but I think this conference deserves one.

Java Cloud Day 2017 was the result of a dream of the JEspañol community, one simple to phrase but very difficult to achieve: "unite Java leaders from Spanish-speaking countries in Latin America", a dream that started three years ago. Although I'm not one of the "original members", last year I joined the effort after some pub discussions at Java One.

Honor to whom honor is owed: one of the key sponsors who made this possible was CertificaTIC, a Mexican Java/Oracle certification services provider who believed in the project and put a lot of resources into the event. Adrian, Mirza and the whole team . . . you rock!

The previous virtual conference

The event as a whole was carried out as two conferences, the first leg being the "conferencia virtual JEspañol", with VirtualJUG-like talks but in Spanish, with speakers from Guatemala, Mexico, Panama, Peru and Colombia, the first of its kind.

Java Cloud Day Mexico 2017

And the day finally came . . .

Celebrated in conjunction with the 40th anniversary of the computer science program in the mythical Universidad Nacional Autonoma de México (UNAM), the event was a huge success.

With talks from Java Champions, Oracle Developer Champions, Oracle ACEs, JUG leaders, Oracle Technology Network and "Javatars", this is the biggest Spanish-speaking IT event I've ever attended, raising the bar for all of us for our next events.

Kudos, bitcoins, stamina, Japanese ki and whatever is valuable to all participants and speakers. Events like these are my favorite since (taking a famous phrase from DevNexus) this was an event from developers, by developers, for developers.

The pictures

Some selected pictures from the event, dinner, and Teotihuacan.

My presentation

I presented some tips and tricks for the creation of microservices with the Payara application server; the presentation starts at minute ~45.

See you in the next conference . . . or tour, who knows :)?

Payara 4.1 install guide on CentOS 7

25 August 2017

In this "back to basics tutorial" I'll try to explain how to install properly Payara 4.1 on Centos 7 (it should be fine for Red Hat, Oracle Unbreakable Linux and other *hat distributions).

Why not Docker, Ansible, Chef, Puppet, . . .? Sometimes the best solution is the easiest :).


The only software requirement to run Payara is a working JDK installation. CentOS offers a custom OpenJDK build called "headless" that doesn't include audio and video support, perfect for CLI environments.

You can install it with Yum.

yum install java-1.8.0-openjdk-headless

Create a Payara user

As mentioned in the official GlassFish documentation, it is convenient to run your application server under a dedicated user for security and administrative reasons.

adduser payara

Although you may be tempted to create a no-login, no-shell user, Payara saves many preferences in the user's home directory, and the shell/login is actually needed to execute administrative commands like asadmin.

Download and unzip Payara

Payara is hosted at Amazon S3; please double check this link on Payara's website. For this guide I'm installing Payara in the server's /opt directory.

cd /opt

You should execute the above commands as super user. After that, you should change permissions on the Payara directory before any domain start, otherwise you won't be able to use the server with the payara user:

chown -R payara:payara payara41

systemd unit

CentOS 7 uses systemd as its init system; consequently it is possible, and actually quite easy, to create a systemd unit to start, stop and restart Payara's default domain.
First, create a file that represents the Payara systemd unit, for instance /etc/systemd/system/payara.service.


And add the following content:

[Unit]
Description = Payara Server v4.1
After = syslog.target network.target

[Service]
User = payara
ExecStart = /usr/bin/java -jar /opt/payara41/glassfish/lib/client/appserver-cli.jar start-domain
ExecStop = /usr/bin/java -jar /opt/payara41/glassfish/lib/client/appserver-cli.jar stop-domain
ExecReload = /usr/bin/java -jar /opt/payara41/glassfish/lib/client/appserver-cli.jar restart-domain
Type = forking

[Install]
WantedBy = multi-user.target

Note that Payara administration is achieved with the payara user. You can personalize the unit to fit your needs.

Optionally, you can enable the service to start with the server and/or after server reboots.

systemctl enable payara

Check that everything works properly with the standard systemd commands:

systemctl start payara
systemctl restart payara
systemctl stop payara

Payara user PATH

As Payara and GlassFish users already know, most administrative tasks in the Payara application server are achieved by using CLI commands like asadmin. Hence it is convenient to have all the tools available to our administrative payara user.

First, log in as the payara user; if you didn't assign a password to the user, you can switch to it using su as the root user.

su payara

Later, you should create or edit the .bashrc file for the payara user, the most common location being the user's home directory.

cd /home/payara
vim .bashrc

And add the following line:

export PATH=$PATH:/opt/payara41/glassfish/bin

Final result

If your setup was done properly you should obtain an environment like in the following screenshot:

Note that:

  • I'm able to use systemd commands
  • It actually restarted, since the PID changed after the systemd restart
  • asadmin works and displays properly the running domain

[Quicktip] Move current JVM on JBoss Developer Studio

28 January 2017

JBoss Developer Studio

After a routine JVM update I had the idea of getting rid of old JVMs by simply removing their install directories. After that, many of my Java tools went dark :(.

For JBoss Developer Studio, I received the following "welcome message".

At the time, I deleted the 1.8.111 version from my system, preserving only the 1.8.121 version as the "actual" version.

Fixing versions

Unlike other tools, Eclipse-based tools tend to "hardcode" the JVM location during the install process inside the eclipse.ini file. However, as I stated in a previous entry, JBoss Developer Studio renames this file to jbdevstudio.ini, hence you should fix the JVM location in that file.

Finding the file in Mac OS(X?)

If you installed JBoss Developer Studio with the vanilla installer, the default location for the IDE would be /Applications/devstudio, and the tricky part is that the configuration is inside the "Devstudio" OSX package.

If you wanna fix this file from terminal the complete path for the file would be:


Of course you could always open packages in Finder:

After opening the file, the only fix you need is to point it to an installed virtual machine, being in my PC (hint: you can check the location by executing /usr/libexec/java_home):
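In eclipse.ini-style files the JVM is selected with a -vm entry: two lines placed before -vmargs, where the second line points to the java binary reported by /usr/libexec/java_home (the path below is illustrative, not the exact one from my machine):

```ini
-vm
/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/bin/java
```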


And your file should look like this:

Join Me at @devnexus

19 December 2016


Devnexus is probably (and according to this video) the second biggest Java conference in the United States, hence one of the top ten Java conferences in the Americas.

Organized by the Atlanta Java Users Group, you can choose between 120+ presentations, 14 tracks and 7 workshops in order to catch up with recent topics in the Java and enterprise development world, a must for any Java developer looking for knowledge.

Devnexus 2017 will be my first attendance and I'm honored to share that one of my presentations was accepted :).

Reaching the lambda heaven

In this presentation I'll give a practical introduction to functional/reactive programming in Java for the "street developer", and we'll review a list of useful libraries for making elegant and functional (as in the paradigm) web applications.

Why this and not other conferences? In my opinion:
- It's held in a cheaper city compared to California conferences
- It's a conference from developers for developers
- The Atlanta JUG guys are awesome
- Java still rocks

I hope to see you there

[Quicktip] Add Eclipse Marketplace Client to JBoss Developer Studio

11 October 2016

JBoss Developer Studio

Recently I've been in need of doing some reporting development. Although I've used iReport in the past, I'm pretty comfortable having a one-stop solution, which at this time is Red Hat JBoss Developer Studio (RHJDS).

Despite the fact that it contains a small marketplace, at this time it lacks a proper reporting solution like Jaspersoft Studio, hence I was in need of installing it over my current RHJDS install.

For those in need of installing Eclipse Marketplace plugins, you can get it in two simple steps:

Add support for Eclipse Update Sites

Create a new repository with the following sequence
Help -> Install New Software -> Add

The update URL for Eclipse Neon is http://download.eclipse.org/releases/neon
And choose a good name for the repo ("Eclipse Neon" maybe?)

Search for Eclipse Marketplace Client

And install it . . .

So far I'm not entirely sure about the potential impact of this mix of repositories; however, you'll be able to install any Eclipse Marketplace plugin, and it looks good.

See you at Java One and WITFOR 2016

08 September 2016

Java Speaker

The last quarter of the year is maybe my favorite period of time; since 2012 I have tried to attend at least one "big" conference per year, and this year won't be an exception.

For me conferences are an opportunity to meet new people, have fun, and of course travel to new/favorite places, and this year I'll have the opportunity to attend two big conferences.

From September 12 to 15 I'll be attending WITFOR 2016 as a member of the Guatemalan academic community. Since my MSc I've missed the serious academic environment (conferences, CFPs), basically because my country (Guatemala) is still building an academic system, hence I have to help (a little bit at least) at my university.

Later, jumping from one plane to another (literally), I'll travel to attend J1. As I previously stated, J1 was a "TODO in life"; I had the opportunity to attend last year and collaborate in some community activities. However, this year (among other good news) one of my proposals was accepted, so I'm getting ready to chill out with some Java community peers and share the things that we are doing in Guatemala's software industry.

See you at J1 and WITFOR 2016.

Java Day Tour 2016 Recap

14 July 2016

Java Flight

A year ago I talked about GuateJUG's Java Day Tour in this blog, one year has passed (time flies!) and it's time to do another recap of GuateJUG activities.

As a matter of fact, it wasn't a tour in the first place: it grew out of a coincidence of multiple IT conferences where GuateJUG was invited and where we promoted our yearly conference. This year, however, we formalized the concept by opening a "Call for JUG" on our website and social networks, with a clear mission and vision:

>Mission: Help universities, high schools and IT development centers with Java-centered conferences, in order to update IT knowledge in the less-developed regions of Guatemala

>Vision: Travel through Guatemala and have fun while promoting our conference and Java

Why is this important?

Guatemala is one of the Central American countries with the fewest college graduates: only 10% of the population has access to college education, and only 10% of that group (hence 1% overall) actually gets a degree; on top of that, most people choose a non-STEM degree.

Anyway, as most of you probably know, you don't actually need a degree to be a rockstar in software development. For this reason many people in the country are gaining interest in computer science to get better job opportunities, and GuateJUG is glad to be part of that change.

The tour

To finance the tour expenses we created an inverted call for papers: any group, association, university, or high school could choose among a group of speakers and a fixed set of talks, namely:

GuateJUG provided the time and experience; they provided the venue and accommodation. Contrary to what we initially expected (and, sadly, with more success than our yearly conference), the GuateJUG tour was a hit. We successfully participated in talks in the following cities:

Some random photos from the trips

With new dates in the following cities:
* Huehuetenango, Huehuetenango (sponsored by Universidad da Vinci)
* Chiquimula, Chiquimula (sponsored by Universidad de San Carlos)

So far we've promoted Java to more than 800 attendees across all regions, a huge success by Guatemalan standards.

Lessons that we've learnt

Although GuateJUG's central role was supposed to be teaching Java, as speakers we learned more than we expected from the very first trip.

Quality comments aside, most of the people who made this tour possible were happy with us, and they wish more IT communities could have an impact across all regions of the country.

You never know when an informal chit-chat at an IT conference will become a great opportunity; in our case, an informal chit-chat at a Java meeting ended up as the first independent technology tour in our country (AFAIK).

For us the Java Day Tour was an opportunity to rediscover our country, make friends in every region, and have not only IT experiences but life experiences that I'm pretty sure I wouldn't have had without a proper excuse to grab my backpack and laptop and share a little of the knowledge I've had the opportunity to acquire.

For the students, the Java Day Tour was an opportunity to ask any question about software engineering, computer science, women in STEM, Android, life outside their regions, IT careers, and of course Java, and I'm glad to have been part of that.

Long live Java!

First impressions of Zulu on OSX

11 February 2016


Although my main development box these days is an Apple computer, I'm most comfortable with Open Source software due to its technical and security benefits.

Along these lines, I've been a user of the Zulu JVM for my Windows+Java deployments, basically because Zulu offers a zero-problems distribution of OpenJDK, being at this time the only Open Source, production-ready JVM available for Windows (on Linux you also have the distros' OpenJDK builds or IcedTea).

As with my previous experiences on Windows, and considering that Zulu is essentially a supported build of OpenJDK (the basis for HotSpot, a.k.a. the Oracle JDK), using Zulu on OSX has so far been painless.

Compared to HotSpot you get an additional security "feature" otherwise only available in the Server JRE, the absence of the infamous browser plugin; however, you lose Java Mission Control, since it's a closed-source tool.

If you are interested in formal benchmarks, this guide (by the Zulu creators, BTW) could be helpful:

In a more day-to-day test, whether you are using Eclipse


Or maybe Vuze

Zulu simply works.

Why should you consider Zulu?

  • It's based on the Open Source OpenJDK project
  • You get nearly the same performance as HotSpot
  • You can bundle it with your apps (as Microsoft does in Azure)

Why avoid Zulu?

  • If you need Java Mission Control
  • If you need the Java browser plugin (pretty dead by now, BTW)

[Quicktip] Fix font size in JBoss Developer Studio and OSX

02 February 2016

JBoss Developer Studio

One of the most annoying things about using JBoss Developer Studio on OSX is the default configuration, especially the font size.

By default Eclipse (and consequently JBoss Developer Studio) ships with a startup parameter called "-Dorg.eclipse.swt.internal.carbon.smallFonts" which, as its name suggests, enforces the use of the smallest system font.

Although default settings are tolerable on retina displays:

The problem gets worse on regular 1080p screens (like external non-Mac displays):

As you can probably guess, the solution is to delete the parameter; the tricky part is that JBoss Developer Studio renames the eclipse.ini file (the default Eclipse configuration) to jbdevstudio.ini, which makes most of the how-to guides on the Internet "complicated".

Anyway the file is located at:


Being the default location:


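Since the exact path varies between installs, here is a minimal sketch of the edit itself, demonstrated on a scratch copy of the file (the real jbdevstudio.ini path inside your install is an assumption; always back it up first):

```shell
# Work on a scratch copy; point INI at your real jbdevstudio.ini
# (the path inside your install varies, so this demo builds its own).
WORK=$(mktemp -d)
INI="$WORK/jbdevstudio.ini"

# A minimal stand-in for the real launcher configuration:
printf '%s\n' '-startup' '-vmargs' \
  '-Dorg.eclipse.swt.internal.carbon.smallFonts' '-Xmx512m' > "$INI"

cp "$INI" "$INI.bak"   # keep a backup before editing
# Drop the whole smallFonts line; on the next start the IDE falls
# back to the regular system font size.
grep -v 'org.eclipse.swt.internal.carbon.smallFonts' "$INI.bak" > "$INI"
cat "$INI"
```

The same `grep -v` trick applied to your actual jbdevstudio.ini removes the flag without disturbing the rest of the launcher options.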
With this, your eyes will be grateful (actual 1080p screenshot):

20 years of Java in Java One and Java Day Guatemala

23 November 2015

Java 20

For me, 2015's hacktivism is officially closed. This year I had the opportunity to attend my first Java One, meet some of my development heroes and folks from other JUGs, get trapped in Hurricane Patricia (no kidding), and bring a huge amount of Java t-shirts back to Guatemala. I must thank the people who made this trip possible, with special acknowledgments to Nichole Scott from Oracle, my peers at Nabenik who graciously sponsored my trip, and the Guatemala Java User Group for letting me take advantage of the Java One ticket :-D.

I think every techie has a list of conferences to attend before dying; as of 2015, mine includes:

  • JavaOne (2015 finally)
  • Gentoo Miniconf
  • Forum Internacional de Software Livre - FISL (2013 as a speaker, yay!)
  • Convención de Informatica Guatemala (2005 - now dead, but it made the list for nostalgia)

Although I had some health issues due to stress, Java One and San Francisco were lifetime experiences. Like every conference I've attended it has its goods and bads, but in general you can feel a strong sense of community among attendees: from Pivotal to Red Hat (and Microsoft), from the peer who shares a beer with you and turns out to be a Java Champion to the people who speak other languages with you, Java One is about a community making awesome things in IT. Programming languages aren't eternal, but I'd argue that the differentiating factor that raised 20-year-old languages like Java and JavaScript above the rest is the community.

Random photos:

In a different scale but in the same sense of community, the Guatemala Java User Group held its yearly conference Java Day Guatemala.

I've been directly involved in the organization of two Java Days and spoken at three. As I said in our keynote, for me GuateJUG has been the most successful user group I've participated in. Characterized by pragmatism, openness and a community structure since its inception, GuateJUG became one of the strongest user groups in Central America; the integration between industry, academia, Open Source communities, Free Software communities and HR people looking for the next generation of developers is unique.

As a special occasion we held a traditional birthday celebration, including a birthday cake and Mexican piñatas. It was great to share a few words with old friends and meet new IT enthusiasts; as a matter of fact, we also sang "happy birthday Java and happy birthday GuateJUG".

Video and some more random photos:

I hope to see you the next year in Java One and Java Day :-).

[Book Review] Java EE 7 Essentials, O’Reilly

12 November 2015

JavaEE 7 Essentials Cover

About the book

Pages: 362
Publisher: O’Reilly Media
Release: Aug 2013
ISBN-10: 1-4493-7016-0
ISBN-13: 978-1-4493-7016-9

Book details

I received this book as part of the now-defunct O'Reilly user group program. When I asked for this book I was especially interested due to comments from my development peers . . . and most importantly because I was in the middle of defining a software architecture.

I'm writing this review after seven months of using it on a daily basis, basically because our development stack is composed of AngularJS on the front-end and Java EE 7 on the back-end (with a huge bias toward the Hat company). At the office we keep a small book collection (because IT books are pretty much dead after five years), and Arun's book is our preferred pick for the "Java EE 7 rescue kit".

If I had to choose two adjectives for this book, they would be "quick" and "versatile". It deserves all of its fame because it strikes a balance between a good reference book and a user-friendly introductory book, something most IT books don't achieve.

I don't want to copy the table of contents, but these are my favorite chapters:

  • Servlets
  • RESTful Web Services
  • SOAP Web Services
  • JSON Processing
  • Enterprise Java Beans
  • Contexts and Dependency Injection
  • Bean Validation
  • Java Transactions
  • Java Persistence

Most of the book samples are based on Glassfish, and it's easy to guess why by looking at the publication date. However, speaking from my true-heavy-metal-monkey-developer-architect experience, this book uses only pure Java EE 7 APIs, and I've been able to run and use the samples on Wildfly without issues.

For those looking for a good book on Java EE 7 development on any of the certified Java EE 7 servers, this is a must.

The good
  • Good balance between tutorial and reference.
  • Less content than the Java EE Tutorial, but still on point.
  • The samples should work on any Java EE 7 server.

Could be better

  • The WebSockets section is small compared to the other chapters; it feels incomplete.
  • The cover brings to mind the good old days when Glassfish was the application server everybody was talking about.

GuateJUG Tour 2015 First Leg

22 October 2015

As I described previously, GuateJUG held a conference circuit promoting its yearly conference Java Day Guatemala 2015.

This activity was motivated by Java's 20th anniversary and especially by GuateJUG's 5th anniversary.

So... how do you pull off a tour that traverses Guatemala? Easy: with the right sponsors and contacts :).

A special acknowledgment to the people who made this first leg possible:
- Jorge Cajas (Universidad de San Carlos de Guatemala)
- Dhaby Xiloj (Universidad Rafael Landivar - Quetzaltenango)
- Rene Alvarado (Universidad de San Carlos de Guatemala - Chiquimula)

In this first leg (not the only one . . . I hope) the GuateJUG team went from the center of the country to the east, and later to the west, as shown on this ugly map:

GuateJUG Tour

Covering the following topics:

First conference

Place: Universidad de San Carlos de Guatemala - Guatemala City
* Java 8: Functional programming principles (me)
* 20 years of Java (Wences Arana - DebianGt)

Second conference

Place: Universidad Rafael Landivar - Quetzaltenango
* Source code control with Git - (me)

Third conference

Place: Universidad de San Carlos de Guatemala - Chiquimula
* HTML5 applications with Java EE 7 and AngularJS - (me)
* Android 101 - (Mercedes Wyss - GuateJUG)

Random photos:

Some years ago, in my college days, I did a similar tour with the FLOSS community. The feeling of going back to the same cities and hanging out with good old friends, plus new ones, is always pleasant.

Every time anyone asks me why I do this, I have the same answer: "why not? Each time, I get to know the country a little better :)".

With this I'm in the best mood for Java Day Guatemala and, most importantly, for attending Java One 2015, where I'm willing to collaborate in some user group activities. I hope to see many of you at both conferences.

GuateJUG Tour 2015

26 September 2015

At GuateJUG we've reached our fifth year, so I must start this post with a "thanks to everybody who made this possible (developers, advocates, supporters, sponsors, friends, family, and other UGs)".

This year has been special for GuateJUG and for Java in general: in February we were very excited about being featured in Java Magazine (a small step for .gt developers, a great step for our user group), and as every inch of the web knows, 2015 marks 20 years of Java, another anniversary that makes it tempting to believe 2015 is the year of Java's rebirth.

Hence, at GuateJUG we added to our regular activities a conference circuit called GuateJUG Tour.

GuateJUG Tour

Like many of our activities, this is a somewhat spontaneous tour, so we'll be adding dates as new cities are confirmed (Guatemala is a small country, so it's possible to confirm on very short notice).

The tour started on September 26 and I'm confirmed as a speaker in three cities, so I'll be collecting my slides in this post for quick reference :).

Java 8: Más funcional que nunca

Java 8: Más funcional que nunca from Víctor Orozco

Git 101 (with Java)

Introducción a Git (Git 101) from Víctor Orozco

Aplicaciones HTML5 con AngularJS y JavaEE

Inciando con AngularJS y JavaEE 7 from Víctor Orozco


08 August 2015

Hi! If you're reading this, you're probably one of my few Twitter followers, so thanks for your visit. For those who don't know me, I want to share a little story with you :).

As an IT guy, I started my main website/blog El abismo de tux (in Spanish) in 2006 during my college years, mostly as an experiment that grew as the blog phenomenon exploded, bringing great experiences like:

  • Being featured in a nation-wide newspaper;
  • Being invited to morning talk shows due to my tech/politics rants;
  • Getting opportunities like my first job using my tech blog as a "certification" of my expertise in Unix and Java.

So . . . what's the point of starting a new website in 2015?

I consider my old blog a chronological reflection of my life experiences: my stupidity in college, my involvement in FLOSS communities, my musical taste, EVERYTHING I could talk about regarding technology, politics, society, academia, life in the countries where I've lived (Guatemala, Brazil), and in recent years a scratchpad for my day-to-day coding.

During this year at Nabenik, while struggling to improve my coding skills, I noticed that most if not all of the time I search for tech content exclusively in English. I've also noticed a collapse in my readership due to the change of focus on my old blog, because I simply killed some topics I don't care about anymore (politics, Free Software as a social movement, the country where I live). So I've decided it's time to put the "Tux Abyss" into maintenance mode.

Killing a website is harder than it looks, especially one that has been online since 2006; however, at this point in my life/career I'd like to share deeper facts and knowledge about fewer topics, especially:

  • Gentoo Linux and Unix in General
  • JVM languages (Java) and JavaScript

That's why I'm welcoming you to "The J*"; the name is inspired by three facts:

  • I write code in JVM languages (especially Java) and JavaScript on a daily basis
  • I was born in June
  • The second name of a child that means a lot to me starts with J

For this site, considering my past experiences with Wordpress and Jekyll . . . I decided to try JBake, mainly because I'm familiar with Gradle and find it easier to write posts in Markdown, and also because I fell in love with the results at Vert.x and Hawkular, to name a few.

At the moment this blog is hosted on GitHub Pages and built by Travis CI with Gradle, saving me hosting costs.

I think that starting a website in the AOL 2.0 era is some kind of digital rebellion, so please grab a seat and feel free to drop comments :).

Older posts are available in the archive.