Migrate Projects to Maven/Subversion Development Environment

This post collects a number of experiences gathered while making a Swing desktop application modular using Maven and Subversion.

Install a Subversion server (for instance on Ubuntu Server, or on Windows using VisualSVN).
Set up one Subversion repository for all your projects (read more on setting up an SVN repository structure).
Install a Subversion plugin if you are using eclipse.

The steps described here should allow for synchronized development on multiple systems and with multiple IDEs. I have tested them with Mac OS X (10.6.3), Microsoft Windows 7 and Microsoft Windows Vista, using the IDEs Eclipse (Ganymede) and NetBeans (6.9 RC1).

I found eclipse more reliable than the NetBeans IDE when it comes to refactoring.

Link all your source folders to one project in eclipse

Create a new project in eclipse named 'AllSource'. Right click the project and link the source folders of all your projects.

Important: for this, the source folders in all your projects have to have different names (by default, eclipse creates them as 'src'; I always rename the source folder to the name of the project, so that it becomes easier to link the source folders of different projects).

Organize Classes in Packages

Try to normalize the classes into distinct packages. Do not use the same package name in two source folders. Try to use as few different root packages per source folder as possible, and make the root packages distinct. For instance, having a package such as de.example.figures in two source folders is not desirable; change it to distinct packages such as de.example.project1.figures and de.example.project2.figures.

Furthermore, try to aggregate classes which have similar dependencies in distinct packages.


Create Plugin Projects for the OSGi Bundles

Create Plugin Projects in eclipse and link them to Maven (see OSGi + Maven + Declarative Services + eclipse).

Do not forget to change the pom.xml file: let the OSGI-INF and META-INF folders be copied and NOT generated by Maven (this makes life easier in the long run).


If you use a Maven repository, you can also specify the distribution location in the pom.xml file. The <distributionManagement> element can be inserted directly under the project element (do not insert it under build). In my setup, the release and snapshot repositories are named "Internal Releases" and "Internal Snapshots" (see the section on uploading projects below).

Open the Mac OS X Finder or Windows Explorer and copy and paste the source files from the old source folders to the bundle source folders (src/main/java).

Right click the project in eclipse and select Refresh. It might also be a good idea to change the encoding of the project to UTF-8.

You should now be able to deploy the artifact to the Maven repository.

Source Code Versioning

Before adding the sources to version control, it is advisable to delete all old version control metadata in the source directories. If you are using Unix or Mac OS X, you can check whether there is any metadata using the command:

MacBookMX:de mx$ find . -type d -name .svn

If you want to delete the metadata, use the command:

MacBookMX:de mx$ rm -rf `find . -type d -name .svn`
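On systems whose find supports -exec … {} +, the same can be achieved without backticks in a single command; -prune keeps find from descending into directories it is about to delete (a sketch, assuming GNU or BSD find):

```shell
# delete all .svn metadata directories below the current directory
find . -type d -name .svn -prune -exec rm -rf {} +
```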


In eclipse, you can also add the project to version control and then disconnect the project, choosing to delete all .svn metadata, to achieve the same end.

Also, make sure the "target" folder of Maven is not uploaded (ignore eclipse project resources for Subversion):


You can also create a global ignore pattern for the following files (setting up a global ignore for eclipse).
However, although this makes your modules more independent from the eclipse IDE, it also complicates importing your projects into eclipse (you would need to configure all the project settings every time).
I therefore do not exclude these files.



(This must be done for all involved IDEs on all involved systems)

You can connect the project by right clicking the project and selecting Team / Share Project.

Make sure to use the layout using trunk/branches/tags.


The source files should be in the folder "trunk" (and there should be no .project/.settings/… folders)


Deploy Initial Release

There are some ways to automate the release process (see the tutorial showing how to set up Subversion for use with Maven). However, I chose to do it manually, as I have not yet found integration of the release plugin for eclipse IAM.

First, change the version numbers of your OSGi bundle and the Maven project. I use the version "0.0.1" for new bundles.

You can right-click the project, select Maven 2 / Deploy Artifact.

This should upload your project to the local Maven repository.

Now you should also create a tag in subversion. Just right click the project and select Team / Tag …

Use the version number as the tag, and leave "start working in the tag" UNselected.


Now you have a snapshot of your source code at the time of the release.


Now do not forget to change the version to a new snapshot release:

Name your OSGi bundle: 0.0.2.SNAPSHOT
Name your Maven project: 0.0.2-SNAPSHOT

You can deploy the snapshot version to the Maven repository as well.

However, when specifying dependencies between projects, I would always recommend using release versions (this allows tracing back various builds and reduces the workload of preparing a release, as releases should not contain any -SNAPSHOT references).

So it is good to have a release version available from the very beginning.

Import to other IDEs

Importing into eclipse can be accomplished via File / Import and "Project from SVN". Use the HEAD revision and let eclipse search for project settings in the directories.

The import into eclipse is so effortless that I would suggest initially downloading the projects using the eclipse IDE. The projects can then very easily be opened in the NetBeans IDE (just use Open Project).

Setting up Dependencies between Modules

When setting up dependencies, always try to use non-snapshot versions:
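For instance (group and artifact IDs are placeholders for your own modules; 0.0.1 is a release version as created above):

```xml
<dependency>
    <groupId>de.example</groupId>
    <artifactId>servicedefinition</artifactId>
    <version>0.0.1</version>
</dependency>
```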


Linking to Third Party Libraries

Many third-party libraries do not provide Maven metadata. You can upload these libraries as JARs to your own repository (e.g. Nexus or Artifactory).

These libraries then still need to be installed into the OSGi runtime. There is a Maven plugin which can automatically package the Maven dependencies into OSGi bundles. However, the generated MANIFEST.MF would only be available AFTER the Maven build, so the bundle might not run in eclipse's runtime environment (being able to run there is desirable for debugging purposes).

Another option is to include the libraries directly in the bundle you are working on. I copy the JARs into the src/main/resources folder. Then I edit the MANIFEST.MF and add the libraries in the src/main/resources folder to the classpath of the plugin (make sure that "." is also added to the classpath).


This makes the libraries available to the bundle at runtime (the JARs from the resources folder are copied to the root of the bundle, and "." adds the bundle root to its classpath).
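In the MANIFEST.MF this looks, for instance, as follows (commons-io-1.4.jar is only a placeholder for whatever JAR you copied into src/main/resources):

```text
Bundle-ClassPath: ., commons-io-1.4.jar
```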

This works fine in eclipse, but if you want to use the projects effortlessly in NetBeans, the pom.xml must still define the dependency on the library. For this, the library must of course be available as a Maven artifact (either from a public repository, or uploaded to your own repository as shown above). It might make sense to change the scope to "provided" (see the Maven documentation on dependency scopes).



Eclipse/PDE integration of the Maven Bundle Plugin
Maven Eclipse Plugin: allows generating eclipse project files and converting eclipse projects to Maven
Extensive tutorial on how to develop plugins for eclipse using maven and eclipse PDE

Installing Maven under Windows Vista

It is fairly easy to set up Maven under Windows Vista by following this excellent tutorial.

However, I had a few difficulties figuring out some steps, which are described in the following:

How do you set your environment variables under Windows Vista? Right click "Computer" and select "Properties"; there, click on "Advanced Settings".

The System Properties dialog should open. Go to the Advanced tab, where you can click Environment Variables at the bottom.


On my system, the variables were:
JAVA_HOME –> C:\Program Files\Java\jdk1.6.0_20
M2_HOME –> C:\Users\mroh004.COM\Documents\Applications\apache-maven-2.2.1

Setting up a Maven Repository using Nexus and Artifactory

Two popular alternatives are Nexus and Artifactory, and it is highly debated which is the better option. Nexus seems to have the smaller footprint on the server's memory.

Installing Nexus

Download and install Apache Tomcat (as a Windows service).

You can deploy Nexus as a WAR file.

The war file is at the bottom of the downloads page http://nexus.sonatype.org/downloads/.

Configuring Nexus

Follow some of the post-install steps, e.g. change the passwords (also for the deployment user etc.) and reindex the repositories.

I would also suggest changing the default location in which repositories are kept. On my Windows 7 machine, Nexus used a rather obscure folder under systems32/systemprofiles or similar. The default location can be changed in the file "plexus.properties", which can be found in the Apache Tomcat folder under webapps/nexus/WEB-INF.


The parameter which needs to be changed is nexus-work:
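For instance, in plexus.properties (the path is a placeholder; pick any folder with sufficient disk space):

```properties
nexus-work=D:/data/nexus
```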


I would disable anonymous access.

Set up a user "download" and assign it the role "Repo: All Repositories (Read)". Set a password for the user, e.g. "example1".


Configuration of Maven Client to Download Repositories

You will need to change the settings.xml file for your Maven clients.

(1) You need to specify the address of your repository:

<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
…
</settings>
HINT: Do not forget the activeProfiles element!
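A minimal sketch of such a settings.xml, including the activeProfiles element (server name, port and profile id are placeholders; the URL follows the default layout of a Nexus deployed under /nexus):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <profiles>
    <profile>
      <id>nexus</id>
      <repositories>
        <repository>
          <id>nexus</id>
          <url>http://yourserver:8080/nexus/content/groups/public</url>
          <releases><enabled>true</enabled></releases>
          <snapshots><enabled>true</enabled></snapshots>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>nexus</activeProfile>
  </activeProfiles>
</settings>
```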

(2) Since we have disabled anonymous access, we need to specify the authentication information for the servers. Nexus allows you to store the password in an encrypted form; here, however, we just specify the passwords in plain text.
The servers element can be inserted below or above the profiles element:
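A sketch using the "download" user created above (the server id must match the repository id used in the profile):

```xml
<servers>
  <server>
    <id>nexus</id>
    <username>download</username>
    <password>example1</password>
  </server>
</servers>
```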


NOTE: In NetBeans, the user settings file can easily be edited by right-clicking the "Project Files" node in a Maven project and selecting "create settings.xml".

You should now be able to use artifacts from your repository in your local projects.

Configure Repository for Third Party Jars

Often, a project depends on a JAR which is not available in the public Maven repositories. In that case, the most robust approach is to add this library to our Nexus Maven repository.

Nexus creates a special repository "thirdparty".

We need to add this repository to our local settings.xml as well:


And add another entry to the servers element:
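A sketch of both additions (the server name is a placeholder; the server id must match the repository id):

```xml
<!-- inside the profile's repositories element -->
<repository>
  <id>thirdparty</id>
  <url>http://yourserver:8080/nexus/content/repositories/thirdparty</url>
</repository>

<!-- inside the servers element -->
<server>
  <id>thirdparty</id>
  <username>download</username>
  <password>example1</password>
</server>
```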


Now we can log in to the Nexus web interface. Under Repositories, we find the thirdparty repository. We can go to the tab "Artifact Upload".

If you do not have a POM for the JAR, select "GAV Parameters" as the GAV Definition. You must come up with your own group, artifact and version metadata. Then you can select the JAR file that you want to upload.

Configuration of Maven Client to Upload Projects

To upload to your repository, you have to specify an additional element in your project's pom.xml: the <distributionManagement> element can be inserted directly under the project element (and NOT in the build element). In my setup, the release and snapshot repositories are named "Internal Releases" and "Internal Snapshots".
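A sketch of the complete element, assuming a Nexus installation with the default "releases" and "snapshots" repositories (the server name is a placeholder):

```xml
<distributionManagement>
  <repository>
    <id>releases</id>
    <name>Internal Releases</name>
    <url>http://yourserver:8080/nexus/content/repositories/releases</url>
  </repository>
  <snapshotRepository>
    <id>snapshots</id>
    <name>Internal Snapshots</name>
    <url>http://yourserver:8080/nexus/content/repositories/snapshots</url>
  </snapshotRepository>
</distributionManagement>
```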

You will further need to specify the authentication, as only the deployment user is allowed to deploy artifacts to the repository. For this, you will need to add new server elements to the settings.xml file:
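For instance (assuming Nexus's default deployment user; the ids must match those used in distributionManagement, and the password is whatever you set during the post-install steps):

```xml
<server>
  <id>releases</id>
  <username>deployment</username>
  <password>yourpassword</password>
</server>
<server>
  <id>snapshots</id>
  <username>deployment</username>
  <password>yourpassword</password>
</server>
```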


Now you can deploy the project by using the command "mvn deploy".

See also this blog post regarding distributing projects to a local repository.

Installing Artifactory (Incomplete)

You can also check out this screencast.

Download and install Apache Tomcat (as a Windows service).

Download Artifactory Open Source Maven repository manager

Deploy the artifactory.war to the Tomcat Application using the Tomcat manager.

Click on /artifactory and wait for the application to load

Log in with admin / password.

Go to Admin and specify the server name (e.g. "mymavenreps").

Go to security and require authentication.

Create a user „maven“ with a password of your choice.

Go to repositories.

Local and remote repositories should already be set up.



You can go back to the home tab and automatically generate settings for Maven there:


Go to your local maven repository:

MacBookMX:workflows mx$ cd ~/.m2/

Create a new file settings.xml there and paste the generated Maven settings.

For me, the generated settings file still defined the server as "localhost"; I changed it to the actual server on which I had installed the repository manager.

Further, you need to specify the user login information of the user you have created before.

How your local jar files can be uploaded to the server is described in this screencast.

Further Resources

Blog post on why to use dependency management. And another blog post on the same topic.

Tutorial Setting Up Maven Repository under Windows using Artifactory

Good overview of how to configure repositories on the client side (Codehaus)

Build and Dependency Management: Lean Maven

Modern applications depend on a great number of third-party libraries, as well as libraries developed as part of the project or elsewhere in the organization. Managing these dependencies is a daunting task. If, for instance, in the scenario depicted below, module 1 must be added to a project, modules 2 to 4 must be downloaded from their respective locations and added to the project as well.

Tools which support transitive dependencies between modules simplify this task. All that needs to be done is to specify an identifier for module 1, and the dependency management system will take care of downloading the modules and adding them to the project in which module 1 is required.


Two popular tools which support the management of transitive dependencies are Maven 2 (Porter and Ching, 2009) and Apache Ivy. Comparisons between Apache Ivy and Maven 2 are available on the Apache Ivy and Maven 2 project websites. It is often said that the documentation of Maven is rather weak, an impression I share from my work with Maven so far. The integration with eclipse is also said to be weak; however, using the Eclipse IAM plugin, I did not have many problems. Maven's integration with the NetBeans environment is strong.

It is a quite popular approach to use Maven with OSGi (Walls, 2009). Maven is also widely used and provides plugins for many technologies. However, I believe that Maven is great once you have figured out how to use it, but not so great while you are still figuring it out. I therefore follow an approach of Lean Maven: I create a simple template for a module and reuse this template for all modules. This template tries to make as little use of Maven as possible.

The template consists of

  • A src/test/java, src/main/java and target/classes folder, following the standard Maven project structure.
  • A pom.xml file declaring dependencies on other modules.
  • A META-INF/MANIFEST.MF file defining runtime dependencies.
  • An OSGI-INF folder containing XML files declaring the services the module offers.
  • The bundle plugin copies the MANIFEST.MF file rather than letting Maven 2 generate the bundle definition file.
  • A build.properties file, allowing the module to be run and debugged using the eclipse Plugin Development Environment (PDE).
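The build section of such a template pom.xml could look roughly as follows. This is a sketch only (the plugin version is omitted): it uses the maven-jar-plugin to package the hand-maintained MANIFEST.MF instead of a generated one, and a resource mapping to copy the OSGI-INF folder into the bundle:

```xml
<build>
  <resources>
    <!-- copy the hand-maintained OSGi service declarations into the JAR -->
    <resource>
      <directory>OSGI-INF</directory>
      <targetPath>OSGI-INF</targetPath>
    </resource>
  </resources>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <archive>
          <!-- use the existing MANIFEST.MF instead of generating one -->
          <manifestFile>META-INF/MANIFEST.MF</manifestFile>
        </archive>
      </configuration>
    </plugin>
  </plugins>
</build>
```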

How to create these bundles in eclipse is described as part of the java modularity tutorials (OSGi + Maven + Declarative Services + eclipse).

This template offers the following advantages:

  • It is easy to add new modules from third parties to the project, for instance by using the Maven repositories.
  • Maven can be used to deploy JAR files of the project to the local and other repositories. From there, the project itself becomes a module which is easily reusable (with all its dependencies) by other projects/modules.
  • The JAR files which are deployed to the repository can easily be installed into an OSGi container.
  • Only minimal configuration is necessary for the individual modules.
  • Configuring the OSGi bundles (import and export of packages) goes hand in hand with declaring dependencies via Maven. The whole process is supported by the eclipse IDE.
  • It is very easy to synchronize the projects between the eclipse IDE and the NetBeans IDE. To open a project which was created in eclipse with NetBeans, one simply needs to use NetBeans' standard Open Project dialog.

Using the template has the following disadvantages:

  • Often, adding a dependency from a public repository pulls in many transitive dependencies of the referenced project which are not necessary in the current project. This can be mitigated by maintaining a custom Maven repository.

This post is part of a series of posts on java modular software development.

Java Modularity Tutorials (OSGi, Declarative Services and Maven)

I provide here a number of tutorials for setting up and working with a development environment as depicted below:

Why I chose to use OSGi with Declarative Services is discussed in the post Modular Software Development OSGi, Spring DM, iPOJO and Declarative Services.
Why I chose Apache Maven is described in the post Build and Dependency Management: Lean Maven.
After a long struggle to figure out how to develop OSGi with Declarative Services in Eclipse AND NetBeans, running in Apache Felix AND Equinox, I have created a few tutorials, including many screenshots, showing possible ways to work with these technologies.
Part 1: OSGi + Maven + Declarative Services + eclipse
Part 2: OSGi + Maven + Pax Runner + Apache Felix + Equinox

Part 3: Creating Runtime Environments for OSGi Declarative Services in Eclipse

Part 4: OSGi + Maven + Declarative Services + Apache Felix SCR + NetBeans

Part 5: Implementing a Service Client for OSGi Declarative Services

Part 6: Migrate Projects to Maven/Subversion Development Environment (also see Subversion and Maven)


See also this excellent tutorial on building OSGi bundles with Maven, including integration tests, etc
Java Modularity: Jigsaw and OSGi
Enterprise Maven Stack (Commercial)

Modular Software Development with OSGi, Spring DM, iPOJO and Declarative Services

Software development is inherently different from other forms of creation. Traditional production requires a constant inflow of fresh resources, for instance energy and raw materials. Creating movies requires the combined effort of many skilled people and even writing a book cannot be accomplished without a significant investment of time and effort, as every page, every sentence has to be written anew.

Developing software is different in that reuse, in theory, is a simple recomposition of bits and bytes. If there is one open source program which can solve differential equations, and another which is a powerful word processor, creating software which can accomplish both should be fairly easy.

However, generations of developers have experienced that the theoretical reusability of software is subject to many practical barriers. Among these are different operating systems (not every program written for MS Windows can be used on Linux), different programming languages (a program written in Java can only with great difficulty be integrated with software written in C), and poor design (also known as "spaghetti code").

A first natural approach to reusability was the idea of the function. A program was seen as a number of functions which interact by calling each other to delegate tasks. Parts of a program can even be reused by other programs through their functions. However, even if programs organized by functions are more manageable than code structured by go-tos, functions leave software developers with many problems. These are, for instance, related to extensibility: how can the functionality of a function be extended without changing it or copying the code? Further, programs organized by functions face difficulties in dealing with persistent data, which is kept in arrays and other data structures. An approach often seen in procedural programs (take Pascal, for instance) is to keep data in "global variables" accessible by all the functions of the program. Soon one loses the overview of which function accesses which part of the data in which way.


As one way to deal with this problem, the concept of object-oriented programming was proposed. Objects are instances of classes, which specify a given set of functions as well as the data accessible by these functions. Ideally, the outside world should only access the data of an object through its functions, or so-called methods. Furthermore, one class can inherit from another class, which means inheriting all its methods and data definitions; this makes it easier to extend objects. Applications which want to reuse parts of other applications can reuse their objects.


However, this reuse of objects still bears a number of problems, which become apparent in the context of large applications:

  • Extending classes which define data still makes it difficult to track which method changes which data. And often, in OO, class and implementation inheritance was emphasized instead of interfaces (D’Souza and Wills, 1998).
  • Complex applications usually depend on extensive libraries (many of which are open source). These libraries consist of hundreds of classes. The application usually has to load and manage all these objects, which raises performance issues.
  • Furthermore, dependencies between objects can be manifold, and it is difficult to manage and govern all interactions between the objects.
  • OO is also said to provide too small a granularity for effective management of the code (D’Souza and Wills, 1998).
  • The usual approach is to register and load all objects of all attached libraries at the start of the application. It is difficult to add and remove objects while the application is running.
  • Furthermore, the classes which are available to the application at runtime must be specified when the application is started. Although classes like ServiceLoader allow creating instances of objects from service interfaces, they do not support dynamic loading and unloading of services. That is, all services which are going to be used by the application must be loaded (be part of the classpath) when the application starts.

We will look at two complex and popular applications which have attempted to deal with these challenges: the integrated development environments (IDEs) eclipse and NetBeans. Eclipse is widely known for its extensibility. The attentive user will notice that the eclipse IDE does not have to be restarted after a new extension has been installed. To accomplish this, eclipse uses the OSGi framework, which relies on a standard established by the OSGi Alliance. Among the members of this not-for-profit organization are leading IT companies like SAP AG, IBM Corporation, and Oracle Corporation (as of May 2010). The OSGi framework enables "the creation of highly cohesive, loosely coupled modules that can be composed into larger applications", where "each module can be individually developed, tested, deployed, updated, and managed with minimal or no impact to the other modules" (Walls, 2009, p. 15). This is usually not possible in most object-oriented environments, where, potentially, every object can relate to any other object. In OSGi, an object can only rely on objects within a particular module or on a strictly limited set of classes from other modules. Furthermore, objects can be exposed as services, which can be used by objects in other modules.


A very similar approach has been pursued by the NetBeans IDE. Instead of the OSGi framework, NetBeans uses so-called NetBeans Modules. These modules can be dynamically loaded and unloaded during the runtime of the application and also allow restricting access to the classes which they contain (Boeck, 2009). A major difference is that NetBeans modules implement a special form of file system, where file resources can be shared between modules. We see this feature as introducing a lot of complexity, which can complicate the development of large applications with many modules. Furthermore, whereas the OSGi initiative is driven by many of the major players in IT today and used in many software products, NetBeans modules are not widely used outside the NetBeans community.

Both OSGi and NetBeans Modules do not solve the problem that it is actually objects which are required by other objects, not modules. This leads to code like the following, sometimes referred to as the "check-then-act" pattern:

if the object is available then
    let the object do something

This is not safe in a multithreaded environment (which most applications nowadays run in): the object might be available while the first line is executed, but already unavailable at the second line due to changes made in another thread. Furthermore, additional code is usually needed to handle the case in which the object is not available.
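A minimal Java sketch of why this pattern fails, and of the common local-copy workaround (Logger is a made-up stand-in for any dynamic service, not an OSGi API):

```java
// Stand-in for a service that another thread may bind or unbind at any time.
interface Logger {
    void log(String message);
}

public class CheckThenAct {
    // Written by the service management framework from another thread.
    static volatile Logger service;

    // UNSAFE: "service" may become null between the check and the call.
    static void unsafeLog(String msg) {
        if (service != null) {
            service.log(msg); // may throw NullPointerException
        }
    }

    // SAFER: read the field once into a local variable, then act on the copy.
    static boolean safeLog(String msg) {
        Logger local = service;
        if (local == null) {
            return false; // service unavailable; the caller must handle this
        }
        local.log(msg);
        return true;
    }

    public static void main(String[] args) {
        System.out.println(safeLog("hello")); // no service bound yet
        service = new Logger() {
            public void log(String message) {
                System.out.println("LOG: " + message);
            }
        };
        System.out.println(safeLog("hello"));
    }
}
```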

Dynamically loading and unloading objects still raises a number of challenges. There are a number of related approaches to deal with these:

  • OSGi Services
  • Spring Dynamic Modules/ OSGi Blueprint services
  • Apache iPOJO
  • Declarative Services

Many contemporary attempts to improve the modularity of Java applications aim at turning as many objects as possible into POJOs (Plain Old Java Objects). The basic idea behind POJOs is to ensure that the classes are reusable in another context, for instance in an application which uses neither OSGi nor services. Such classes can also be reused by the NetBeans modules mentioned earlier.

POJOs also help to achieve class normalization in order to increase the cohesion between classes and minimize the coupling (Ambler and Constantine, 2000).

OSGi services rely on objects from the OSGi environment. Using these services in the objects of an application thereby makes those objects depend on the OSGi environment (they cannot be reused apart from it). Using these services directly usually requires writing a lot of code (Presentation: Bartlett, 2009).

Apache iPOJO is a framework which allows specifying reusable services. It manipulates the compiled byte code of Java classes to ensure that references to other objects are always satisfied. The necessity of an additional build step and the modification of the bytecode mean that these objects are, despite the name of the project, not always reusable in the same way as classes supported by Declarative Services and Spring DM.

Spring is a very popular dependency injection framework. Such frameworks allow setting the properties of objects declaratively, for instance by specifying property values in an XML file. A major advantage of this approach is that many objects can be preserved as POJOs. However, it leads to many objects (or beans, in Spring terms) being placed in a single namespace (Presentation: Bartlett, 2009), which is not desirable for modular applications. Spring Dynamic Modules (Spring DM) allows breaking up this single namespace into smaller-grained "application contexts", which can specify which particular objects they share (import and export) (Walls, 2009). Spring DM is part of the OSGi compendium R4.2 as the Blueprint Service. However, although it is part of the standard, there is only one framework (the Spring Framework) which implements it. Spring DM also requires adding a large number of modules to the application, which can take as much space as 2 MB (a light version is available, which only takes around 600 kB). Furthermore, Spring DM only allows one application context per OSGi module. Also, in Spring DM, objects which use services by having references injected into their fields cannot be aware of whether these services are available, as Spring DM connects the fields to a proxy which is always available (but it is possible to implement service listeners). Spring DM has the most advanced dependency injection features, such as constructor injection (Presentation: Bartlett, 2009). However, Spring DM does not load the beans lazily.

Declarative Services (DS) are part of the OSGi standard specification (Version 4.0, OSGi Compendium specification, section 112). A framework which supports this specification is called a Service Component Runtime (SCR). The SCR creates objects and exposes them as services, manages the life cycle of these objects, and manages the dependencies between them. The SCR is an independent module which supports other modules. Every declarative service is also a normal OSGi service registered with the framework. But these services are only registered with the bundle, not actually loaded; they are only loaded when some component tries to consume them. This behavior is called "lazy" loading. It can save significant startup time for applications, especially by saving the time needed by the class loader to load classes (Presentation: Bartlett, 2009). DS only requires around 200 kB of additional space.

It must be noted here that the use of the term component in the context of the above technologies differs from the traditional one. Component frameworks (Ambler and Constantine, 2000a; Ambler and Constantine, 2000b; D’Souza and Wills, 1998; Malveau and Mowbray, 2001) understand components as entities of greater granularity. As such, they are closer to the concept of modules described above.

All these frameworks have in common that they decouple the object from the service. The object is no longer responsible for managing the services; this is done by a surrounding component (or bean, etc.), as it has been found that code to manage services can be quite repetitive (and error-prone).

The following scenario shows a component which requires a certain service in order to function. This service is offered by another component. If the first component must be activated, the service management framework will arrange for both components to be activated. If there were no component capable of providing the service, the first component would not be activated. This scenario could, for instance, be implemented with Declarative Services in OSGi.


As all these frameworks work with OSGi services, they work together flawlessly. For instance, one module can use Declarative Services while another uses Spring DM; they will still be able to communicate with each other.

Further frameworks to decouple services are, for instance, Peaberry for Google Guice (http://code.google.com/p/peaberry/) or the NetBeans Lookup API.

With the ubiquity of the Internet in many areas of our everyday life, applications are faced with the requirement of reacting to a dynamic environment. Technologies such as OSGi and NetBeans Modules allow designing applications which can be changed dynamically at runtime. Managing such dynamic environments is a challenging task from an implementation point of view. Spring DM, Declarative Services and iPOJO are powerful frameworks which reduce the required work and the risks involved in dealing with appearing and disappearing services.

Implementing a Service Client for OSGi Declarative Services

This post is part of a series of posts on java modularity.

Summary: In this tutorial I show how to create a simple component using the eclipse IDE which calls the services we created in the previous parts. Again, the component is deployed as a Maven artifact and can be tested in an eclipse runtime environment and in an Apache Felix instance.

Create Plugin Project and link to Maven

Create a new Plugin Project "ServiceClient" in eclipse. Make sure to follow Maven conventions in the project settings.




Right click your ServiceClient project and select Maven 2 / Convert to Maven Project



Add a dependency to the ServiceDefinition in the Maven pom.xml file.


Also add the following dependencies (either using the dialog or directly in the source of the pom.xml file):
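The exact dependency list was shown in a screenshot; at a minimum, the bundle needs access to the Declarative Services types such as ComponentContext. One common way to obtain them (an assumption, not necessarily the artifact used in the original setup) is the OSGi compendium artifact:

```xml
<dependency>
	<groupId>org.osgi</groupId>
	<artifactId>org.osgi.compendium</artifactId>
	<version>4.2.0</version>
</dependency>
```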


Make sure your project can be built by Maven by adding the manifest plugin and the OSGI-INF resource to your pom.xml file:
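A minimal sketch of the corresponding build section, assuming the MANIFEST.MF and OSGI-INF folders are maintained by hand in the project (as recommended in this series) and should simply be copied into the jar rather than generated:

```xml
<build>
	<resources>
		<resource>
			<directory>OSGI-INF</directory>
			<targetPath>OSGI-INF</targetPath>
		</resource>
	</resources>
	<plugins>
		<plugin>
			<groupId>org.apache.maven.plugins</groupId>
			<artifactId>maven-jar-plugin</artifactId>
			<configuration>
				<archive>
					<manifestFile>META-INF/MANIFEST.MF</manifestFile>
				</archive>
			</configuration>
		</plugin>
	</plugins>
</build>
```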



Add the package with the service definition to the MANIFEST.MF file:


Also add org.osgi.service.component
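The Import-Package section of the MANIFEST.MF should then list both packages (package names taken from this tutorial series, including the serviceDefintion spelling used there):

```
Import-Package: de.mxro.osgi.serviceDefintion,
 org.osgi.service.component
```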


Create Interface and Implementation

Create an interface ServiceClient


We will not add any methods to the interface.
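Reconstructed from the description above (the original showed a screenshot), the interface is simply empty; it only serves as the service type under which the component will be registered:

```java
// Empty marker interface: the client offers no operations of its own,
// it only consumes Wisdom services.
public interface ServiceClient {
}
```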

Create a class ServiceClientImplementation:


Add the following simple implementation (note that none of these methods are required by the interface).

import java.util.Vector;

import org.osgi.service.component.ComponentContext;

import de.mxro.osgi.serviceClient.ServiceClient;
import de.mxro.osgi.serviceDefintion.Wisdom;

public class ServiceClientImplementation implements ServiceClient {

	public Vector<Wisdom> wisdom = new Vector<Wisdom>();

	protected void activate(ComponentContext ctxt) {
		System.out.println("ServiceClientImplementation activated!");
		for (Wisdom w : wisdom)
			System.out.println("Connected to wisdom: " + w.getWisdom());
	}

	protected void deactivate(ComponentContext ctxt) {
	}

	protected void bindWisdom(Wisdom wisdom) {
		// remember the service so that activate() can list all bound services
		this.wisdom.add(wisdom);
		System.out.println("Wisdom was set: ‘" + wisdom.getWisdom() + "’");
	}

	protected void unbindWisdom(Wisdom wisdom) {
		this.wisdom.remove(wisdom);
		System.out.println("Wisdom was unset: ‘" + wisdom.getWisdom() + "’");
	}
}

IMPORTANT: Please note that the activate and deactivate methods do not have to declare the ComponentContext parameter. Declaring it is actually bad practice, as it ties this class to the OSGi framework class ComponentContext. Here, however, it helps to have a look at this variable: it can, for instance, be used to get a reference to the corresponding service for this component. activate and deactivate could instead be implemented as follows:

protected void activate(Map<String, Object> config) {…}
protected void deactivate() {…}

Bind and Unbind:
When a service is dynamically replaced, first the bind method for the new service is called and then the unbind method for the old service. The resulting overlap can be handled using an AtomicReference (Presentation: Bartlett, 2009), for instance by storing the currently bound service in an AtomicReference in the bind method and clearing it conditionally in the unbind method.
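A minimal sketch of this pattern (using String in place of the Wisdom service type to keep the example self-contained; the class name is illustrative, not from the original code):

```java
import java.util.concurrent.atomic.AtomicReference;

// Sketch: keep the currently bound service in an AtomicReference so that the
// replacement order (bind of the new service before unbind of the old one)
// cannot leave the component pointing at a stale service.
public class WisdomHolder {

	private final AtomicReference<String> wisdom = new AtomicReference<String>();

	protected void bindWisdom(String newWisdom) {
		// the new service simply overwrites whatever was bound before
		wisdom.set(newWisdom);
	}

	protected void unbindWisdom(String oldWisdom) {
		// clear the reference only if it still points at the service being
		// removed; if a replacement was already bound, leave it untouched
		wisdom.compareAndSet(oldWisdom, null);
	}

	public String currentWisdom() {
		return wisdom.get();
	}
}
```

When a service is replaced, bindWisdom runs first and overwrites the reference; the subsequent unbindWisdom call for the old service then sees that the reference no longer points at it and leaves the new service in place.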

Define the Declarative Service

Add the folder OSGI-INF to your project and add a component definition:


Add the ServiceClient as provided service:

In the Services Tab, also add a referenced service by using the Wisdom interface.

Select the cardinality 1..n and the policy dynamic.

1..n means that the ServiceClient component will only be activated if at least one Wisdom service is available; the component is, however, also able to deal with more than one Wisdom service. Other options are 1..1, which requires exactly one service, and 0..n, in which case the component would be activated even if no Wisdom service is available.

The policy dynamic means that ServiceClient is able to deal with Wisdom services appearing and disappearing during its runtime. If the policy were set to static, the component would have to be reinitialized every time a Wisdom service appears or disappears. This behavior is not desirable, especially when creating the service is an expensive operation (for instance, if the service is linked to many other services).

The methods bindWisdom and unbindWisdom will be called whenever a Wisdom service becomes available or disappears.
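Put together, the component definition created by these dialogs should correspond to XML along these lines (the component name matches the one shown in the console output below; class and package names are taken from this series):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<scr:component xmlns:scr="http://www.osgi.org/xmlns/scr/v1.1.0" name="ServiceClient">
	<implementation class="de.mxro.osgi.serviceClient.ServiceClientImplementation"/>
	<service>
		<provide interface="de.mxro.osgi.serviceClient.ServiceClient"/>
	</service>
	<reference name="Wisdom"
		interface="de.mxro.osgi.serviceDefintion.Wisdom"
		cardinality="1..n" policy="dynamic"
		bind="bindWisdom" unbind="unbindWisdom"/>
</scr:component>
```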


Testing the Service Client

You can test the service by running it in eclipse. Once the OSGi container is started up, you should see the following messages:

osgi> Wisdom was set: ‘A wrong decision is better than indecision.’
ServiceClientImplementation activated!
Connected to wisdom: A wrong decision is better than indecision.

Note that first the bind method was called and then the activate method.

(If the OSGi container is not working as expected, it might help to reinitialize the container using the commands „shutdown“, „init“, „close“ and restarting it.)

We can manually load the bundle we created in NetBeans from the local Maven repository.

osgi> install file:///Users/mx/.m2/repository/de/mxro/osgi/serviceProvider2/WisdomProvider2/1.0.0/WisdomProvider2-1.0.0.jar
Bundle id is 21

osgi> start 21
Wisdom was set: ‘A good decision is based on knowledge and not on numbers.’


We can further test deactivating and reactivating our service client manually.

osgi> ls
All Components:
ID        State                        Component Name                        Located in bundle
1        Satisfied                de.mxro.osgi.serviceProvider                        de.mxro.osgi.serviceProvider(bid=12)
2        Satisfied                ServiceClient                        de.mxro.osgi.serviceClient(bid=16)
3        Satisfied                de.mxro.osgi.serviceProvider2.AncientWisdomProvider                        de.mxro.osgi.serviceProvider2.WisdomProvider2(bid=21)

osgi> disable 2
Sent request for disabling component ServiceClient

osgi> Wisdom was unset: ‘A good decision is based on knowledge and not on numbers.’
Wisdom was unset: ‘A wrong decision is better than indecision.’

osgi> enable 2
Sent request for enabling component ServiceClient

osgi> Wisdom was set: ‘A wrong decision is better than indecision.’
Wisdom was set: ‘A good decision is based on knowledge and not on numbers.’
ServiceClientImplementation activated!
Connected to wisdom: A wrong decision is better than indecision.
Connected to wisdom: A good decision is based on knowledge and not on numbers.

Note how the output of the activation method has changed due to the different services that are now available.

Now we can manually stop the bundles that provide the services (note that it would be a cleaner approach to work only with the services).

osgi> ss

Framework is launched.

id        State Bundle
0        ACTIVE org.eclipse.osgi_3.5.2.R35x_v20100126
         Fragments=2, 3
2        RESOLVED javax.transaction_1.1.1.v201002111330
3        RESOLVED org.eclipse.persistence.jpa.equinox.weaving_1.1.3.v20091002-r5404
5        ACTIVE org.eclipse.equinox.util_1.0.100.v20090520-1800
6        ACTIVE org.eclipse.equinox.ds_1.1.1.R35x_v20090806
8        ACTIVE de.mxro.osgi.serviceDefinition_1.0.0.qualifier
9        ACTIVE org.eclipse.osgi.services_3.2.0.v20090520-1800
12        ACTIVE de.mxro.osgi.serviceProvider_1.0.0.qualifier
16        ACTIVE de.mxro.osgi.serviceClient_1.0.0.qualifier
21        ACTIVE de.mxro.osgi.serviceProvider2.WisdomProvider2_1.0.0

osgi> stop 12
Wisdom was unset: ‘A wrong decision is better than indecision.’

osgi> stop 21
Wisdom was unset: ‘A good decision is based on knowledge and not on numbers.’


Note how the ServiceClient is notified of the changes through the unbind method. Since no service is available any more, the ServiceClient’s status changes to „unsatisfied“.

osgi> ls
All Components:
ID        State                        Component Name                        Located in bundle
2        Unsatisfied                ServiceClient                        de.mxro.osgi.serviceClient(bid=16)


Testing the Service in Apache Felix

Using slightly different commands, we can test the same scenario in Apache Felix.

However, we can first remove the dependency on the package org.osgi.service.component. To do so, we remove the ComponentContext ctxt parameter from the activate and deactivate methods and delete the corresponding import.

import java.util.Vector;

import de.mxro.osgi.serviceClient.ServiceClient;
import de.mxro.osgi.serviceDefintion.Wisdom;

public class ServiceClientImplementation implements ServiceClient {

	public Vector<Wisdom> wisdom = new Vector<Wisdom>();

	protected void activate() {
		System.out.println("ServiceClientImplementation activated!");
		for (Wisdom w : wisdom)
			System.out.println("Connected to wisdom: " + w.getWisdom());
	}

	protected void deactivate() {
	}

	protected void bindWisdom(Wisdom wisdom) {
		this.wisdom.add(wisdom);
		System.out.println("Wisdom was set: ‘" + wisdom.getWisdom() + "’");
	}

	protected void unbindWisdom(Wisdom wisdom) {
		this.wisdom.remove(wisdom);
		System.out.println("Wisdom was unset: ‘" + wisdom.getWisdom() + "’");
	}
}

Then we remove the package dependency in the MANIFEST.MF.


Make sure to install the projects into the local Maven repository first (by right clicking / Maven 2 / Locally install artifact).

The project can be started on Apache Felix using Pax Runner with the following command (but again note that the path to your local Maven repository will most likely be different):

MacBookMX:bin mx$ ./pax-run.sh --clean --platform=felix --profiles=ds /Users/mx/.m2/repository/de/mxro/osgi/serviceProvider2/WisdomProvider2/1.0.0/WisdomProvider2-1.0.0.jar /Users/mx/.m2/repository/de/mxro/osgi/serviceDefinition/ServiceDefinition/1.0.0/ServiceDefinition-1.0.0.jar /Users/mx/.m2/repository/de/mxro/osgi/serviceProvider/ServiceProvider/1.0.0/ServiceProvider-1.0.0.jar /Volumes/local/online/Programmierung/eclipseMacBook/MavenOSGiTest/ServiceClient/target/ServiceClient-1.0.0.jar

Welcome to Felix

-> Wisdom was set: ‘A wrong decision is better than indecision.’
Wisdom was set: ‘A good decision is based on knowledge and not on numbers.’

The commands to work with the services can be listed using the command „scr help“.