Thursday, July 21, 2016

From Xcore/ecore to OmniGraffle

Some years ago I wrote a small tool for creating OmniGraffle UML diagrams directly from Java source code. Visualizing Java is nice, but since I often use ecore/Xcore to define my models, I wanted a tool that also nicely visualizes EMF-based models.

I have now extended my tool, j2og, to also create UML class diagrams from ecore or Xcore models. Below you see a (manually laid-out) version of an automatically generated diagram of the ecore library example.

j2og does not lay out the diagram itself, since OmniGraffle already provides some nice layout algorithms. When creating the diagram, you can tweak the output with several settings. For example, you can:

  1. show or hide attribute and operation compartments
  2. show context, optionally grayed out -- the context consists of classifiers defined in external packages
  3. show package names, omit common package prefixes, etc.
  4. and more

Note that besides OmniGraffle, you can also open the diagrams with other tools (Diagrammix, Lucidchart). See the j2og GitHub page for details. You can install the tool via the update site or the Eclipse Marketplace link.

The following image (click to enlarge) is the result of exporting a large Xcore model defining the AST of N4JS, a statically typed version of JavaScript. I exported it and applied the hierarchy layout algorithm -- no other manual tweaks were applied. Of course, this diagram is probably too large to be really usable, but it is a great start for documenting (parts of) the model. Well, in the case of an AST you would probably prefer an EBNF grammar anyway ;-)

PS: Of course you could use ecoretools to create a UML diagram. I usually need the diagrams for documentation purposes, and in that case OmniGraffle is simply so much better, since it is easier to use and the diagrams look much nicer (sorry, ecoretools).

Monday, March 14, 2016

N4JS: Emphasising the Java in JavaScript

For the last three years, I have worked at NumberFour leading a team which creates a type-safe extension of ECMAScript (aka JavaScript) called N4JS. I'm happy to tell you that yesterday, NumberFour announced that N4JS is going open source (press release, PDF). You can find the project home page at:

N4JS bridges the strengths of JavaScript and Java; the result is a typed JavaScript superset that is dynamic, flexible and type-safe. This first release mainly targets Node.js developers, enabling them to create large and maintainable projects.

The idea behind N4JS is to bring Java's type system to JavaScript and to make JavaScript truly as type-safe as Java. This is probably also the main difference to TypeScript, which focuses on easing the transition from untyped to typed JavaScript. I won't go into the details here; we have already written a longer description of the differences to TypeScript.

As a result, N4JS brings many features known from Java to ECMAScript. It is based on ECMAScript 2015 (almost all new features are supported; the missing pieces will be added soon). It therefore supports classes as in ECMAScript 2015 (or Java). It also introduces interfaces with default methods, similar to Java 8. The generics are also quite similar to Java's, including support for generic types, generic methods, and wildcards (i.e., use-site variance). We have actually ported the type inference algorithm for type variables introduced with Java 8 to N4JS. Of course, we had to adjust the type system to match the semantics of ECMAScript. N4JS therefore also supports union and intersection types as well as the "this" type (great for the builder pattern).
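
To give a flavor of these type-system features, here is a small sketch in TypeScript, which offers analogous concepts (generics, union and intersection types, and a "this" return type). N4JS has its own syntax and stricter checking rules, and all names below are made up for illustration, so please read this as an analogy rather than as N4JS code.

  // Purely illustrative TypeScript -- N4JS syntax differs in places.

  // A generic function: the type parameter T is inferred at the call site.
  function firstOrDefault<T>(items: T[], fallback: T): T {
    return items.length > 0 ? items[0] : fallback;
  }

  // A union type: the argument is either a number or a string.
  function describe(id: number | string): string {
    return typeof id === "number" ? "#" + id : id;
  }

  // An intersection type: the argument must provide both shapes.
  type Named = { name: string };
  type Aged = { age: number };
  function greet(p: Named & Aged): string {
    return p.name + " (" + p.age + ")";
  }

  // The "this" type: each setter returns the receiver's own type, so
  // chained calls on a subclass keep the subclass type (builder pattern).
  class RequestBuilder {
    protected headers: Record<string, string> = {};
    withHeader(name: string, value: string): this {
      this.headers[name] = value;
      return this;
    }
  }
  class AuthorizedRequestBuilder extends RequestBuilder {
    withToken(token: string): this {
      return this.withHeader("Authorization", "Bearer " + token);
    }
  }
  const request = new AuthorizedRequestBuilder()
    .withHeader("Accept", "application/json")
    .withToken("secret"); // still typed as AuthorizedRequestBuilder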

The most notable thing about N4JS is that it supports not only Java's nominal typing but also TypeScript's structural typing. It even supports different variants of structural typing, e.g., taking only fields into account. We have also created a short introduction to this very special feature of N4JS.
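
For readers unfamiliar with the distinction: under structural typing, a value is accepted wherever its members match the required shape, regardless of what it is declared to implement. The TypeScript sketch below illustrates only that general idea -- in TypeScript structural typing is the global default, whereas N4JS defaults to nominal typing and lets you request structural checks explicitly, and the fields-only variant has no direct TypeScript counterpart. The names are invented for the example.

  // Structural typing, illustrated in TypeScript (where it is the default).
  interface Point {
    x: number;
    y: number;
  }

  // Pixel never declares that it implements Point...
  class Pixel {
    constructor(public x: number, public y: number, public color: string) {}
  }

  function distanceFromOrigin(p: Point): number {
    return Math.hypot(p.x, p.y);
  }

  // ...but it is accepted anyway, because it has the required members.
  console.log(distanceFromOrigin(new Pixel(3, 4, "red"))); // 5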

There are many other bells and whistles in N4JS, such as dependency injection (as in JSR 330/Google Guice) and testing with annotations (as in JUnit). And this is only the beginning. With the foundation finished, we are looking forward to adding more sophisticated features, such as additional program analyses.
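
For readers who have not used JSR 330/Guice-style injection: the pattern boils down to declaring dependencies instead of constructing them, and letting an injector do the wiring. The plain TypeScript sketch below shows only that underlying idea, with the wiring written by hand; N4JS expresses it with annotations and generated injectors, and none of the names below are actual N4JS or library API.

  // Hand-written constructor injection; a DI container would perform this
  // wiring automatically, driven by annotations or bindings.
  interface Logger {
    log(message: string): void;
  }

  class ConsoleLogger implements Logger {
    log(message: string): void {
      console.log("[app] " + message);
    }
  }

  class UserService {
    // The dependency is declared here, not constructed here.
    constructor(private logger: Logger) {}

    createUser(name: string): void {
      this.logger.log("created user " + name);
    }
  }

  const service = new UserService(new ConsoleLogger());
  service.createUser("alice");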

Although we call the current release a "public alpha", N4JS is already pretty stable. We have been using N4JS internally for over a year now, and we have a rather large code base (and more than 70,000 tests!). Of course, there are a lot of known issues (we will migrate them to GitHub issues soon)...

N4JS comes with an Eclipse-based IDE. Our goal is to provide the same IDE support for N4JS that JDT provides for Java. You can either download the N4JS IDE as a standalone product or use the update site to add N4JS to your Eclipse Mars installation. We also provide an Oomph script to set up an Eclipse IDE ready for developing N4JS itself. See the GitHub pages for source code and details.

Well, this has already become a rather long post. I will write more about N4JS, its features, and new developments in the future, either here or on the N4JS developer blog. Last but not least, I want to thank the people behind N4JS. In particular, I want to thank the N4JS team at NumberFour -- you did a great job, and I'm looking forward to working on N4JS and other things with you! Representative of the many others who helped, I want to thank Sebastian Zarnekow -- you did an amazing job (readers may want to look at the grammar file to get an idea of what kind of magic he had to add to Xtext to enable all that)!

Update (14.3.2016, 19:00): The N4JS Eclipse project proposal is now public!

Sunday, October 18, 2015

j2og -- Java To OmniGraffle, Updated

Three years ago, I wrote a small plugin called "j2og" (for Java To OmniGraffle) which creates OmniGraffle drawings from your existing Java files. At that time, I used AppleScript to create the drawing. Unfortunately, this mechanism was a bit fragile, and at some point it stopped working with newer versions of OmniGraffle / Mac OS X. Instead of fixing the AppleScript problem, I rewrote the export.

The new version now exports directly to OmniGraffle files. That means OmniGraffle is no longer needed to export your files to an OmniGraffle drawing. In fact, there are now several tools that can open the exported drawings:
  • OmniGraffle, of course; only available for Mac OS X
  • Diagrammix, via "Import"; only available for Mac OS X
  • Lucidchart, not tested yet; available for Mac OS X and Windows
The overall functionality has not changed: you can export classes and interfaces with or without attributes and operations, attributes can be transformed into associations, and the context of classes can be exported as well. The screenshot shows a sample drawing exported from the j2og sources. Since j2og does not lay out the drawing (I was too lazy to write a layout algorithm), you have to lay out the diagram manually -- e.g., use OmniGraffle's auto-layout, as I did for the sample screenshot.

You can install j2og either from the Eclipse Marketplace or via the update site (see the j2og homepage on GitHub for details).

Thursday, March 19, 2015

CfP: Workshop on Methodical Development of Modeling Tools (ModTools15)

Same procedure as last year? Same procedure as every year... but this time it's Australia. That is where EDOC 2015 takes place, and that is where the following workshop will be held this year:

3rd International 
Workshop on Methodical Development of Modeling Tools (ModTools15)
at the 19th IEEE International Enterprise Computing Conference EDOC 2015

This year, EDOC takes place in Adelaide, Australia. You will find the call for papers and other information on the workshop's homepage. The submission deadline is April 15, 2015.
The workshop Methodical Development of Modeling Tools focuses on procedures and architectural principles related to the creation of software for presenting, editing, or analyzing models.
Software tools for modeling are required for scientific and practical applications of modeling methods and modeling languages in enterprise computing. To test and exemplify new modeling approaches, research prototypes of model editors are needed, as well as tools for presenting and analyzing models. To develop such modeling tools efficiently, it is desirable to methodically guide their development and to elaborate procedures that align their design with the conceptualization of new modeling languages and modeling methods.

Monday, February 17, 2014

CfP: Workshop on Methodical Development of Modeling Tools

How time flies... only recently I posted about a workshop (held at EDOC 2013), and today I can announce the 2014 version, held at EDOC 2014. It's the

2nd International 
Workshop on Methodical Development of Modeling Tools (ModTools14)
at the 18th IEEE International Enterprise Computing Conference EDOC 2014

This year, EDOC takes place in Ulm, Germany. You will find the call for papers and other information on the workshop's homepage. The submission deadline is April 1, 2014 (really, no kidding).

Update (10/4/2014): Submission deadline extended to 2014-04-28 (final extension by the main conference).

Although I'm no longer working at the university, I still think that a workshop like this is quite important, because it tries to bridge the gap between pure scientific research and real-world requirements. If you look at scientific conferences, many researchers present tools in order to evaluate their approach. From my own experience I know that you will often find dragons when you try to actually implement these tools. These dragons, once disturbed, may even threaten the whole theoretically nice approach. The workshop tries to give the brave knights fighting these dragons (and since you are reading an Eclipse-related blog, that's probably you!) a place to exchange thoughts, methods, and ideas. And, last but not least, it gives you an opportunity to publish this kind of work (the workshop proceedings are published together with the conference proceedings by IEEE).

Monday, February 11, 2013

CfP: Workshop on Methodical Development of Modeling Tools (MeDMoT'13 @ #EDOC2013)

From my experience, a lot of developers and researchers use Eclipse modeling tools -- or even implement their own modeling tools. So you may be interested in the following CfP for the

Workshop on Methodical Development of Modeling Tools (MeDMoT'13)
at the 17th IEEE International Enterprise Computing Conference EDOC 2013

Submission deadline: 2013-04-15

Having modeling tools available is a central assumption made in modeling research. Software tools for modeling are a prerequisite for any practical application of modeling methods, and research prototypes of model editors are required for developing new modeling languages and methods. It is thus desirable to efficiently guide the development of modeling tools and to seamlessly align their design with the conceptualization of modeling languages and their application.
The workshop focuses on procedures and architectural principles related to the creation of software for presenting, editing, transforming, or analyzing models. This covers special constellations related to the use of models and modeling languages, for example, the ability to automatically derive model editor functionality from formal specifications of modeling languages (meta-models, grammars, etc.). In addition, design principles and implementation options for requirements on modeling tools, such as wizard support for interactive modeling or alternative ways to display graphical models, are discussed. All kinds of models are included in this discussion, whether domain-specific or general-purpose, graphical or textual, together with their related software tooling support. The intended audience are scientists and practitioners who apply modeling techniques and model-driven procedures and develop their own modeling tools.
Submissions may document research work on newly created modeling tools, either for prototype or production purposes, as well as work on methodology and architecture of modeling tool development. This may cover model editor software in a narrow sense, as well as any approach for handling, transforming or analyzing models in a wider sense.


Possible topics for submissions are:

  • How can formal language descriptions, e.g., meta-models or language grammars, be used to partially or fully automate the creation of modeling tools?
  • How can process models of modeling activities be made the basis for innovative modeling tool development?
  • Which methodical implications come with the use of code generation approaches to create tooling support, and how do they compare to runtime interpreter solutions?
  • How can versioning conflicts be handled or avoided during parallel development of modeling languages, modeling tools, and existing model instances?
  • Which role do model editors play as end-user interfaces to control applications ("models at runtime")? To what extent do "models at runtime" blur the border between developing modeling tools and other types of applications?

Contributions to related topics are also welcome.

Read more about this at the workshop's homepage!

PS: Some readers may have had a déjà vu reading this CfP... Well, to be honest, we already organized a similar workshop at CSE 2012. Unfortunately, we received too few papers that time, which may have been caused by bad publicity, or perhaps the combination of the workshop with CSE was simply less attractive. So I'm more than happy that the workshop has now been accepted at EDOC. The conference will be held in Vancouver, BC, Canada -- so I hope this will be more attractive :-) I believe that this workshop is a great opportunity to publish certain aspects of a researcher's work that require a lot of effort but are usually of little interest to the research community: namely, the design and implementation of the modeling tools which are then used to prove new approaches.

Thursday, August 30, 2012

Modelitis, or, everything you always wanted to know about modeling

A couple of years ago I left the world of industrial software projects and joined the academic world of computer science. I had already used models, I knew quite a few things about UML, and I had even programmed a graphical UML editor using GEF -- in other words, I felt prepared to explore the universe of modeling. However, as my former boss and tutor Prof. Dr. Hans-Werner Six correctly diagnosed, I soon got infected with a terrible illness that is widespread in that area: modelitis. Actually, depending on how hard this disease hits you, you may call it meta-modelitis.

A special characteristic of this disease is that the patient himself is more or less unaware of it; actually, he or she often thinks it's great fun. So did I, and frankly, I doubt that I will ever be completely cured -- particularly since there are so many other victims of modelitis doing great and funny things. E.g., inventing one TLA (for non-modelitis-afflicted readers: three-letter acronym) after the other, such as MDA, MDD, DSL, MOF, ATL, EMF, ETL, GEF, GMF, OCL -- you name it. I was no better; I even tried certain variations, such as CAT, GEF3D, or Mitra (my own M2M transformation language).

In the end, I wrote a book about my modelitis. Its title is

"Computerunterst├╝tzte Modelltransformationen. Modellierungstheorie, Konzeption und Visualisierung im Rahmen modellgetriebener Entwicklungsverfahren"

Yeah, it's a rather long title, and you can tell from its length that it's a real hard-core academic work (aka a dissertation). Translated to English, it reads:

"Computer-assisted transformations. Modeling theory, design and visualization in the context of model-driven development"

You may have noticed that "computer-assisted transformation" is abbreviated to CAT :-) This is the abstract of my thesis (in English; the thesis itself is written in German):

In the context of model-driven development, fully automated model transformations are used to release developers from recurring and error-prone tasks, thus improving efficiency. However, these transformations cannot always be applied, or at least not without introducing new problems. By exploring semi-automated techniques, this work tries to eliminate some of these problems while preserving the positive effects. First, a modeling theory is developed for describing model-driven approaches. This theory is used to analyze a typical problem of these approaches, the so-called "semantic gap". The need to add semantics when transforming models is identified as a fundamental problem. Computer-assisted transformations are a semi-automated approach which allows manual transformation parts to be combined with automated parts. In order to define and execute this kind of transformation, a model-to-model transformation language called Mitra is designed. Due to the manual nature of the approach, the models need to be visualized and a user interface has to be provided. The basic idea of the proposed solution is the visualization of graphical models by means of two-dimensional diagrams projected onto planes in a three-dimensional scene. This permits not only the visualization of inter-model relations, but also the semi-automated transformation of model elements with computer-assisted transformations employing the common drag-and-drop interaction pattern. With GEF3D, a framework is presented that enables the combination of existing graphical editors in a three-dimensional scene. Three well-known transformation problems, among them robustness analysis, are solved with the created tools in order to evaluate the proposed concepts.

If you are suffering from modelitis as well, or if you are otherwise seriously addicted to meta-modeling, you can download the PDF from the library of the FernUniversität in Hagen. If you want to look at some results, you may have a look at GEF3D or at Mitra2. The latter is work in progress, as I'm currently migrating from Xtext 1.x to Xtext 2.x (the need to migrate from one version to another is one of the symptoms of modelitis, e.g., victims will enthusiastically tell you how they migrated their models from UML 1.x to UML 2.x ;-) ).

Since I doubt that too many of you will actually download and read the PDF, I will only repeat a snippet from my acknowledgement section:
 A big thank you to the Eclipse community! 
Without the great software, particularly in the modeling project, and without the great support (when I had problems with the great software ;-) ), I wouldn't have been able to actually implement the software realizing my ideas and thereby prove the concepts. And it wouldn't have been so much fun.

(All pictures taken from the dissertation)