Building the rich client target platform with Eclipse 4.6

Written July 2nd, 2016 by

A few days ago the new Eclipse 4.6 “Neon” was released and, as usual when a new version comes out, I started to build the target platform required to develop rich client (RCP) applications. The unhappy surprise was that the delta pack, the archive containing the plug-ins and binary launchers for all supported operating systems, is no longer available from the downloads page. A few years ago another package useful for building a target platform, the platform SDK, was removed from the downloads. That was not a big problem: with a bit of work it can be extracted from the standard Eclipse SDK package by removing the unnecessary features and plug-ins, and the delta pack can then be added to enable the multi-platform export options. Today, without the delta pack, this is no longer possible.

Read more…

Overlay Code with Parallax Propeller and GCC

Written April 7th, 2016 by

Programs that use overlay code were very popular in the era of early home computer systems, when memory was scarce and the ability to load portions of code only when necessary made it possible to write programs much larger than the available memory.

With microcontrollers we are in a similar situation: sometimes very complex programs and rather limited memory. Just think, for example, of the SD card file system or the internet access libraries, which can severely limit the memory available for the program itself, especially if they have to be used simultaneously.

The Parallax Propeller is a microcontroller with 32K of internal RAM that loads the code to run from an external EEPROM at power-up. Since only the first 32K of the EEPROM are used for the program, the remaining space of larger memories can be dedicated to data storage. Fortunately, with the GCC compiler and the standard tools it is also possible to store portions of the program code there and load them only when necessary.

Read more…

Propeller Game Console

Written February 25th, 2016 by

I’m happy to present my first hardware project, a retro-style video game console with interchangeable cartridges based on the Parallax Propeller microcontroller. For those unfamiliar with this chip, it is a 32-bit microcontroller with 32K of RAM and 8 cores running in parallel at 80MHz; program code and data are read from an external EEPROM, and it has 32 fully programmable input/output pins. The console design uses two of these chips to improve performance and allow more complex games than would be possible with just one: one chip is dedicated to graphics and video output, the other to audio output and the main game logic.

DualProp Console

Console Prototype

Technical Characteristics:

GPU:

  • Parallax Propeller P8X32A
  • 32k RAM (internal)
  • 320×240 pixels, 64 colors, VGA output

CPU:

  • Parallax Propeller P8X32A
  • 32k RAM (internal)
  • 128k RAM (external)
  • Stereo Audio
  • Supports two 8-button controllers

The chips are connected by two high-speed serial lines for bi-directional communication and two generic data lines, one of which is used by the GPU to signal the vertical video synchronization. The game code and data are stored on two EEPROMs, one for the GPU and one for the CPU, soldered on the cartridge board.

I built the first working prototype boards and, even though the SDK is not yet ready, I have successfully ported the Abbaye des Morts open source game, which uses fairly complex graphics and requires a lot of memory, a good test for the hardware.

Experimenting with GIT and Subversion

Written October 18th, 2011 by

In the past few weeks I have experimented with a new workflow based on GIT and Subversion. All our projects use Subversion as the version control system; we think that, at this time, a centralized version control system better suits our needs. However, some GIT features may be very useful to individual developers, so I started to investigate how to take the best of both worlds and integrate GIT into our Subversion-based workflow.

GIT is a distributed version control system, which means that each developer gets a full copy of the repository, so the first thing to do is to create a clone of our repository with GIT. Looking around with Google I found that this is a very simple operation:

[marco@bridge git] git svn init -s <svn.repository.url>
[marco@bridge git] git svn fetch

The first command creates a local GIT repository in the current directory and links it to the remote Subversion repository. The second command clones the Subversion repository into the local GIT repository. It may take from a few minutes to several hours, depending on the size of the remote repository, but at the end we’ll have a full copy stored in the GIT repository, with branches and tags:

[marco@bridge git] git branch -a
* master
  remotes/rel_1_00
  remotes/refactorings
  remotes/tags/pre_rel_1_00
  remotes/trunk

The master branch is the one linked to the remote Subversion trunk. Each commit made on the master branch will be sent to Subversion with:

[marco@bridge git] git svn dcommit

From time to time you also need to get the latest changes from Subversion and update your local GIT repository. Again this is a very simple operation:

[marco@bridge git] git svn rebase

Make sure you are on the master branch and that it is clean, without uncommitted changes, otherwise the rebase will not work. The git stash command is very useful in this case: it allows you to temporarily set aside the current changes without creating a new commit.
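
For example, a typical sequence to set the current changes aside, update from Subversion, and then restore them might look like this:

[marco@bridge git] git stash
[marco@bridge git] git svn rebase
[marco@bridge git] git stash pop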

Working on the master branch is no different from working directly on a Subversion working copy, except that you have one additional step (dcommit) before your changes are sent to the centralized repository. This may actually be an advantage, since you can review your commits and use nice GIT tricks like interactive rebase and amending to clean up the commit history.
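
For example, as long as the commits have not been sent to Subversion yet, you could fix up the last commit or interactively rewrite everything on master since the trunk revision (remotes/trunk is the Subversion trunk branch created by git svn, as shown in the branch listing above):

[marco@bridge git] git commit --amend
[marco@bridge git] git rebase -i remotes/trunk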

The main advantage of using GIT, however, comes with local branches, so I changed my workflow to create a local branch off master whenever I start working on a feature, commit changes as needed to keep track of progress, test the code, commit fixes, and so on. Then, when I’m satisfied, I merge the branch back to master and send the changes to Subversion with dcommit.

Merging is where GIT shows its usefulness. A simple merge would replicate all the commits from the local branch to master, and consequently to Subversion, polluting the remote history with all the local commits. A merge squash, instead, replicates all the changes from the local branch without committing them; this allows a final review of the changes, fixing any issues due to merging with more recent code, and committing everything as a single commit. The remote Subversion history will then be clean of all local commits and will show just a single changeset with all the changes done on the local branch.

The workflow then is:

  1. Create a local branch
  2. Work on the local branch committing as needed
  3. Switch back to master, update from remote Subversion and merge squash from the local branch
  4. Review code, commit and send to Subversion
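
In terms of commands, and using a hypothetical feature branch named my-feature, the steps above might look something like this:

[marco@bridge git] git checkout -b my-feature master
  ... work on my-feature, committing as needed ...
[marco@bridge git] git checkout master
[marco@bridge git] git svn rebase
[marco@bridge git] git merge --squash my-feature
[marco@bridge git] git commit
[marco@bridge git] git svn dcommit

After the dcommit the local branch can be deleted with git branch -D my-feature; the -D is needed because a squashed merge is not recorded as a merge, so GIT cannot tell that the branch has been fully merged.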

So far this GIT-Subversion workflow works fine. The development tools supporting GIT, however, don’t support the Subversion integration, so the command line tools are needed most of the time. EGit, the Eclipse GIT integration, is useful for managing the local repository, but it is still immature and doesn’t offer all the options available from the command line tools. Hopefully it will offer a more complete integration in the future.

JUnit Tests with Eclipse Databinding

Written July 14th, 2011 by

When we started to use the Eclipse Databinding framework we faced the problem of running JUnit tests against databinding-enabled classes. Databinding requires a default Realm to be defined, and all databinding code must run within this realm. The Eclipse framework automatically initializes the default realm at application startup with code that looks like the following (from Workbench.createAndRunWorkbench):

    Realm.runWithDefault(SWTObservables.getRealm(display), new Runnable() {
        public void run() {
            ULocale.setDefault(new ULocale(Platform.getNL()
                    + Platform.getNLExtensions()));
            // create the workbench instance
            Workbench workbench = new Workbench(display, advisor);
            // run the workbench event loop
            returnCode[0] = workbench.runUI();
        }
    });

 

JUnit allows us to override the method used to run all the tests defined in a class, so we can add code that initializes the default Realm just like the above. The following are the two classes we are using to run JUnit tests against databinding-enabled classes.

 

JUnit 3

We wrote a DatabindingTestCase class derived from the original TestCase and overrode the run method to wrap the test execution in the default realm; test case classes are then derived from DatabindingTestCase instead of TestCase.

import junit.framework.TestCase;
import junit.framework.TestResult;

import org.eclipse.core.databinding.observable.Realm;
import org.eclipse.jface.databinding.swt.SWTObservables;
import org.eclipse.swt.widgets.Display;

public class DatabindingTestCase extends TestCase {

    public DatabindingTestCase() {
    }

    public DatabindingTestCase(String name) {
        super(name);
    }

    @Override
    public void run(final TestResult result) {
        Display display = Display.getDefault();
        Realm.runWithDefault(SWTObservables.getRealm(display), new Runnable() {

            @Override
            public void run() {
                DatabindingTestCase.super.run(result);
            }
        });
    }

    public void testEmpty() throws Exception {
        // To keep JUnit happy
    }
}

 

JUnit 4

This is a bit more complicated because JUnit 4 uses annotations and test case classes aren’t derived from a base class. In this case we wrote a replacement test runner, DatabindingClassRunner, and overrode the run method, just like we did for JUnit 3; test case classes are then annotated with @RunWith(DatabindingClassRunner.class).

import org.eclipse.core.databinding.observable.Realm;
import org.eclipse.jface.databinding.swt.SWTObservables;
import org.eclipse.swt.widgets.Display;
import org.junit.runner.notification.RunNotifier;
import org.junit.runners.BlockJUnit4ClassRunner;
import org.junit.runners.model.InitializationError;

public class DatabindingClassRunner extends BlockJUnit4ClassRunner {

    public DatabindingClassRunner(Class<?> klass) throws InitializationError {
        super(klass);
    }

    @Override
    public void run(final RunNotifier notifier) {
        Display display = Display.getDefault();
        Realm.runWithDefault(SWTObservables.getRealm(display), new Runnable() {

            @Override
            public void run() {
                DatabindingClassRunner.super.run(notifier);
            }
        });
    }
}

 

Links:

http://wiki.eclipse.org/index.php/JFace_Data_Binding

http://www.junit.org/

 

Eclipse 3.7 Indigo released today

Written June 22nd, 2011 by

The 3.7 Indigo release of the Eclipse projects is now available. This year 62 projects are part of the annual release train, an amazing 23 more than last year’s Helios release.

Here are a few highlights of the new features found in this release:

  • Jubula provides automated functional GUI testing for Java and HTML
  • WindowBuilder, a popular GUI builder for Eclipse developers, is now open source and part of Indigo
  • EGit 1.0 provides tight integration with the Git version control system
  • Better integration with Maven, including starting Maven builds and maintaining pom files

We are very excited about this release and we are ready to develop products based on the new SDK.

More information and downloads:

www.eclipse.org

 

Welcome to our new blog

Written June 13th, 2011 by

We are pleased to welcome you to our new blog.

The purpose of this blog is to publish articles about our activities and the technologies we use to develop our products, tips and tricks to make the best use of programming languages, frameworks, working tools, and various other things.

Enjoy.