
Anthill Pro to IBM UrbanCode Build (uBuild) Migration Tool

The Anthill Pro to IBM UrbanCode Build (uBuild) Migration Tool is a Java Swing application designed to run from any network location that has access to both the Anthill Pro server and the target IBM UrbanCode Build server. The tool does not need to run on the Anthill Pro or the IBM UrbanCode Build server itself.

  • Multi-threaded: large instances can be migrated quickly, and the level of parallelization is configurable. You can queue up as many jobs as you want and the tool will work through them.
  • Migrates originating workflows, both simple and complex (e.g. ones with parallel tasks, iterations, triggers, etc.), including constructs that UCB’s API does not support.
  • The tool makes a best effort to use UCB’s templating structure and to re-use jobs and the various kinds of templates in IBM UrbanCode Build.
  • For steps that map from Anthill Pro to IBM UrbanCode Build in some intelligent way (the majority of steps), we are aiming for 100% compatibility and are very close to achieving it. Steps that do not map over (usually because IBM UrbanCode Build does not support the feature) are flagged so that the problem is easy to track down. Most workflows will run immediately once migrated.
  • Extremely easy to use: errors and warnings are written to be as informative as possible, and the migration produces a log that can be sent to us for review.

For more information email engineering@epicforce.net

Leroy Jenkins Plugin – Pre-release

The purpose of the Leroy-Jenkins plugin is to integrate Leroy’s deployment functionality and configuration management with Jenkins’ presentation, artifact storage, and access control capabilities. This allows one to create a web-based application deployment dashboard that lets software, system, and DevOps engineers work together to bring automation and consistency to deployments, using a simple XML format stored in SCM. The plugin is open source and available on GitHub.

GitHub: git@github.com:epicforce/leroy_jenkins.wiki.git
Download: https://github.com/epicforce/leroy_jenkins/wiki

Overview of Cross-Platform Build Systems Suitable for C++ Projects

As if writing source code for industrial systems in the C family of languages isn’t enough of a challenge, getting it to build properly can be an adventure too, especially if multiple platforms are targeted. More often than not, different compilers will be used on different platforms, with different invocation syntaxes for similar options, and they will often need to be given different options on different platforms (sometimes depending on the project). But that isn’t the only cross-platform difference to take into account when building. Other things to consider include differences in filesystem layout, path separators, executable file requirements (.exe|.bat|.cmd file extensions under Windows vs. the executable flag under *nix), shared library file extensions (.so vs. .dylib vs. .dll), different shell commands and usage syntaxes (rm in *nix vs. del|rd in Windows), different syntaxes for setting and accessing environment variables, and many other things. System libraries are a very common concern too: a library may or may not be available on a given platform, the available library can have different versions on different platforms, and some system API functions may be missing or may differ in parameters/semantics across platforms.
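To make a few of these differences concrete, here is a tiny Python snippet (standard library only, no project-specific assumptions) that prints some of the platform-specific details mentioned above; the exact output varies by platform:

import os
import sys
import sysconfig

print(os.sep)        # path separator: '/' on *nix, '\\' on Windows
print(os.pathsep)    # PATH entry separator: ':' on *nix, ';' on Windows
print(sys.platform)  # 'linux', 'darwin', 'win32', ...
# suffix used for compiled extension modules (ends in '.so' on Linux, '.pyd' on Windows)
print(sysconfig.get_config_var('EXT_SUFFIX'))
# portable directory removal instead of 'rm -rf' (*nix) vs. 'rd /s /q' (Windows):
# shutil.rmtree('build')   # left commented out so the snippet has no side effects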

The brute-force approach to tackling this problem is making dedicated build scripts for each supported platform. Issues with that approach include, but are not limited to, the following:

  •  Making a dedicated build script for a platform requires the project’s developers to have deep knowledge of that platform’s specifics
  •  A build script needs to be created from scratch whenever a new platform is to be supported (unless the new platform is very similar to one that is already supported, in which case a little tweaking might be enough)
  •  Such efforts are not very reusable, i.e. the platform-specific gymnastics often need to be performed all over again when the same team starts a new project
  •  Maintenance is hard and daunting: a single build feature change must be applied to every build script, differently and carefully, which usually requires the maintainer to know the build scripting specifics of every platform at hand, or to get help from others for some of them

You can find numerous projects out there that use (or previously used) this approach of platform-specific build scripts. In earlier times it was the only affordable choice, as there was no compelling alternative back then. For example, take a look at http://svn.wxwidgets.org/viewvc/wx/wxWidgets/trunk/build/ and http://expat.cvs.sourceforge.net/viewvc/expat/expat/ (CMake support was added to Expat relatively recently).

Another approach is to use one of the available cross-platform build systems, which abstract away platform differences by providing portable build specification mechanisms and handling the platform specifics appropriately. Here is a brief overview of some that are suitable for C++ projects:

  • CMake: Probably the most popular cross-platform build system for C/C++ projects, CMake is a feature-rich build tool with a permissive license (New BSD License), written in C++. Strictly speaking, CMake is not a build system but a meta build system: instead of invoking compilers, linkers and other such tools directly, it generates platform-specific build files (such as Makefiles, MSVC project files, etc.) as needed, then invokes native build tools (such as make or Visual Studio) to do the actual build. Thus it is essentially an interpreter that translates build scripts written in CMake’s language into the given platform’s native build script language and delegates the task to the native build tool. It can feel messy and intrusive, though, to some people, myself included.

http://www.cmake.org/
http://en.wikipedia.org/wiki/CMake

  • SCons: A somewhat low-level build system written entirely in Python, with user build scripts being fully Python as well. That, plus its permissive license (MIT License), means you can do pretty much anything with it, and Python being a widely known general-purpose language certainly helps there. The downside is that build scripts have to be well-formed Python code, which means more syntax clutter compared to some other build systems that use their own specific language. Users also commonly find themselves writing a good deal of Python code to get nontrivial build tasks done. A minimal SConstruct is sketched after the links below.

http://www.scons.org/
http://en.wikipedia.org/wiki/SCons
http://www.scons.org/wiki/SconsVsOtherBuildTools
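To give a feel for what an SCons build script actually looks like, here is a minimal SConstruct sketch (the hello target and hello.cpp source names are made up for illustration); since it is ordinary Python, a per-platform compiler flag tweak is just a conditional:

# SConstruct -- minimal illustrative example
import sys

env = Environment()  # Environment() is provided by SCons inside SConstruct files

# per-platform compiler flags are plain Python conditionals
if sys.platform == 'win32':
    env.Append(CCFLAGS=['/EHsc'])
else:
    env.Append(CCFLAGS=['-Wall', '-O2'])

# SCons picks the correct object and executable suffixes for the target platform
env.Program(target='hello', source=['hello.cpp'])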

  • Boost.Build v2: A sophisticated build system with many features and excellent abstraction of platform specifics, distributed under a permissive license (Boost Software License). It consists of an engine written in C named b2 (previously bjam), and a set of files written in the Boost.Jam language that comprise the meat of the build system. Built-in support for the most common build concepts and a concise syntax make common tasks achievable with delightfully small build scripts, while gaining out-of-the-box support for many popular target platforms. Implementing something that is not supported out of the box is harder than it could be, though: extensions need to be written in the cumbersome and often surprising Boost.Jam language, which is not very feature-rich and has no standard libraries of its own, and the lack of documentation makes things even harder. There is an ongoing initiative to rewrite Boost.Build in Python, which will make writing custom extensions much easier, among other benefits. Boost.Build is mostly centered around the needs of C/C++ projects, but can be used for some other languages as well, with varying degrees of out-of-the-box support.

http://boost.org/boost-build2/
http://syrcose.ispras.ru/2009/files/04_paper.pdf

* Another overview of build systems, comparing them with Bakefile: http://www.bakefile.org/wiki/ComparisonsWithOtherTools

Using AnthillPro like Ivy / Maven with a dependencies.xml file to define and fetch build-time dependencies

Who needs the complexity of a Nexus repository or Ivy when you are already using AnthillPro, which provides a simpler and more polished solution for dependency management? AnthillPro already lets you define your dependencies in the UI, but most people who are familiar with AnthillPro, even those who have worked with it for years, do not know that you can manage all your build-time dependencies through a dependencies.xml file kept in your code repository. As a software engineer, I find this the ideal solution: it allows me to control my dependencies in a simple XML file, and I do not need administrative access to AnthillPro to manage them through the user interface.

How does it work?

First, you will need to go into your originating workflow configuration as an admin and check off “Trigger Only Dependencies”.

Then you will need to create your XML file and define your dependencies. Both Anthill projects and Codestation projects are supported. For those not familiar with these two concepts, an Anthill project is something you “make”, whereas a Codestation project is something you “store”. Think of the project you are building as an Anthill project, and the libraries you use, downloaded from various places on the internet, as Codestation projects.

An example XML file would look something like this (the element and attribute names shown here are an illustrative sketch rather than the exact schema, so consult the AnthillPro/Codestation documentation for the precise format). It declares one dependency on an Anthill project and one on a Codestation project, each fetched into a lib directory:

<?xml version="1.0" encoding="UTF-8"?>
<dependencies>
  <!-- dependency on an Anthill project (something you "make"); names are illustrative -->
  <dependency type="anthill" project="common-library" status="latest">
    <artifact-set name="lib">
      <!-- place the fetched artifacts into the lib directory -->
      <directory>lib</directory>
    </artifact-set>
  </dependency>

  <!-- dependency on a Codestation project (something you "store") -->
  <dependency type="codestation" project="third-party-libs" version="latest">
    <artifact-set name="lib">
      <directory>lib</directory>
    </artifact-set>
  </dependency>
</dependencies>

Set up this file as you like, and commit it to the base path of your project in your repository.

Then, in your build job, add a shell command after you check out your code that runs:

codestation -file dependencies.xml -buildlife ${bsh:BuildLifeLookup.getCurrent().getId()} resolve

That will create the dependency associations for your buildlife; you can see them on the Dependencies tab of the buildlife. Now, when your build runs the “Get Dependency Artifacts” step, the actual files will be fetched into the folder(s) you defined in your XML.