Starting an adventure with Gradle

We’ve been getting back into gear to start development of CoSolvent Flash Player version 2, which will really be the 5th major release of our core logic, as the player was previously a more or less nameless piece of internal code.  The new version will be based on Adobe’s OSMF, which we’ve been following with interest for quite some time now.

We made a couple of decisions up front, and one of them was to import the latest release code drops from OSMF into an internal git repository; that way, if we ever need to make any patches we have the code right there, and as with any open source project it can be very useful to go through the source code to understand how best to use it.  That decision brought on the next: do we go ahead and set up to build the osmf.swc library ourselves?  After debating this, we decided that it would provide useful experience.  Alas, the OSMF SVN lacks a decent build script, seeming to rely on Flash Builder.  Luckily, the Strobe Media Playback project does have a basic set of Ant build scripts.  Now, I’ve been using Ant for quite some time, and every time I have to use it on a new project I ask myself: isn’t there anything better?  So far the answer has been “no”.  So I started working through an attempt to modularize the SMP build script.  I managed to get something that was workable, while being fairly ugly; along the way, though, I decided that the build for CFP 2.0 should be cleaner if at all possible, and went off on another search.

Of course, one of the obvious tools for any Java developer such as myself has got to be Maven, although up until now I have avoided it, having found its approach too constricting for my comfort, and the transition seems to involve an awful lot of new things to learn, which are not particularly well documented (not that most other open source tools are really much better).  We also have a (so far) internal tool built on the Eclipse RCP framework, and had started to look at the e4 projects as the next logical step there; but again, the Eclipse build tooling has always struck me as incredibly user-unfriendly.  The big problem we run into is that, because a number of tools we want access to are part of various plugins (such as XML editing), we end up shipping a whole bunch of the IDE functionality, and that creates a huge set of dependencies to manage.  Now, if you have all day, you can simply re-provision them each time you need to build, downloading the latest stable releases, but that is not going to work for incremental builds.  So we are left with the question of managing a “target platform” full of modules (features and related plugins) that we want to keep up to date, and we don’t have a separate build developer.  One of the developments that caught our attention was the Tycho project, which seemed to address some of these issues, and since Tycho is built on top of Maven 3 that provided some added incentive to consider moving to Maven.

Starting out with Maven was somewhat ugly, as the Maven way of doing things seems to be derived mostly from the process a Java project might need, but we kept learning.  There is a “flex mojo” plugin that adds the ability to call mxmlc and compc from within a Maven build as a semi-standard part of its lifecycle; however, we quickly ran into a problem: we need to control the version(s) of Flex we are compiling with/against.  Apparently this used to be possible, but after getting a few too many complaints from users who’d shot themselves in the foot with a path issue, the one guy who maintains the flex mojo basically decided to do his best to make this “impossible” for new users (see his discussion board postings).  Beyond the technical problems, that attitude also raised questions in our minds about whether we really wanted to go that route.  In the end we decided that since the Tycho tool chain was not yet ready for us (or we weren’t ready for it), there was not really a compelling reason to use Maven.  So it was back to the drawing board.

After looking at numerous tools, I finally rediscovered Gradle.  I say rediscovered because I’d previously seen it when it was in a much less mature state.  Anyway, a quick Google search showed not only that it would be possible, thanks to Gradle’s built-in support for Ant tasks, to use the Adobe mxmlc and compc tasks, but that work had recently started on a plugin adding even better Flex support to Gradle: GradleFx.

So, I started working on a test of using Gradle to compile the OSMF code and plugins.  At this point, as I’d anticipated, I ran into a number of beginner issues; however, I have been able to work my way through them with the kind help of the GradleFx developers.  One thing I’ll note here is that you can avoid having to “install” the GradleFx plugin into a local Maven cache by simply putting the jar in a local folder and declaring that folder as a “flat repo”, like:

buildscript {
    repositories {
        flatDir name: 'localToolsRepository', dirs: "${buildTools}/lib"
    }
    dependencies {
        classpath group: 'org.gradlefx', name: 'gradlefx', version: '0.2.1.ipov', ext: 'jar'
    }
}

Note that the buildscript closure is evaluated right away, before the rest of the build, so if you want to use any variables, such as ${buildTools} in the example above, then you have to define them in a gradle.properties file.
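
As a minimal sketch (the path is just a placeholder for wherever your build tooling lives), that means an entry like this in gradle.properties next to the build script:

# gradle.properties is read before build.gradle is evaluated
# placeholder path; the GradleFx jar sits in ${buildTools}/lib
buildTools=/home/build/cfp-build-tools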

I found that cloning the GradleFx repo let me look over the code and get a better feel for what it’s doing, as the documentation is still a little light; however, I soon found that I had a couple of ideas for changes to the GradleFx code.  Since it is hosted on GitHub, I decided to try my first ever GitHub-hosted fork.  The process seems to have gone well, and so far I’ve been able to contribute a small change to the path handling.  I am currently working on code to add explicit includeSources and includeClasses properties to at least the compc support, as currently the compc implementation uses all the declared source paths as include-sources too, which will work most of the time, but not always.  This should then mean you can reuse .flexLibProperties – something the SMP build does, but much more simply.  I have the following as part of a multi-project build (OSMF comes as several different components that I’m building separately):

// Pull the list of classes to include from Flash Builder's .flexLibProperties file
def flexLibProperties = new XmlParser().parse( project.file('.flexLibProperties') )
def incClasses = []
flexLibProperties.includeClasses.classEntry.each {
    incClasses.add( it.'@path' )
}
// Hand the list to the GradleFx compile (compc) task
project.tasks.compile.includeClasses = incClasses
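
For reference, that snippet assumes a .flexLibProperties file shaped roughly like the one Flash Builder writes for a library project (the class entries here are placeholders, and attribute details may vary between Flash Builder versions):

<?xml version="1.0" encoding="UTF-8"?>
<flexLibProperties includeAllClasses="false" version="1">
    <includeClasses>
        <classEntry path="org.osmf.media.MediaPlayer"/>
        <classEntry path="org.osmf.media.MediaElement"/>
    </includeClasses>
    <includeResources/>
    <namespaceManifests/>
</flexLibProperties>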

I am considering pushing my OSMF repo to GitHub, but my big hang-up is that right now I am unsure how much TLC I’m going to be able to give it, and I don’t want to create any false expectations.  For instance, if it becomes clear that we’re not going to need the ability to build OSMF ourselves, then it would be much simpler to just use the latest binary.  In the meantime, I intend to post a few more tips and to contribute any other ideas I have to the GradleFx project.


Eclipse Infocenter Help as Web App

For all the effort that appears to have gone into allowing the Eclipse Help “Infocenter” to run as a deployed web application, the documentation is not great, and, more importantly, why isn’t there a download that’s “ready to go”?  This is something that has bothered me for some time now.

For the impatient, you can skip the rest of this and just download a WAR from my Google Docs account at https://docs.google.com/a/ipov.net/leaf?id=0B1u9Agj_s0HLZWZhNWFjNzItNzI4Ny00NzAxLTlhMmMtYjhiZmIxYzdlOGQ5&sort=name&layout=list&num=50 Just expand the WAR and add your help plugin(s) to the WEB-INF/plugins/ directory.  Make sure that your plugins do not depend on anything other than the Eclipse Help bundles and you should be fine.  I’ve noticed, while preparing this, that a number of help plugins/bundles actually declare the same dependencies as the plugin they document, making standalone use impossible, as the OSGi container will refuse to activate them.  Hopefully someone will take over hosting this WAR for me; it would be great if Eclipse.org made it available for download as part of their normal release schedule, but that’s asking for a lot.
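
To be concrete about what “help plugin” means here: a documentation-only bundle whose plugin.xml contributes a table of contents through the org.eclipse.help.toc extension point, roughly as in the sketch below (the toc file name is a placeholder), and whose MANIFEST.MF requires nothing beyond the help bundles.

<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<plugin>
   <!-- contributes the table of contents for this documentation bundle -->
   <extension point="org.eclipse.help.toc">
      <toc file="toc.xml" primary="true"/>
   </extension>
</plugin>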

Anyway, in case it’s useful to others, or I need to reference this in the future, here’s what I had to do to get this WAR file running.

First, I googled the issue; I found a smallish number of blogs already out there, and the official documentation (e.g. http://help.eclipse.org/helios/index.jsp?topic=/org.eclipse.platform.doc.isv/guide/ua_help_war.htm).  Warning: many of the blogs are for an older version of the runtime and will instruct you to place files under WEB-INF/eclipse; this is incorrect and will result in the web app failing to start.  The instructions in the Eclipse help are correct.

From the instructions:

  1. This is correct enough, although you don’t really have to use a temporary directory; I just extracted the plugin JAR directly into my local development install of Tomcat to facilitate testing, e.g. to %TOMCAT_HOME%/webapps/help
  2. Note that you’ll need the Eclipse Plug-in Development Environment (PDE) for this.  I also ran into a problem using my install as my target platform (some issue with the JUnit add-on for PDE), so I had to define a platform and exclude the JUnit plugin; not a big deal, but an extra step.
  3. Note that this step assumes you extracted the “temporary” directory referenced at the start to “webapps/help”
  4. Again, note that if you copy help plugins which declare extra, usually unneeded, dependencies, the help system will start but will exclude those help plugins.  You can edit WEB-INF/web.xml to add the -console option so that you can access the OSGi console and get some diagnostic information on why the plugins aren’t started (it isn’t logged to the Tomcat log files); see the web.xml sketch after this list.
  5. Downloads of the ServletBridge.  I haven’t experimented much, but I placed both files into my webapps/help/WEB-INF/plugins directory, and then also copied the org.eclipse.equinox.servletbridge JAR to my WEB-INF/lib directory.  This seemed to work fine.
  6. Note that previous versions of the instructions tell you to “extract the servletbridge.jar” from the plugin JAR; this is no longer correct.
  7. No other variations/extra information should be needed.
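
Regarding the -console option from step 4: in the servletbridge version I was using, this amounted to adding it to the launch arguments that the BridgeServlet passes to the embedded framework.  A rough sketch (the init-param name is from memory, so compare it against the entry already present in your web.xml rather than copying this verbatim):

<servlet>
    <servlet-name>equinoxbridgeservlet</servlet-name>
    <servlet-class>org.eclipse.equinox.servletbridge.BridgeServlet</servlet-class>
    <!-- assumption: this servletbridge build reads extra launch arguments from "commandline" -->
    <init-param>
        <param-name>commandline</param-name>
        <param-value>-console</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>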

Comments and additional information are welcome.  I hope this helps others, and maybe someone can eventually get the standalone Infocenter WAR added as a standard part of the Eclipse release builds.


Milestone 32: CoSolvent Community Server

I am pleased to announce that we released Milestone 32.1 of CoSolvent Community Server last week.  The main new functionality in this release is that FLV files can now be stored and served “as is”.  Previously, FLV uploads were treated just like any other video upload and were reprocessed into three different streams (configurable) targeting different bandwidths; now sites can choose to use the original FLV upload as is.  If this option is used, the alternative bandwidth streams are not created, so think about how you plan to use your site before enabling this feature; enabling video reprocessing of FLVs after the fact will require the maintenance tools to be used to create the streams, a process which could take quite a long time depending on the amount of content.  However, if you are mainly interested in archiving FLV content, and have preprocessed FLVs which you don’t need or want to reprocess, this option can cut down on both processing load and space requirements, and assures you that your users always see the original-quality video.

Additionally, several bugs were fixed; you can see the complete issue list in Assembla at https://www.assembla.com/spaces/cosolvent/milestones/283241-milestone-32—video—pdf-improvements.


Milestone 31: CoSolvent Community Server

Last night we released Milestone 31 of CoSolvent Community Server to production.  Among other things, this updates PDF display for IE to use the non-standard <object> tag syntax, using the classid attribute and the src param instead of the type and data attributes used in standards mode.  The standards-mode tag works, mostly; however, IE seems to block page execution while downloading the PDF when in standards mode (regardless of the “Fast Web View” setting of the PDF), which is obviously a big problem for the Video-in-PDF documents we’re hosting.  The new mode works much better, with smooth downloads in IE.
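
For reference, the difference between the two embed styles looks roughly like this (a sketch from memory: the classid is the usual Adobe Reader ActiveX id and the file path is a placeholder, so check both against your own markup):

<!-- standards mode: IE blocks page execution while the PDF downloads -->
<object type="application/pdf" data="/files/sample.pdf" width="100%" height="600"></object>

<!-- IE-specific mode we switched to: classid plus a src param -->
<object classid="clsid:CA8A9780-280D-11CF-A24D-444553540000" width="100%" height="600">
    <param name="src" value="/files/sample.pdf"/>
</object>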

Additionally, PDFs are now opened with the site navigation menu minimized, giving the PDF approximately 200 more pixels horizontally.  We added an expand/collapse menu button which can be used to open the menu.  Other file types open with the menu auto-expanded, although the user can collapse it.  Our next milestone should move this from a totally automatic behavior to something that administrative users can set in the file properties.

Other changes:

  • We upgraded to using Dojo 1.5
  • The look of the Comment section has been improved, and a couple of behind-the-scenes tweaks were made.
  • In Manage > Permissions, the dialog to select Users or Groups to assign permissions to can now be navigated via the keyboard (arrow keys to move to the next/previous entry, or press a character to jump to the next entry starting with that character).
  • When re-uploading a file you can now choose to overwrite the existing title, description, and summary using new data.  Previously you had to re-upload and then go to “Edit Properties” to change this information.

We’re looking forward to the next Milestone, which contains only a few items so that we can complete it as soon as possible.  The main changes will be automatic linearization (at the user’s choice) of PDFs so that “Fast Web View” can be used, and the ability to avoid reprocessing FLV files which are uploaded.


DITA Table of Contents

Update: I still intend to publish the full code for this at some point soon, most likely via GitHub; however, I have gotten pulled away onto several other projects and have not yet been able to get this into a version that I am satisfied with.  Additionally, I will need to move the code out of our SVN, where it is currently stored side-by-side with the documentation XML (which is stored in SVN because that’s what everyone here is familiar with).

I’ve been interested in doing some work with DITA, and currently the DITA-OT processing code seems like the main viable way to do so, at least without paying an arm and a leg.

One of the first things I noticed, other than the spotty documentation of the toolkit (talk about the cobbler’s family being the last to get new shoes!), was that the basic xhtml output is, well, ugly.  So the first thing I did was to open up some output using Firebug and take a look at the generated styles.  “OK,” I said to myself, “I can use a custom CSS file to fix the majority of the issues here”… then I looked at the main table of contents.  The ToC file has no class attributes on any elements, and imports the same CSS file as the content pages.  The content pages do have class attributes, so it should be possible to write some CSS to target both, but I really didn’t like the idea of leaving the CSS for my ‘menu’ so generic; it would be just too easy to have it accidentally apply to some other elements.  So I started digging into how to customize the ToC output.  It turns out there isn’t any extension point to override the processing in the way there is for the main xhtml, or the other main content types.

I looked at the tocjs plugin, which does some nice effects using the YUI tree; but the tree is pretty basic, and we tend to use either jQuery or Dojo, so I really don’t want to add another toolkit to the mix there.  What the tocjs plugin does is make use of the integrator step to insert some custom Ant targets into the main build files, defining a new transtype called “tocjs”.  The downside to this is that you have to run the transform process twice: once to generate the topic html files, and a second time to generate the ToC file(s).  It also annoys me that you have to update the base build files; it just seems wrong that a plugin should change files in its ‘host’.

So, I started digging around in the XSL transform files and the Ant build files.  It took a while, as there is not much documentation on them, at least not when compared to the number of files and targets defined.  Eventually I found that the dita-ot/build_dita2xhtml.xml file defines a target named “dita.map.xhtml.toc”, which is what calls the XSL transform to create the ToC.  However, there are no extension points for this.

I looked at the *_template.xsl files; however, I didn’t feel like digging through the DOST code to find out what it would take to define a new XSL extension point.  Maybe it’s really easy.  Even if it is, it would require changing core files, and possibly interfering with any upgrade path.  If it really is easy, someone let me know; I could probably create the template file and send it as a patch.  I did find one important bit of information which might help you even if you don’t read past this point: there is an args.xhtml.toc.class parameter that you can pass into the standard transform to assign a CSS class to the body element of the ToC.  I have added this to the DITA Wiki; hopefully this helps someone out.
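
As a quick sketch of how that parameter can be passed from an Ant wrapper build (the map path and the “help-toc” class name are placeholders, and dita.dir is assumed to point at your DITA-OT install):

<ant antfile="${dita.dir}/build.xml">
    <property name="args.input" value="${basedir}/doc/userguide.ditamap"/>
    <property name="transtype" value="xhtml"/>
    <!-- ends up as the class attribute on the generated ToC's body element -->
    <property name="args.xhtml.toc.class" value="help-toc"/>
</ant>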

So, I am now thinking about how to wire this all together.  The good news is that it is possible; the bad news is that altering the core build xml (and xsl) is simply the way the current integrator works.  The approach I took uses Ant import (not XSL import) to “override” the base “dita.map.xhtml.toc” target with my own, which is basically a copy of the base target pointing at a different XSL file.  Once I get the files into a little more advanced state I will probably release them as open source, although from what I can see, changing the main XSL file dita2htmlmap.xsl wouldn’t be too difficult either, and shouldn’t have any effect on existing code (other than adding a few extra @class declarations).

Step 1 : Create a plugin

Under dita-ot/plugins (or demos if you prefer) create a new folder.  I named mine net.ipov.ditaot.toc.  In this folder add a plugin.xml file; you can copy an existing one and edit it.  The plugin.xml should look like:

<plugin id="net.ipov.ditaot.toc">

    <!-- We want to dump the contents of the build_plugin.xml file into the main build.xml  -->
    <feature extension="dita.conductor.target.relative" value="build_plugin.xml" type="file" />

</plugin>

Step 2 : Create a copy of the XSL

I won’t include the full source here, as it is getting a little long.  But the basics are:

<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:saxon="http://icl.com/saxon"
    xmlns:xt="http://www.jclark.com/xt"
    xmlns:java="org.dita.dost.util.StringUtils"
    extension-element-prefixes="saxon xt"
    exclude-result-prefixes="java">

    <xsl:template match="/*[contains(@class, ' map/map ')]">
        <!-- change output of the top-level ul and add anything else you like. -->
    </xsl:template>

    <xsl:template
      match="*[contains(@class, ' map/topicref ')][not(@toc='no')][not(@processing-role='resource-only')]">
        <!-- change output of the li items in the ToC -->
    </xsl:template>

</xsl:stylesheet>

Step 3 : Add the build_plugin.xml file to declare the build target.

This is where you define the Ant target which, when pulled in by the integrator, will override the built-in target of the same name.  Note that this could possibly break in the future if the integrator changes the order of the code to be before rather than after the imports (I think).  At first I couldn’t get both my override to work (it doesn’t work if the build_plugin.xml imports it) and get a file path that was relative to the plugin directory; however, I stumbled on the answer in the dita-ot Yahoo Group while looking for some other information (on code highlighting).  A little further experimentation and I was able to get something that worked.

First create a build file; its name must match the name declared in plugin.xml, so here it’s called build_plugin.xml:

<project>

  <import file="build_toc.xml"/>

  <target name="dita.out.map.xhtml.toc"
        unless="noMap" if="inner.transform"
        description="Build HTML TOC file, which will adjust the directory">

      <!-- define your own xsl, note you have to specify the ./plugins/NAME_OF_PLUGIN/ path here -->
      <xslt
       style="${xtoc_plugin.dir}${file.separator}xsl${file.separator}xhtmtoc.xsl"
       ... >
          ...
      </xslt>

  </target>

</project>

Now we need a second build file, the build_toc.xml referenced in the first:

<project name="xtoc_plugin" basedir=".">
    <dirname property="xtoc_plugin.dir" file="${ant.file.xtoc_plugin}"/>
</project>

When the build_plugin.xml is merged into the core build files, the import is also copied in, with a corrected path.  The path to the plugin is then captured in the imported build_toc.xml file, and can be used in the ‘included’ target.  Kind of tricky, and it feels a little “hacky” to me, but it seems to be working.


Milestone 29: CoSolvent Community Server

Wow, so much going on that I forgot to post Milestone 28!  Anyway, I am truly pleased to be able to announce that we’ve finished Milestone 29 of CoSolvent Community Server.  As usual, the full list of changes (in developer speak) can be found on our Assembla page https://www.assembla.com/spaces/cosolvent/tickets, just use the “Closed by Milestone” filter (the Milestone is now closed, so it no longer shows up in the default filter).  This release had some hiccups while we finished updating our build system to work with the git version control system, but overall the migration to git has been pretty smooth.

This release adds some polish to the user interface, fixing the email auto-completion feature, adding ‘loading’ icons to some ajax components, and adding filtering of users by groups to the management interface.  We also implemented “versioned assets” – in this case in a relatively “dumb” way, by simply adding -MILESTONE_NUMBER to the javascript directory.  This should fix issues we’ve had in the past with users having outdated assets stuck in their cache, ruining their experience after an upgrade.


Milestone 27: CoSolvent Community Server

The development team here at iPOV is pleased to announce the release of CoSolvent Community Server, Milestone 27. This release continues to improve stability and expand the feature set. The main changes in this release are:

  • Improvements to Commenting, including the addition of “Sort by Video Timing” to sort the comments based on the time on a video that the comment is associated with (applies only to video files).
  • Visual formatting tweaks, including decreasing unneeded space between item previews in the “Grid” folder view mode; this greatly improves the layout for users with a 1024 x 768 screen.
  • Addition of item flags for “Has comments”, “New Item”, and “Changed Item” to help visually scan a folder’s contents.
  • Further improvements to the upload control:
    • Addition of a “keep alive” using a JavaScript timeout to keep the user’s session alive while on the upload page.  We’ve had issues with users going to the upload page, getting distracted and moving to some other application or task, then returning to the still-loaded upload page and attempting to upload files, only to get an error when the (often large) file finishes uploading and the server fails to process it because the browser session has timed out due to inactivity.  The “keep alive” is also triggered immediately before starting an upload, which should cause most other errors to be detected before a user actually has to wait for a file to upload.
    • Errors are now hidden by default; clicking on the “Errors” link in the upload status bar loads the list of upload errors.  Each error in this list is now cross-linked to an online help page (http://www.ipov.net/apps/ccs-support/articles/cosolvent-upload-errors/) which explains the error in more detail (and which can be updated faster than the application text).
    • The grid used to display files that are queued for upload has received some visual improvements to make it more usable.
  • Continued work on updating the PHP code to run properly on PHP 5.3

The complete list of issues addressed can be seen at https://www.assembla.com/spaces/cosolvent/milestones/240821-milestone-27

Additionally, this release should mark the last development done in our SVN repository; we are transitioning to use of git to take advantage of its cheap branches and other distributed features.


Two Underutilized Software Training Technologies

There are two technologies that are familiar to many software professionals, but somehow don’t see much use in the rest of the organization. The first technology is computer screen movie capture (aka screencasting) and the second is the ability of virtually every laptop and most video cards to support multiple monitors.

Read More »


Transforming Video: Non-destructively!

Video is far more than a publishing or entertainment medium. It can also be a powerful tool to record expert knowledge. Unfortunately, the video that emerges from a typical knowledge capture session is too fragmented, error-filled and poorly structured to be used directly in professional presentations or documentation. A video producer will have to make extensive (and expensive) edits to render the video clips presentable. Typically, the producer will use sophisticated software (e.g., Adobe Premiere and Apple Final Cut) to slice, dice and dramatically transform the raw material. However, conventional tools create two problems that complicate later revisions:

Read More »


Engineers Love Writing Manuals! – Really!

Problem Statement

You customized an ERP module with 15 new fields, 4 new screens, 5 significantly changed screens, a significantly different workflow, and an updated graphics ‘theme’. It takes you about 30 min just to walk a co-worker through all of the new features and methods. Now, you’re tasked to deliver end-user documentation and training materials ASAP. Your options are:

Read More »
