Creating a new issue to encapsulate work done
under issue #19443 including:
1. Loading core & openide from a special
classloader, and including only boot.jar in the
startup classpath.
2. Replacing the current
lib/ext/openide-compat.jar patch (using pre- and
post-processed sources) with a bytecode patching
system based on runtime superclass rewriting and
member modifier changes.
3. (To be done) Write unit tests confirming that
particular source-incompatible but
binary-compatible changes we have made do, in fact, work.
4. Still awaiting an explanation of what can be done
using superclass/member bytecode patching more
easily/cleanly than using traditional override
patches based on the pre- and post-processors. One
possible advantage is that all details of the
compatibility mode can be separated into an
independent source tree, rather than cluttering
the actively developed source files.
5. If it is desirable to conditionally load
patches for openide (and maybe also for
modules...) only if an old client exists, this
should be opened as a separate issue blocking
#19443: it could get pretty complicated (requires
restart of NB to take effect etc.).
Branch creation is in progress, CVS is just deathly slow today...
OK, in nb_all in modules:
core (minus www)
openide (minus www)
nbbuild (minus www)
xtest (minus www)
and in nbextra in:
I have created a branch patchbytecode_26126_branch, rooted at the base
tag patchbytecode_26126, and committed the working patch to this branch.
I have implemented a new style of patching - the ability to change
the access_flags of a class and its members. Thus I could delete most
(but not all) of the XYZPatch classes.
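To make the idea concrete, here is a minimal sketch of that kind of manipulation (the class and method names are invented for the example; this is not the actual NetBeans implementation): walk past the constant pool of a classfile, then rewrite the u2 access_flags word that immediately follows it.

```java
public class AccessFlagsPatcher {
    private static final int ACC_PUBLIC = 0x0001;
    private static final int ACC_FINAL  = 0x0010;

    /** Returns a copy of the classfile with ACC_PUBLIC set and ACC_FINAL cleared. */
    public static byte[] patchClassFlags(byte[] cls) {
        if (u4(cls, 0) != 0xCAFEBABEL) {
            throw new IllegalArgumentException("not a classfile");
        }
        int cpCount = u2(cls, 8);   // constant_pool_count at offset 8
        int off = 10;               // first constant-pool entry
        for (int i = 1; i < cpCount; i++) {
            int tag = cls[off++] & 0xFF;
            switch (tag) {
                case 7: case 8: case 16: case 19: case 20: off += 2; break;
                case 15:                                   off += 3; break;
                case 3: case 4: case 9: case 10:
                case 11: case 12: case 17: case 18:        off += 4; break;
                case 5: case 6: off += 8; i++; break;      // long/double use two slots
                case 1: off += 2 + u2(cls, off); break;    // Utf8: length-prefixed
                default: throw new IllegalArgumentException("unknown tag " + tag);
            }
        }
        // access_flags is the u2 immediately after the constant pool
        byte[] out = cls.clone();
        int flags = (u2(out, off) | ACC_PUBLIC) & ~ACC_FINAL;
        out[off]     = (byte) (flags >> 8);
        out[off + 1] = (byte) flags;
        return out;
    }

    private static int u2(byte[] b, int off) {
        return ((b[off] & 0xFF) << 8) | (b[off + 1] & 0xFF);
    }

    private static long u4(byte[] b, int off) {
        return ((long) u2(b, off) << 16) | u2(b, off + 2);
    }
}
```

A real implementation must also handle member-level flags and the superclass reference, but the mechanics are the same: fixed offsets once the constant pool has been walked.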
I still have to work on a test that will compile classes against the
old API and execute them against the new API. It should be ready tomorrow.
Still would like to see a point-by-point comparison of this style
versus traditional classpath patching before this patch can be merged.
One negative point I forgot to write down before: with the old
lib/patches/openide-compat.jar, you could, if you wanted, compile
against the old deprecated stuff simply by including that JAR first in
your compiler's classpath. That is impossible with runtime bytecode
patching - you would need to download an older NetBeans release and
compile against its openide.jar.
An advantage of both styles: you can separate openide/src/ into
multiple independent subdirectories. With pre- and post-processing,
you do regular compilation in each subdir, but all the processed
sources get stuck in one big dir where they can access all of
org.openide.** during patch compilation. With superclass patching, you
would compile just the patches with all of org.openide.** in the
classpath. So either style can support #19443.
A disadvantage of both styles, as mentioned before in #5, is that you
cannot reliably turn the compat mode on or off without a restart. I
don't think this is solvable, though, because of the way Java class
loading works.
The primary drawback of this new style, though, is that it is
substantially more complex to implement, to understand, and presumably
to debug. So it should not be used unless it permits a whole category
of more powerful transformations that we know we want to have and
which cannot be done just by prepending to the effective runtime
classpath.
Implementation: I think the code is in a state for another review. An
Ant task has been written that takes a JAR file and extracts all
"enhanced" classfiles out of it after applying the patch to them, so
if somebody needs the backward-compatible classes, he can just execute it.
A testcase was created that is compiled against classes generated by
the above-mentioned task. Then it is executed in the IDE (without
those generated classes) to prove that the bytecode manipulation works.
Currently there are four tests that do not pass. All of them fail on
making an inner class public, but that seems like a small, easy-to-fix
problem.
A long time ago, when we first realized that we had problems with
backward compatibility, I was glad for the preprocess and postprocess
tasks; they served well and helped us solve all the problems that arose.
But as time passed, the drawbacks of that solution slowly appeared.
These problems are symptomatic of this sort of solution: we have two
versions of the truth (openide.jar & openide-compat.jar), and this is
always tough: the two JARs are connected, cannot be upgraded
separately, and need to be present in the right order, yet they are
packaged separately and reside in different directories. This is not a
good situation.
So I started to seek a different solution. At that time, patching
bytecode was just a crazy idea, but the more information about the
subject I gathered, the more reasonable it started to seem.
- First of all, Tom Ball visited us in Prague and we discussed the
possibility of doing such bytecode manipulation; he seemed
surprisingly open to that solution and nearly pushed us to do it.
Well, Tom is a kind of hacker, but he also seems to know how to
develop software.
- Tom also suggested reading "Component Development for Java Platform",
which should probably be called "Hacking for Java Platform", but which
talks about the development of a component system that evolves over
time, connects different pieces together, and can guarantee that
everything works. The author stores the additional information
together with the class in its attributes and presents this as the
natural way of doing it. I somehow started to believe in it too.
- I should probably mention another thing that convinced me that
bytecode attributes are going to play a more important role: JSR 175
(http://www.jcp.org/jsr/detail/175.jsp). It is an effort to enhance
the Java language to make it simpler to annotate classes and members
with attributes. After it is implemented, we will not need any
preprocessing; regular javac will insert the attributes into the
bytecode itself.
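As an illustration of where that could lead (JSR 175 was only a proposal at this point, and the annotation name below is invented for the example): a marker annotation that the compiler records in the classfile could replace the preprocessing step entirely.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Invented marker annotation: with JSR 175-style annotations, javac itself
// records this in the classfile as an attribute, so no separate preprocessing
// pass is needed; a patching classloader could then read the attribute and,
// for example, restore public access on the marked member.
@Retention(RetentionPolicy.RUNTIME) // RUNTIME only so this demo can reflect on it
@Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD})
@interface PatchedPublic {}

@PatchedPublic
class OldApiHolder {
    @PatchedPublic
    void formerlyPublicMethod() {}
}
```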
- My last comment is about the complexity of bytecode. For a few years
I lived in the Java world and never understood the content of a
classfile. It was something like an .exe to me. But as soon as I
started this work, I realized that it is in no way random: it is
structured, and patching signatures or changing constants is a piece
of cake.
So this is my background, I hope it helps explain my opinions.
Comparison of patching and source preprocessing:
Jesse, I think that your comparison is in most cases correct. I can
confirm that #5 is complicated, and thus an implementation is only
barely imaginable.
Regarding "compile against the old stuff": this is not a problem,
because the old class files can be generated by an Ant task. So if
somebody wants to compile his old application against the old API, he
just executes the Ant task and generates the classes.
One thing that has not yet been written here is that we need patching
to be accessible to all modules that provide an API. Their APIs will
soon contain the same garbage we have in openide, so the patching or
preprocessing needs to be available to all modules. I think that this
is an argument in favor of patching, because my nightmare vision of
preprocessing is a duplication of the set of JARs: java-api,
java-api-compat, diff-api, diff-api-compat, etc.
You also tried to compare the implementation/maintenance cost of both
approaches; I can try something similar. The patching code has unit
and integration tests written - preprocessing does not. The patching
part is well separated from the rest of the system into one class. I
have mentioned that patching bytecode is not too difficult. And I'd
like to mention JSR 175 again - when it is implemented (and it will
be; Joshua is very active), half of our code will no longer be needed
- the "enhancing" of the classfile will be done by the compiler itself.
As a last comment for today, I want to emphasize the "two versions of
the truth" problem and the necessity of patching modules as well as
openide. Of course we can choose a different solution than this one,
but let's choose one that will solve those two problems.
I believe the work on a branch can now be considered finished.
Patching of DataLoader$FolderLoader, etc. works well (the patches were
made outer classes with $ in the name, because the JVM somehow ignores
all my attempts to patch the access flags of inner classes).
All patches were renamed to contain $ in the name, so Repository's
superclass at runtime is org.openide.filesystems.$Repository$Patch$.
The openide-compat.jar has been deleted and the patches are instead
included in the regular openide.jar. Those that could be were made
package-private (possible if the class was just adding an
implementation of some interface; impossible for AbstractFileSystem
due to AbstractFolder.refreshRoot and SystemAction.get/setIcon).
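For illustration only, here is a plain-Java picture of the interposition this naming scheme produces (invented member names; the real patch rewrites bytecode at class-load time, and the real classes live in org.openide.filesystems):

```java
// The patch class is spliced in as the superclass and carries the deprecated
// members, so old binaries still link while the actively developed source
// stays clean. Names here are hypothetical.
class $Repository$Patch$ {
    /** @deprecated Kept only for binary compatibility with old clients. */
    public void oldDeprecatedMethod() {}
}

class Repository extends $Repository$Patch$ {
    // current API only; an old client calling oldDeprecatedMethod()
    // still resolves it through the interposed superclass
    public void currentMethod() {}
}
```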
I think that this is all I can do in code. So I am moving the issue to
Jesse to decide what and when to merge.
*** Issue 25436 has been marked as a duplicate of this issue. ***
ran with org.netbeans.log.startup=print, netbeans.close=true, measured
real (CPU+system) time as usual; JDK 1.4.0_01 on Linux 2.4.7, ext2
laptop drive, 1.2Ghz, 1Gb RAM, light machine load otherwise; after
clean build + sanity check, 2 primer runs, then 10 real runs.
without patch: avg. 14.31 sec
with patch: avg. 14.32 sec
i.e. the same, within experimental error.
I'm afraid the patching code is buggy. If you remove the flag that
suppresses bytecode verification in ide.cfg, superclass-patched
classes throw VerifyErrors, because the constructor super calls do not
match the actual superclass. Please fix in the branch.
java.lang.VerifyError: (class: org/openide/filesystems/Repository,
method: <init> signature: (Lorg/openide/filesystems/FileSystem;)V)
Call to wrong initialization method
java.lang.VerifyError: (class: org/openide/util/actions/SystemAction,
method: <init> signature: ()V) Call to wrong initialization method
I am making a few misc. changes in the branch which I found necessary
while attempting a merge.
Please confirm that a clean build and a full xtest run really work:
src=/space/src/patchbytecode_26126 # e.g.
nice /space/ant/bin/ant -emacs -f $src/nb_all/nbbuild/build.xml
-Dbuild.compiler=modern real-clean build-nozip sanity-check
nice /space/ant14/bin/ant -emacs -f
$src/nb_all/xtest/instance/build.xml cleantests buildtests runtests
Specifically check that PatchByteCodeTest and CompatibilityTest run.
For the first, you probably never ran the full XTest, because it fails
- lib/test.jar does not contain the data (I fixed that). For the
second, it was not included in the standard xtest config (I have
hopefully added it correctly).
Right, I have not tried to run with verification on. I'll
investigate. I've seen your commit; thanks for the English cleanup and
especially for the updates to xtest.
Fixed. I had to change the reference to super.<init> to refer to
patchsuper.<init>. It seems to work after I did the following checks
(on JDK 1.4):
cd nbbuild; ant real-clean build-nozip sanity-check
cd xtest; ant cleantests buildtests runtests
52 tests executed. All of them succeeded including 17 from openide/compat.
I believe that everything works now. But even if it did not, I could
do nothing about it, as I am leaving for a week's vacation in 10
minutes. Jesse, please look at it and find new bugs; I'll address them
when I return.
new revision: 22.214.171.124; previous revision: 126.96.36.199
Processing log script arguments...
More commits to come...
Checking in core/src/org/netbeans/PatchByteCode.java;
new revision: 188.8.131.52; previous revision: 184.108.40.206
Checking in core/test/unit/src/org/netbeans/PatchByteCodeTest.java;
new revision: 220.127.116.11; previous revision: 18.104.22.168
Checking in core/test/unit/src/org/netbeans/Sample.java;
/cvs/core/test/unit/src/org/netbeans/Attic/Sample.java,v <-- Sample.java
new revision: 22.214.171.124; previous revision: 126.96.36.199
Created attachment 7101 [details]
Revised patch incl. XTest changes
Created attachment 7102 [details]
One note: the commit includes removal of -J-Xverify:none from the
trunk ide.cfg; I will add a note to the release checklist that it
should be added back in every release branch. This ensures that trunk
dev builds will report any classfile errors (from this patch or from
any other cause, e.g. mismatched compiles), while release builds will
still start faster.