Thread: Pre-processing during build

Pre-processing during build

From
Stephen Nelson
Date:
I'm working on a change proposal to switch the build of the driver
from Ant to Maven [1].

While working on this change I noticed that the build pre-processes
template files to create valid Java classes. In my experience this isn't
very common in Java libraries, other than to inject version numbers. It
does not seem like a very elegant solution to me, although I can confirm
that it works fine and has been present in the driver since the
beginning.

The use of these templates is scattered around the codebase and usually
occurs where there is JDBC version-specific functionality, e.g.
Driver.java.in and PGConnectionPoolDataSource.java.in. [2]

The use of these templates means the code will not compile successfully
in an IDE without running the build script first. I believe part of the
historical reason for this is to retain the org.postgresql.Driver class so
that it can be used in a Class.forName call without the developer needing
to worry about the JDBC spec version. Please correct me if I'm wrong in
this assumption.
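
For reference, the classic loading idiom that relies on that stable class name looks like this (standard JDBC usage, nothing pgjdbc-specific):

// Works the same regardless of which JDBC spec version the driver was built for.
Class.forName("org.postgresql.Driver");
java.sql.Connection con =
    java.sql.DriverManager.getConnection("jdbc:postgresql://localhost/test", "user", "password");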

Within the current build there is also the filtering of source
packages to only include the version of the JDBC spec you are
compiling for and any previous versions.

This can all be implemented in Maven, so it won't prevent the project
from using that as a build system. However, it will possibly make the build
more complicated than it needs to be, as the Ant build is now. The
downside is that it could put off new people from building the codebase and
contributing fixes and functionality.

So, do we want to change this? It would be good to discuss this to
arrive at a solution that would best serve the users as well as
contributors (I'm aware that is rarely possible!)

[1] https://github.com/pgjdbc/pgjdbc/pull/322
[2] https://github.com/pgjdbc/pgjdbc/pull/322/files#r32378395


Re: Pre-processing during build

From
"Markus KARG"
Date:

Stephen,

 

thank you for starting this thread.

 

If it were up to me, I would try to get rid of pre-processing if at all possible, since it is a real p.i.t.a., as long as we can find a different solution that provides the same range of supported JDKs and JDBC versions.

 

The question is: How? Possibly by simply using "JRE8-JDBC42.jar" ALWAYS?

 

Has anybody tried whether it is possible to simply load a JRE8-JDBC42.jar on JRE6? I mean, not to actually invoke the new JDBC42 APIs, just to load the JAR and invoke the JDBC3 APIs only, for example. The APIs themselves are backwards compatible, and as long as we don't invoke the new APIs, no ClassNotFoundException should happen (AFAIK the JRE loads classes only at first actual instantiation, not simply because a class is referenced in a loaded .class file as a parameter). I mean, as long as we do not use JRE8-only APIs inside the Driver, and as long as we don't write the .class files in JRE8 byte code, certainly.

 

Or did I miss something in this theoretical approach?

 

Regards

-Markus

Re: Pre-processing during build

From
Dave Cramer
Date:
How can you use the same name and different bytecode versions?

Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca



Re: Pre-processing during build

From
Sehrope Sarkuni
Date:
On Mon, Jun 15, 2015 at 6:17 PM, Dave Cramer <pg@fastcrypt.com> wrote:
How can you use the same name and different bytecode versions ?

Short answer is you can't.

Longer answer is that, as long as the implementations don't use types that don't exist in the older runtime, you can include implementations for methods that aren't defined in an older interface definition. It's only a problem if those methods use types that don't exist there (ex: if a method signature used the new Java 8 time types). You'd also need to either compile with the older target bytecode version or actually build multiple versions off the same source.

I think a purely Maven based build with separate targets should be possible via a generated-sources plugin. Off the top of my head I'm not sure which plugins it would use though. It's been a while since I wrote anything like that, though I do remember it being a bit of a pain to get right. On the plus side, we only have to figure it out once, right? :)

I'm a big fan of Mavenizing the build process. A lot of the value of it will come from how it will simplify things like adding tests. It eliminates a lot of the double and sometimes triple entry (i.e. add the test class, add it to a suite, add the suite to build.xml).

Regards,
-- Sehrope Sarkuni
Founder & CEO | JackDB, Inc. | https://www.jackdb.com/

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
Markus,

>Has anybody tried whether it is possible to simply load a JRE8-JDBC42.jar on JRE6?

As you are a fan of JEPs, you might know of
http://openjdk.java.net/jeps/238 : Multi-Version JAR Files
It does not "just work" yet.

Suppose you want to implement the PreparedStatement(...java.sql.SQLType)
features of JDBC 4.2 (see [1]).
Even if you compile that with target 1.7, JRE 7 might fail to load the
class, as it won't be able to validate what that SQLType is. It simply
does not exist in JRE 7.

As far as I understand, the only manageable way of using "new
features" in "JDK6-7 jars" is to isolate JDK8-using methods into
JDK8-only-loaded classes, for instance PreparedStatement41.java and
PreparedStatement42.java.
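
A minimal sketch of that isolation (the class names here are only illustrative): only the 4.2 subclass mentions java.sql.SQLType, so an older JRE never has to resolve that type as long as it only instantiates the 4.1 class.

import java.sql.SQLException;
import java.sql.SQLType;

class PgPreparedStatement41 {
    // pre-4.2 code path, uses only types that exist on Java 6/7
    public void setObject(int parameterIndex, Object x, int targetSqlType) throws SQLException {
        // ...
    }
}

// Compiled against Java 8; must only be loaded on a JRE that actually has SQLType.
class PgPreparedStatement42 extends PgPreparedStatement41 {
    public void setObject(int parameterIndex, Object x, SQLType targetSqlType) throws SQLException {
        setObject(parameterIndex, x, targetSqlType.getVendorTypeNumber());
    }
}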

[1]:
https://docs.oracle.com/javase/8/docs/api/java/sql/PreparedStatement.html#setObject-int-java.lang.Object-java.sql.SQLType-

Vladimir


Re: Pre-processing during build

From
Dave Cramer
Date:
I'm wondering: if we use the latest JDK to compile but target 1.7, will it use the latest JDBC API but compile to 1.7 bytecode? Apparently https://github.com/brettwooldridge/HikariCP uses this method, and they have a similar problem.

Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Mon, 15 Jun 2015 23:59:45 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> If it would be up to me, I would try to get rid of pre-processing if any
> possible, since it is a real p.i.t.a., as long as we can find a
different
> solution to provide the same number of supported JDKs and JDBC versions.
>
> The question is: How? Possibly by simply using "JRE8-JDBC42.jar" ALWAYS?

For Java 6 and 7 it would be possible to use the same codebase (if you
ignore certain types iirc), but for Java 8 you need a separate library due
to the introduction of the SQLType interface. And if you fully support JDBC
4.2, you also need the classes from java.time.

> Has anybody tried whether it is possible to simply load a
JRE8-JDBC42.jar
> on
> JRE6? I mean, not to actually invoke the new JDBC42 APIs, just to load
the
> JAR and invoke the JDBC3 APIs only for example. The APIs themselved are
> backwards compatible, and as long as we don't invoke the new APIs, no
> ClassNotFound should happen (AFAIK the JRE loads classes only at first
> actual instantiation, but not simply because it is contained in a loaded
> .class file as a parameter). I mean, as long as we do not use JRE8-only
> APIs
> inside the Driver, and as long as we don't write the .class files in
JRE8
> byte code, certainly.
>
> Or did I miss something in this theoretical approach?

The JDBC APIs themselves are not backwards compatible (i.e. they introduce new
types in signatures, or require you to handle new types in existing
methods); the API is only backwards compatible from the perspective of the
user.

Mark


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Mon, 15 Jun 2015 18:42:13 -0400, Sehrope Sarkuni <sehrope@jackdb.com>
wrote:
> I think a purely Maven based build with separate targets should be
possible
> via a generated-sources plugin. Off the top of my head I'm not sure
which
> plugins it would use though. It's been a while since I wrote anything
like
> that though I do remember it being a bit of a pain to get right. On the
> plus side we only have to figure it out once right? :)

My experience is that Ant gives you a lot more flexibility. For Jaybird I
considered moving to a Maven based build, but I finally decided against it
because it was too much hassle. The difference with the PostgreSQL JDBC driver is
that Jaybird uses JDBC-version-specific source folders with common classes
(and abstract classes for the common implementation).

> I'm a big fan of Mavenizing the build process. A lot of the value of it
> will come from how it will simplify things like adding tests. It
eliminates
> a lot of the double and sometimes triple entry (i.e. add the test class,
> add it to a suite, add the suite to build.xml).

You don't need Maven to achieve that. For one, you don't need test suites:
with JUnit 3 it is a bit harder, but you could use a consistent naming
convention and filter tests in the Ant plugin; with JUnit 4 you could use
(class or instance) rules, or filtering based on annotations.

Mark


Re: Pre-processing during build

From
Scott Morgan
Date:
Hi All,

I just thought I should mention that I have been writing some tools that will eventually compete with ant-junit and maven-junit builds. The main reason is to add concurrency everywhere, along with other things like integrated code coverage in tests.

It will be a while before this is well documented and a plausible replacement (and Fabricate will only work with Git for several years), but it may be worth the wait. The time cost of converting a build and test API is usually quite immense.

Cheers,
Scott


Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
Dave> Apparently https://github.com/brettwooldridge/HikariCP uses this method;

As far as I understand, HikariCP uses dynamic bytecode generation,
so they do not have to include all the signatures in their sources.
By contrast, we need to implement the 4.2 interfaces, so at some point we
would end up with SQLType in pgjdbc's signatures.

Mark> My experience is that Ant gives you a lot more flexibility

What is that flexibility good for?

The drawbacks of Ant are:
1) No "easy way to configure IDE". This includes "download javadoc and
source", adding dependencies to the classpath, etc, etc.

2) No easy way to run tests. With maven you just hit `mvn install` or
`mvn test` and it just works. With Ant you have to read instructions.

3) No easy way to test different versions. With maven, I can depend on
"snapshot" versions in my client application, and I can easily debug and
step into the driver.
Unfortunately, due to #1, debugging dependencies is not that easy.

4) No easy integration with other systems. For instance, if using
maven you can just add findbugs, sonar, etc.
If you want to try Facebook's recent infer, you just hit `infer mvn install`

Well, Gradle might be an even better approach; however, I have not used it
yet, so I have nothing to say here.

Vladimir


Re: Pre-processing during build

From
"Markus KARG"
Date:

I never said I want to use different byte code versions. I actually want to use the latest JRE API version, compiled down to JRE 6 byte code. That should be possible, since the two are technically unrelated, as long as we do not use lambda expressions in our implementation source code.

 

 


 

Re: Pre-processing during build

From
"Markus KARG"
Date:

To make things a bit more clear: it is NOT about whether or not an interface method has a parameter of an unknown type. It is solely about whether that method is actually INVOKED at runtime. As JDBC42 is unknown to a JDBC3 client, that client CANNOT invoke such a method, hence a ClassNotFoundException can NEVER happen. Actually, the cause of the exception would not even be the method invocation, but solely the instantiation of the parameter value, which happens IN THE CALLER. We're pretty safe, I think. So I still do not see any need for separate JARs for the purpose of serving multiple JDBC levels, as long as the byte code is at Java 6 level.

 


 

Re: Pre-processing during build

From
"Markus KARG"
Date:
Vladimir,

you misunderstood my proposal. I do not want to have multiple versions of the same class. I want ONLY the JDBC42 API,
compiled down to Java 6 byte code, in one single JAR. JDBC42 is backwards compatible with all older JDBC levels, and Java 7
and 8 can load Java 6 byte code.

Also, the idea is NOT to allow Java 6 and 7 clients to INVOKE JDBC42, but just to load the JAR and invoke e.g. the JDBC4
methods that are still part of JDBC42 (again, it's backwards compatible).

Regarding your example: a Java 7 client will never try to call that method, as Java 7 clients do not even know that
such a method exists. Only Java 8 clients will know. That method is more than clearly marked as "since 1.8", so a Java 7
application programmer does not even KNOW about its pure existence!

So it boils down to verification, and I doubt that the bytecode verifier will try to actually load the java.sql.SQLType class.
Have you really tried this out, or do you have another link where it is written that the byte code verifier will CHECK
the existence of a parameter class when it verifies the loaded class?

Regards
Markus




Re: Pre-processing during build

From
"Markus KARG"
Date:

The answer is pretty simple: Try it out. :-)

 

Just compile a JRE 8 class down to byte code level 6 and load it on Java level 7. That's what I proposed. Nothing else. I really bet it will work unless you try to INSTANTIATE JRE8-only classes, but it should LOAD. And that is all we need.

 


 

Re: Pre-processing during build

From
"Markus KARG"
Date:
Well, we actually opened this discussion because we particularly WANT Maven and we DO NOT want the flexibility of Ant,
but to follow CoC (convention over configuration), hence make PGJDBC at most a simple, "standard" project with as few
customizations as possible... The less flexibility you have and the more standards you follow, the easier new programmers
can get up to speed with the project. More flexibility = more complexity = more customization = longer time to get up to
speed = fewer new programmers will join. That's the idea behind this thread, actually.




Re: Pre-processing during build

From
"Markus KARG"
Date:

Scott, I simply cannot see what this information has to do with the current thread. Can you please elaborate? Thanks.

-Markus

 


 

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
2015-06-16 22:30 GMT+03:00 Markus KARG <markus@headcrashing.eu>:
> The answer is pretty simple: Try it out. :-)
>
> Just compile a JRE 8 class down to byte code level 6 and load it on Java
> level 7. That's what I proposed. Nothing else. It really bet will work
> unless you try to INSTANTIATE JRE-only classes, but it should LOAD. And
> nothing more we need.

Markus, can you please be more explicit in your suggestion?

I did try a simple "Hello, world" and it does not run on the stock JDKs of
MacOS: https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213
Even if you manage to make that fly, it would be built on sand.

If you read the Spring Java support blog post
(https://spring.io/blog/2015/04/03/how-spring-achieves-compatibility-with-java-6-7-and-8),
you'll spot that they use separate _classes_ to isolate JDK8 features.

Vladimir


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Tue, 16 Jun 2015 17:19:27 +0300, Vladimir Sitnikov
<sitnikov.vladimir@gmail.com> wrote:
> Mark> My experience is that Ant gives you a lot more flexibility
>
> What that flexibility is good for?

I am not here to start a "my favorite build system" war; I am just saying
that for a project like a JDBC driver, which needs to target different
versions of Java, the flexibility of Ant may be necessary, and getting it
working correctly on Maven may be more complex, especially as
Ant is already in place and working.

> The drawbacks of Ant are:
> 1) No "easy way to configure IDE". This includes "download javadoc and
> source", adding dependencies to the classpath, etc, etc.

Good point.

> 2) No easy way to run tests. With maven you just hit `mvn install` or
> `mvn test` and it just works. With Ant you have to read instructions.

That is a matter of using target names that follow a convention, correct
dependencies and a sane default target (in which case just running ant
could be sufficient to build and test).

> 3) No easy way to test different versions. With maven, I can depend on
> "snapshot" versions in my client application, and I easily can
> debug&step-into.
> Unfortunately, due to #1, debugging dependencies is not that easy.

Good point, you can achieve that with Ant+Ivy, though that will complicate
IDE integration further.

> 4) No easy integration with other systems. For instance, if using
> maven you can just add findbugs, sonar, etc, etc.
> If you want to try recent Facebook's infer, you just hit `infer mvn
> install`

SonarQube, findbugs and Infer support Ant. It might require more explicit
config than in Maven, but that should be set up once and then forgotten; for infer:
infer -- ant <your build target>

> Well, gradle might be even better approach, however I have not used it
> yet, so I have nothing to say here.

I use Gradle as well, but I don't know it too well yet. It is a bit of a
maven/ant hybrid, and I am not yet sure if I'd prefer it over Ant (or
Maven).

Mark


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Tue, 16 Jun 2015 21:18:17 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> To make things a bit more clear: It is NOT about whether or not an
> interface method has a parameter of an unknown type. It is solely about
> whether that method is actually INVOKED at runtime. As a JDBC42 is
unknown
> to a JDBC3 client, that client CANNOT invoke that method, hence NEVER a
> ClassNotFoundException can ever happen. Actually the cause of the
exception
> is not even the method invocation but it is solely the instantiation of
the
> parameter value which happens IN THE CALLER. We're pretty safe I think.
So
> still I do not see any need for separate JARs for the purpose of serving
> multiple JDBC levels, as long as the byte code is Java 6 level.

When I tried it in the past (for Jaybird) I ran into problems; it might
have been an IncompatibleClassChangeError (or subclass) at classloading
time, but I can't remember the exact details. I tried reading the JLS and
JVM spec on this point, but I find it hard to come to a conclusion, so I
will try a small experiment this weekend.

However, that is only half of your problem. The other half is that you risk
using classes, methods or maybe even features that are not available in a
lower Java version, and that will not be detected at compile time, but only at
runtime (and not just in newly added JDBC 4.2 methods).

On top of that, for a compliant JDBC 4.2 implementation, you must support
java.time objects in setObject, getObject, updateObject, etc. If you want
to do this in a unified source, you will need to load a delegated
implementation that is appropriate for the Java version, otherwise you will
get ClassNotFoundException.

For example in Jaybird we did this:
https://github.com/FirebirdSQL/jaybird/blob/master/src/main/org/firebirdsql/jdbc/field/FBField.java#L739

Which uses (for JDBC 4.2):
https://github.com/FirebirdSQL/jaybird/blob/master/src/jdbc_42/org/firebirdsql/jdbc/field/JDBC42ObjectConverter.java

And for JDBC 4.1 (and 4.0 in Jaybird 2.2):
https://github.com/FirebirdSQL/jaybird/blob/master/src/main/org/firebirdsql/jdbc/field/DefaultObjectConverter.java
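
A rough sketch of that delegation idea (hypothetical names and a much simplified interface, not Jaybird's actual code): the java.time-aware converter lives in a class that is only looked up by name once we know the runtime supports it.

import java.sql.SQLException;

interface ObjectConverter {
    // returns true if it handled the value
    boolean setObject(int index, Object value) throws SQLException;
}

class DefaultObjectConverter implements ObjectConverter {
    public boolean setObject(int index, Object value) {
        return false; // no java.time support on older Java versions
    }
}

// References java.time, so it must only be loaded (via Class.forName) on Java 8+.
class Jdbc42ObjectConverter implements ObjectConverter {
    public boolean setObject(int index, Object value) throws SQLException {
        if (value instanceof java.time.LocalDateTime) {
            // convert and set the value here
            return true;
        }
        return false;
    }
}

class ObjectConverterFactory {
    static ObjectConverter create() {
        try {
            Class.forName("java.time.LocalDateTime");   // probe: are we on Java 8+?
            return (ObjectConverter) Class.forName("Jdbc42ObjectConverter")
                    .getDeclaredConstructor().newInstance();
        } catch (Exception e) {
            return new DefaultObjectConverter();
        }
    }
}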

Mark


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Tue, 16 Jun 2015 21:26:12 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> So it boils down to verification and I doubt that the bytecode verifier
> will try to actually load java.sql.Type class. Have you really tried
this
> out or do you have another link where it is written that the byte code
> verifier will CHECK the existence of a parameter class when it verifies
the
> loaded class?

It looks like the JLS and JVM specification allow for resolution of
symbolic references at classloading or "first use" and a JVM implementation
is free to choose: see
http://docs.oracle.com/javase/specs/jls/se8/html/jls-12.html#jls-12.1.2 and
http://docs.oracle.com/javase/specs/jls/se8/html/jls-12.html#jls-12.3

Although I am not entirely sure about this, I interpret it to mean that loading
a class that has a method whose signature includes a type that is not
available (like SQLType on Java 7 or lower) could work on one JVM
implementation (late resolution), but not on others (early resolution).

Mark



Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 00:02:40 +0300, Vladimir Sitnikov
<sitnikov.vladimir@gmail.com> wrote:
> 2015-06-16 22:30 GMT+03:00 Markus KARG <markus@headcrashing.eu>:
>> The answer is pretty simple: Try it out. :-)
>>
>> Just compile a JRE 8 class down to byte code level 6 and load it on
Java
>> level 7. That's what I proposed. Nothing else. It really bet will work
>> unless you try to INSTANTIATE JRE-only classes, but it should LOAD. And
>> nothing more we need.
>
> Markus, can you please be more explicit in your suggestion?
>
> I did try a simple "Hello, world" and it does not run in stock JDKs of
> MacOS: https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213
> Even if you manage to make that fly, that would be built on sand.

Good example: it demonstrates at least that using reflection (eg
getDeclaredMethods; or in this case privateGetDeclaredMethods) will lead to
a NoClassDefFoundError.

Mark


Re: Pre-processing during build

From
Dave Cramer
Date:

On 17 June 2015 at 03:07, Mark Rotteveel <mark@lawinegevaar.nl> wrote:
On Wed, 17 Jun 2015 00:02:40 +0300, Vladimir Sitnikov
<sitnikov.vladimir@gmail.com> wrote:
> 2015-06-16 22:30 GMT+03:00 Markus KARG <markus@headcrashing.eu>:
>> The answer is pretty simple: Try it out. :-)
>>
>> Just compile a JRE 8 class down to byte code level 6 and load it on
Java
>> level 7. That's what I proposed. Nothing else. It really bet will work
>> unless you try to INSTANTIATE JRE-only classes, but it should LOAD. And
>> nothing more we need.
>
> Markus, can you please be more explicit in your suggestion?
>
> I did try a simple "Hello, world" and it does not run in stock JDKs of
> MacOS: https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213
> Even if you manage to make that fly, that would be built on sand.

Good example: it demonstrates at least that using reflection (eg
getDeclaredMethods; or in this case privateGetDeclaredMethods) will lead to
a NoClassDefFoundError.


I'm not sure this is a great example, as Optional itself is a Java 8 construct.

Either way, Spring is able to do this, as are others?


Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca
 

Re: Pre-processing during build

From
Sehrope Sarkuni
Date:
On Wed, Jun 17, 2015 at 6:15 AM, Dave Cramer <pg@fastcrypt.com> wrote:
I'm not sure this is a great example as Optional itself is a java 8 construct.

Either way Spring is able to do this, as are others?
 
The approach used by Spring won't work for the JDBC driver. The crux of the issue is that the newest version of the JDBC spec includes Java 8 types in the method signatures of public interfaces that already exist in older Java versions. Spring doesn't do that.

The public interfaces and classes for the older JDK versions they support (i.e. 6 or 7) only expose types that exist in those JDK versions. For older classes they've added internal support for Java 8 types that is dynamically checked, but it's done by wrapping the integration in an inner class. Here's an example: https://github.com/spring-projects/spring-framework/blob/f41de12cf62aebf1be9b30be590c12eb2c030853/spring-beans/src/main/java/org/springframework/beans/AbstractNestablePropertyAccessor.java#L1041

There's no way to make that work when a public interface exposes classes that won't exist at run time. It may have been possible with older upgrades to the JDBC spec (ex: 4 to 4.1), as there weren't any JDK-1.7-only classes used in method signatures of existing public interfaces. Compiling with an older bytecode target would allow an older JDK to simply ignore those methods, as they would not be part of the public signature.

In JDBC 4.2 that's not true though. For example, the JDBC 4.2 PreparedStatement interface has a new setObject(...) that uses a Java-8-only class:
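
(The signature in question, as given in the javadoc Vladimir linked earlier:)

// JDBC 4.2 addition to java.sql.PreparedStatement; java.sql.SQLType only exists on Java 8,
// yet the method is part of the public interface the driver has to implement.
default void setObject(int parameterIndex, Object x, SQLType targetSqlType)
        throws SQLException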



That method signature can't appear in a driver that is going to be used in JDK 6 or 7. There's no way to hide it internally as it's part of the public signature.

We're going to need some kind of preprocessing step to handle things like this.

Regards,
-- Sehrope Sarkuni
Founder & CEO | JackDB, Inc. | https://www.jackdb.com/

Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 06:15:47 -0400, Dave Cramer <pg@fastcrypt.com> wrote:
> On 17 June 2015 at 03:07, Mark Rotteveel <mark@lawinegevaar.nl> wrote:
>
>> On Wed, 17 Jun 2015 00:02:40 +0300, Vladimir Sitnikov
>> <sitnikov.vladimir@gmail.com> wrote:
>> > 2015-06-16 22:30 GMT+03:00 Markus KARG <markus@headcrashing.eu>:
>> >> The answer is pretty simple: Try it out. :-)
>> >>
>> >> Just compile a JRE 8 class down to byte code level 6 and load it on
>> Java
>> >> level 7. That's what I proposed. Nothing else. It really bet will
work
>> >> unless you try to INSTANTIATE JRE-only classes, but it should LOAD.
>> >> And
>> >> nothing more we need.
>> >
>> > Markus, can you please be more explicit in your suggestion?
>> >
>> > I did try a simple "Hello, world" and it does not run in stock JDKs
of
>> > MacOS: https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213
>> > Even if you manage to make that fly, that would be built on sand.
>>
>> Good example: it demonstrates at least that using reflection (eg
>> getDeclaredMethods; or in this case privateGetDeclaredMethods) will
lead
>> to
>> a NoClassDefFoundError.
>>
>>
> I'm not sure this is a great example as Optional itself is a java 8
> construct.

Yes, and so is java.sql.SQLType. So if this doesn't work for Optional, it
also won't work for SQLType.

> Either way Spring is able to do this, as are others?

Spring uses a lot of reflection, proxies, byte code generation, etc to get
things done. I am not sure if you want to go that way.

Mark


Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
>We're going to need some kind of preprocessing step to handle things like this.

Sehrope, Mark, sorry, I do not follow you.
I do not see how that shows preprocessing is required.

For instance (sketch):

// in package org.postgresql.jdbc42
class PreparedStatement extends org.postgresql.jdbc4.PreparedStatement {
  public void setObject(int parameterIndex, Object x, SQLType targetSqlType)
      throws SQLException {
     ...
  }
}

It does not require "reflection, proxies, byte code generation", etc.
It is just a simple "extends".

Vladimir


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 14:17:51 +0300, Vladimir Sitnikov
<sitnikov.vladimir@gmail.com> wrote:
>>We're going to need some kind of preprocessing step to handle things
like
>>this.
>
> Sehrope, Marr, sorry, I do not follow you.
> I do not see how you prove preprocessing is required.

I was talking about the situation where you have a single implementation
of the interfaces, but even the way of working below has its problems.

> For instance,
>
> org.postgresql.jdbc42.PreparedStatement extends
> org.postgresql.jdbc4.PreparedStatement {
>   public setObject(int parameterIndex, Object x, SQLType targetSqlType)
{
>      ...
>   }
> }
>
> It does not require "reflection, proxies, byte code generation", etc,
etc.
> It is just a simple "extends".

That is almost what PostgreSQL uses now, but this is not going to work if
you compile with Java 8 and assume that
org.postgresql.jdbc4.PreparedStatement would then also be new-able when the
same jar is used under Java 6 or 7, because
org.postgresql.jdbc4.PreparedStatement would need to be abstract at compile
time, as it doesn't contain the methods required by the Java 8 (JDBC 4.2)
API during compilation.

So getting this to work would need some form of reflection (to get the
right type at runtime based on the Java version), some preprocessing (as
done currently) to get around the compilation problem, some byte code
generation/modification to "unabstract"
org.postgresql.jdbc4.PreparedStatement after compilation, or some form of
tiered compilation (where org.postgresql.jdbc4.PreparedStatement is
compiled with Java 7, and org.postgresql.jdbc42.PreparedStatement with Java
8); this might be more complex than the existing solution.

Mark


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 13:52:52 +0200, Mark Rotteveel <mark@lawinegevaar.nl>
wrote:
> That is almost what PostgreSQL uses now, but this is not going to work
if
> you compile with Java 8 and assume that
> org.postgresql.jdbc4.PreparedStatement would then also be new-able when
the
> same jar is used under Java 6 or 7, because
> org.postgresql.jdbc4.PreparedStatement would need to be abstract at
compile
> time as it doesn't contain the methods required by the Java 8 (JDBC 4.2)
> API during compilation.
>
> So getting this to work would need some form of reflection (to get the
> right type at runtime based on the Java version), some preprocessing (as
> done currently) to get around the compilation problem or some byte code
> generation/modification to "unabstract"
> org.postgresql.jdbc4.PreparedStatement after compilation, or some form
of
> tiered compilation (where the org.postgresql.jdbc4.PreparedStatement is
> compiled with Java 7, and org.postgresql.jdbc42.PreparedStatement with
Java
> 8; this might be more complex than the existing solution.

I just realized it might actually work: some (maybe all) methods added in
the JDBC API for Java 8 were added as default interface methods (with an
implementation that throws UnsupportedOperationException), so compilation
would succeed for org.postgresql.jdbc4.PreparedStatement without having an
implementation for the new methods.

You'd still need reflection or another trick to decide, based on the Java
version, which classes (Statement, PreparedStatement, ResultSet, etc.) to
instantiate.

Mark


Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
>So getting this to work would need some form of reflection (to get the
right type at runtime based on the Java version)

Reflection is sufficient, isn't it?

I think reflection would be a rather clean solution here.
It is more IDE-friendly and developer-friendly than pre-processing.
Reflection is much easier to debug than byte-code generation.

So, do you see drawbacks in using reflection to select the specific
implementation?

From an implementation point of view it just replaces "preprocessing"
with some "Class.forName", and that is it.
It even allows shipping the same jar file and selecting the implementation on the fly.
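
For instance, something along these lines (a minimal sketch; the class names are purely illustrative, and the real constructors would of course take arguments):

import java.sql.PreparedStatement;

class StatementFactory {
    // Probe the runtime once and remember which implementation to use.
    private static final String IMPL_CLASS = pickImplClass();

    private static String pickImplClass() {
        try {
            Class.forName("java.sql.SQLType");              // present => Java 8 / JDBC 4.2
            return "org.postgresql.jdbc42.PgPreparedStatement42";
        } catch (ClassNotFoundException e) {
            return "org.postgresql.jdbc4.PgPreparedStatement4";
        }
    }

    static PreparedStatement create() throws Exception {
        return (PreparedStatement) Class.forName(IMPL_CLASS)
                .getDeclaredConstructor().newInstance();
    }
}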

Vladimir


Re: Pre-processing during build

From
Christopher BROWN
Date:
As has been said by Markus KARG and others, you CAN produce a driver using bytecode for Java-6 / JDBC-4, including Java-8 / JDBC-4.2 (and Java-7 / JDBC-4.1) types and method signatures.  I've used this technique in many production applications, and have tried out this specific case just now, as a check.

So, for example, if you use "javac" from Java-8 and even include the Java-8 (JDBC-4.2) definition of PreparedStatement, compiling with source=1.6 and target=1.6 options, your JDBC-4.2 implementation of PreparedStatement WILL load into Java-6 with JDBC-4.  You can even annotate your implementations with @Override, no problem.  You will NOT have a problem with clients that expect a Java-6 / JDBC-4 API because there is no way you can compile such a client to invoke a JDBC-4.2 method (it would be a compiler error).  Java only tries to resolve classes on-demand, that is when it runs a code branch in a method body that refers to a type or invokes a method with such a type as part of its signature, and NOT when loading or instantiating your class.  If you never call it, you'll never have a problem.

You WILL have problems however in the following (avoidable) cases:

- if your implementation of a JDBC-4 driver calls code that in turn refers to types, fields, or methods that depend on a more recent method of the Java API, for example :
  - static initialization
  - constructor calls to code that depends on a more recent API version
  - an implementation of a JDBC-4 method that calls a JDBC-4.1 or -4.2 method (typically method overloading with the noble intention of avoiding copying-and-pasting code)
  - as has been suggested, the safest workaround is to just use "extends" where appropriate, instead of generating code from templates
- use of reflection (or proxies) to examine classes or invoke methods
- use of BeansIntrospector

The problem is not in compiling, it's about ensuring that once client code invokes a JDBC-4 method, your implementation of that method doesn't in turn call any code that it shouldn't.

Code coverage metrics are an additional guarantee, but you'd have to be very sure you've got correct coverage for all version-dependent code paths.  Compiler constraints are probably safer; I'll discuss that in a moment.

First, a few remarks concerning some of the previous posts:

https://docs.oracle.com/javase/8/docs/api/java/sql/PreparedStatement.html#setObject-int-java.lang.Object-java.sql.SQLType- is actually implemented as a Java-8 "default" method.  You don't need to implement it directly in the driver.

https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213 is a doomed-to-fail example, not due to bytecode versions but because Java uses reflection (see above list of problems) to find your "main" method, and so trips up on the method using Java-8 types.  Restructured as follows (two classes with separate source files), it works:

----8<---- Jre7Test.java ----8<----

public class Jre7Test {
    public static void main(String args[]) {
        System.out.println(Jre7TestCompanion.greeting());
    }
}

----8<---- Jre7TestCompanion.java ----8<----

import java.time.Duration;
import java.util.Optional;
 
public class Jre7TestCompanion {
    public static Optional<Duration> optional(java.time.Duration duration) {
        return Optional.of(duration);
    }
 
    public static String greeting() {
        return "Hello, world";
    }
}

----8<--------8<----

(the above compiled and run with the exact same commands on Mac OS X too).

The safest way is to use incremental compilation (all integrated into a single automated build, with no preference for build tool).  Using fictional package and class names to demonstrate the idea, here's how it could be done.

For example, produce an intermediate "pgjdbc_4.0.jar" using a "JDBC-4" package (1.6 API as bootstrap classpath for "javac", with 1.8 compiler and source/target 1.6):

jdbc_4_0.PGDriver_4_0
jdbc_4_0.PGPreparedStatement_4_0
jdbc_4_0.PGResultSet_4_0
...etc

Add the resulting "jar" to the classpath for the next step, with classes that extend the above, producing "pgjdbc_4.1.jar" (useless unless "pgjdbc_4.0.jar" is also in the classpath).  (1.7 API as bootstrap classpath for "javac", with 1.8 compiler and source/target 1.6)

jdbc_4_1.PGDriver_4_1 extends jdbc_4_0.PGDriver_4_0
jdbc_4_1.PGPreparedStatement_4_1 extends jdbc_4_0.PGPreparedStatement_4_0
jdbc_4_1.PGResultSet_4_1 extends jdbc_4_0.PGResultSet_4_0
...etc

Then add both resulting jars to the classpath for the next step, with classes that extend the above, producing "pgjdbc_4.2.jar" (useless unless "pgjdbc_4.0.jar" is also in the classpath, along with "pgjdbc_4.1.jar"). (1.8 API as bootstrap classpath for "javac", with 1.8 compiler and source/target 1.6)

jdbc_4_2.PGDriver_4_2 extends jdbc_4_1.PGDriver_4_1
jdbc_4_2.PGPreparedStatement_4_2 extends jdbc_4_1.PGPreparedStatement_4_1
jdbc_4_2.PGResultSet_4_2 extends jdbc_4_1.PGResultSet_4_1
...etc

Then, merge all JARs into a single JAR.  Clients could then refer to the specific driver version they require in code, or use a generic Driver class that (in the constructor) detects the appropriate JDBC version and fixes a "final" int or Enum field, used thereafter in "switch" blocks to call the appropriate driver version, acting as a lightweight proxy when the specific driver version can't be referred to (for backwards compatibility).  More adventurous developers might even suggest using method handles from Java 7 onwards to eliminate the negligible overhead of the switch statements, but I'd personally rely on the JVM to optimise that away. Note that this is only necessary for the Driver implementation, as no-one (apart from the driver implementors) should ever call "new PreparedStatement" or whatever.

Hope that helps; hope it's not redundant with regard to messages sent since I started typing away my 2 cents...  In any case, I regularly use these techniques in production code with no accidents.

--
Christopher





Re: Pre-processing during build

From
dmp
Date:
Christopher BROWN wrote:

> Then, merge all JARs into a single JAR.  Clients could then refer to the
> specific driver version they require in code, or use a generic Driver class that
> (in the constructor) detects the appropriate JDBC version and fixes a "final"
> int or Enum field, used thereafter in "switch" blocks to call the appropriate
> driver version, acting as a lightweight proxy when the specific driver version
> can't be referred to (for backwards compatibility).  More adventurous developers
 > ..........................................

Somehow, as someone who maintains a generic database access tool, I don't like the
sound of this requirement. Why, as a client developer, should I have to detect
the appropriate Java version and then somehow figure out the user's requirement
for which Driver class to call in your JDBC driver? I don't have to do this for any other
database, so why for PostgreSQL's JDBC driver?

It may be of no concern really, but that is going to require me to change
the coding in my client for instantiating your Driver class, which is the
same for all databases so far, all so that you can change your build process,
which does not appear to be broken.

How about backing up and stating the pros and cons for initiating
the change in the build process again? Then highlight what additional work
would be required in the code, etc. to accomplish the new build process. Then
the list could give input on the proposal. Maybe that has already taken place and
I missed it?

danap.




Re: Pre-processing during build

From
Christopher BROWN
Date:
The idea, for administrators and client developers, is that you wouldn't need to change anything, and you wouldn't need to pick a driver version.

The idea being that, instead of there being different versions of the driver (for different levels of JDBC compatibility), there could be one single binary.  For type safety, and to avoid creating source files from template pre-processing, the binary package could contain one driver implementation for each implemented JDBC version, each more recent version extending the previous one.  Managing this as a client would obviously be a mess, so my suggestion, if this approach seemed like an appropriate solution (I'm not pushing for it either, but I don't maintain the project...), is that "org.postgresql.Driver" could be reimplemented to delegate to the most recent driver implementation without changing client code.

This could be done (in the driver code, not in client code) using an approach like this (hope the indentation doesn't disappear when sending the e-mail...):

private final java.sql.Driver impl;

// constructor
Driver()
{
  java.sql.Driver impl = null;
  try {
    Class.forName("java.sql.SQLType");
    impl = new PGDriverV8(); // if the above worked => Java 8
  } catch (ClassNotFoundException e) {
    // not Java 8
  }
  if (impl == null) {
    try {
      java.sql.Connection.class.getMethod("setSchema", String.class);
      impl = new PGDriverV7(); // else, if the above worked => Java 7
    } catch (NoSuchMethodException e) {
      // not Java 7
    }
  }
  if (impl == null) {
    impl = new PGDriverV6(); // else, fall back to the minimum supported version
  }
  this.impl = impl;
}

// methods simply delegate to the detected implementation, e.g.:
public java.sql.Connection connect(String url, java.util.Properties info) throws java.sql.SQLException
{
  return impl.connect(url, info);
}

The above should demonstrate the idea, even if it's incomplete.

--
Christopher




Re: Pre-processing during build

From
dmp
Date:
Christopher BROWN wrote:
> The idea, for administrators and client developers, is that you wouldn't need to
> change anything, and you wouldn't need to pick a driver version.
>
> The idea being, that instead of their being different versions of the driver
> (for different levels of JDBC compatibility) is that there could be one single
> binary.  For type safety and to avoid creating source files from template
> pre-processing, the binary package could contain one driver implementation for
> each implemented JDBC version, each more recent version extending the previous
> version.  Managing this as a client would obviously be a mess, so my suggestion,
> if this approach seemed like an appropriate solution (I'm not pushing for it
> either, but I don't maintain the project...) is that "org.postgresql.Driver"
> could be reimplemented to delegate to the most recent driver implementation
> without changing client code.
>
> This could be done (in the driver code, not in client code) using an approach
> like this (hope the indentation doesn't disappear when sending the e-mail...):
>
 > ~
 > ~

Thank you for the clarification.
danap

danap wrote:
>     Somehow as someone who manages a generic database access tool I don't like the
>     sounds of this requirement. Why as a client developer should I have to detect
>     the appropriate Java Version then somehow figure out the user's requirement
>     for the Driver class to call in your JDBC? I don't have to do this for any other
>     database so why for PostgreSQL's JDBC.
>
>     It may be of no concern really, but that is going to require me to change
>     the coding in my client for instantiating your Driver class, which is the
>     same for all databases so far, all so that you can change your build process,
>     which does not appear to be broken.
>
>     How about backup and state the one, two, three pros, and cons for initiating
>     the change in the build process again. Then highlight what additional work
>     would be required in the code, etc. to accomplish the new build process. Then
>     the list could input on the proposal. Maybe that has already taken place and
>     I missed it?
>
>     danap.



Re: Pre-processing during build

From
"Markus KARG"
Date:
>When I tried it in the past (for Jaybird) I ran into problems, it might
have been an IncompatibleClassChangeError (or subclass) on classloading
time, but I can't remember the exact details. I tried reading the JLS and
JVM spec on this point, but I find it hard to come to a conclusion, so I
will try a small experiment this weekend.

Let's base further discussions on your weekend's test results aka hard facts. :-)

>However that is half of your problem. The other half is that you risk
using classes, methods or maybe even features that are not available in a
lower Java version and that will not be detected compile time, but only at
runtime (and not just in newly added JDBC 4.2 methods).

This can most simply be detected by a "JRE 6 compliance" test run with the JRE 6 runtime library explicitly on the bootclasspath.

-Markus



Re: Pre-processing during build

From
Dave Cramer
Date:
There is a mavenized pull request here https://github.com/pgjdbc/pgjdbc/pull/322 which should make this experiment a bit easier

Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

On 17 June 2015 at 14:38, Markus KARG <markus@headcrashing.eu> wrote:
>When I tried it in the past (for Jaybird) I ran into problems, it might
have been an IncompatibleClassChangeError (or subclass) on classloading
time, but I can't remember the exact details. I tried reading the JLS and
JVM spec on this point, but I find it hard to come to a conclusion, so I
will try a small experiment this weekend.

Let's base further discussions on your weekend's test results aka hard facts. :-)

>However that is half of your problem. The other half is that you risk
using classes, methods or maybe even features that are not available in a
lower Java version and that will not be detected compile time, but only at
runtime (and not just in newly added JDBC 4.2 methods).

This can most simply be detected by a "JRE 6 compliance" test run with the JRE 6 runtime library explicitly on the bootclasspath.

-Markus



--
Sent via pgsql-jdbc mailing list (pgsql-jdbc@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-jdbc

Re: Pre-processing during build

From
"Markus KARG"
Date:
Mark,

the chapters you describe explicitly list the exceptions that can happen, and their causes. None of them covers "a class is referenced that has a non-referenced method that uses a non-existent class".

Even with early resolution, the JLS does not cover the case we have, which is containing non-existent classes in non-referenced methods of referenced classes!

The devil's in the details, right? ;-)

Hence we're still safe.

If that won't work, nobody could write a Java EE program and compile and test it against the official javaee.jar, as that one does not even contain ANY byte code but SOLELY declarations. And that one's an official JAR from the makers of Java!

Regards
-Markus

-----Original Message-----
From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Mark Rotteveel
Sent: Mittwoch, 17. Juni 2015 09:03
To: List
Subject: Re: [JDBC] Pre-processing during build

On Tue, 16 Jun 2015 21:26:12 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> So it boils down to verification and I doubt that the bytecode
> verifier will try to actually load java.sql.Type class. Have you
> really tried
this
> out or do you have another link where it is written that the byte code
> verifier will CHECK the existence of a parameter class when it
> verifies
the
> loaded class?

It looks like the JLS and JVM specification allow for resolution of symbolic references at classloading or "first use", and a JVM implementation is free to choose: see
http://docs.oracle.com/javase/specs/jls/se8/html/jls-12.html#jls-12.1.2 and
http://docs.oracle.com/javase/specs/jls/se8/html/jls-12.html#jls-12.3

Although I am not entirely sure about this, I interpret this as meaning that loading a class that has a method whose signature includes a type that is not available (like SQLType on Java 7 or lower) could work on one JVM implementation (late resolution), but not on others (early resolution).

Mark



--
Sent via pgsql-jdbc mailing list (pgsql-jdbc@postgresql.org) To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-jdbc



Re: Pre-processing during build

From
"Markus KARG"
Date:
You got fooled by the fact that there is a difference between loading the very initial main class at JVM bootstrap, and loading ANY OTHER subsequent class. This difference is clearly documented in chapters JLS 12.1 (initial class) and JLS 12.2 (other classes).

After correcting your example it works pretty well, see: https://gist.github.com/mkarg/88a89ae0dbffcfb7543e


-----Original Message-----
From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Mark Rotteveel
Sent: Mittwoch, 17. Juni 2015 09:07
To: List
Subject: Re: [JDBC] Pre-processing during build

On Wed, 17 Jun 2015 00:02:40 +0300, Vladimir Sitnikov
<sitnikov.vladimir@gmail.com> wrote:
> 2015-06-16 22:30 GMT+03:00 Markus KARG <markus@headcrashing.eu>:
>> The answer is pretty simple: Try it out. :-)
>>
>> Just compile a JRE 8 class down to byte code level 6 and load it on
Java
>> level 7. That's what I proposed. Nothing else. It really bet will work
>> unless you try to INSTANTIATE JRE-only classes, but it should LOAD. And
>> nothing more we need.
>
> Markus, can you please be more explicit in your suggestion?
>
> I did try a simple "Hello, world" and it does not run in stock JDKs of
> MacOS: https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213
> Even if you manage to make that fly, that would be built on sand.

Good example: it demonstrates at least that using reflection (eg
getDeclaredMethods; or in this case privateGetDeclaredMethods) will lead to
a NoClassDefFoundError.

Mark


--
Sent via pgsql-jdbc mailing list (pgsql-jdbc@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-jdbc



Re: Pre-processing during build

From
Stephen Nelson
Date:

The discussion has veered off topic somewhat into build systems and their respective qualities.

My pull request is proposing to change the build system from Ant to Maven. All existing functionality in the Ant build has been/will be replicated in the Maven build - including the pre-processing step. There are many advantages to using Maven as opposed to Ant. However, this is orthogonal to the discussion I'd like to have about the pre-processing used during the build process.

Focusing on the aim of the current pre-processing step - which is to select the appropriate spec implementation at compile time - the options seem quite complicated to me as a relative novice in the finer points of the language spec.

Taking a step back, could this not be achieved using annotations and a custom annotation processor to emit some code to select the correct implementation of the spec? This seems more Java-like and not a massive change from the existing code.
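For illustration only, a rough sketch of what such a processor could look like. Everything here is hypothetical (the @JdbcVersionSelect annotation, the processor class, the generated DriverSelector and the hard-coded implementation name are all made up); it merely shows the standard javax.annotation.processing machinery that could stand in for the .java.in templates:

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;
import javax.tools.JavaFileObject;

// Runs inside javac whenever some class is annotated with the hypothetical
// @JdbcVersionSelect annotation, and emits a tiny selector class instead of
// relying on pre-processed .java.in templates.
@SupportedAnnotationTypes("org.postgresql.annotation.JdbcVersionSelect")
@SupportedSourceVersion(SourceVersion.RELEASE_6)
public class JdbcVersionSelectProcessor extends AbstractProcessor {

    private boolean generated;

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        if (generated || roundEnv.processingOver()) {
            return false;
        }
        generated = true;
        try {
            JavaFileObject file = processingEnv.getFiler()
                    .createSourceFile("org.postgresql.generated.DriverSelector");
            Writer w = file.openWriter();
            try {
                w.write("package org.postgresql.generated;\n"
                      + "public final class DriverSelector {\n"
                      + "  public static String implementationClassName() {\n"
                      + "    // in a real build this name would come from a -A option passed to javac\n"
                      + "    return \"org.postgresql.jdbc42.Jdbc42Driver\";\n"
                      + "  }\n"
                      + "}\n");
            } finally {
                w.close();
            }
        } catch (IOException e) {
            processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, e.toString());
        }
        return false;
    }
}

A processor like this runs inside javac itself, so the generated source is visible to the compiler and to IDEs that support annotation processing, which would also address the "does not compile in an IDE" complaint.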

Re: Pre-processing during build

From
"Markus KARG"
Date:
Sorry but you're wrong here.

Vladimir's example was invalid. See https://gist.github.com/mkarg/88a89ae0dbffcfb7543e instead.

Resolution will not fail. Even early static resolution won't. Check again the chapter of the JLS about the criteria to throw the listed exceptions. None of them is met with the corrected example https://gist.github.com/mkarg/88a89ae0dbffcfb7543e.

The direct example will NOT fail on Java 7 as the method is not changed, but added. The original method is still in place. Both methods exist at the same time and differ by signature, hence the linker of the old program finds the old method in the new class. No problem at all.

The indirect example will NOT fail on Java 7 as a JRE 7 client will never pass in an instance of a JRE 8 type (how should it even know of its existence?), and the Java 6 machine executing the invoked method will never INSTANTIATE that type, so it will not fail. No problem at all.

Still should work. You'd possibly like to set up a proof using the corrected example https://gist.github.com/mkarg/88a89ae0dbffcfb7543e. :-)

Regards
-Markus

-----Original Message-----
From: Mark Rotteveel [mailto:mark@lawinegevaar.nl]
Sent: Mittwoch, 17. Juni 2015 09:18
To: Markus KARG
Subject: RE: [JDBC] Pre-processing during build

On Tue, 16 Jun 2015 21:35:33 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> On Mon, 15 Jun 2015 23:59:45 +0200, "Markus KARG"
<markus@headcrashing.eu>
> wrote:
>> For Java 6 and 7 it would be possible to use the same codebase (if you
> ignore certain types iirc), but for Java 8 you need a separate library
due
> to the introduction of the SQLType interface. And if you fully support
JDBC
> 4.2, you also need the classes from java.time.
>
> Actually I doubt that truely. The idea is NOT to support JDBC42 on Java
6.
> The idea is that Java 6 loads our JAR and calls JDBC4 methods. The fact
> that the same JAR also contains Java 8 types like java.time or the new
> SQLType is invisible to the JRE 6 client application, as that one is
> written against JDBC4, hence does neither INSTANTIATE that types NOR
> invokes the methods using them as parameters!

Vladimir already demonstrated one problem, at least when using reflection
(which is not atypical with a lot of tools that use JDBC and with
connections pools). And I already pointed at the resolution phase as
described in the JLS that might happen at classloading time.

>> JDBC APIs themselves are not backwards compatible (ie: they introduce
new
> types in the signature, or require you to handle new types in existing
> methods), the API is only backwards compatible from the perspective of
the
> user.
>
> I think that's pretty enough. Nobody forces is to use Oracle's JARs. The
> driver just has to fulil the API towards the client application. That's
why
> it's called an API (not SPI). Can you please make an example what API is
> incompatible between JDBC4 and JDBC42?

An example of direct API incompatibility, use of a type new in Java 8
(SQLType):
ResultSet.updateObject(int columnIndex, Object x, SQLType targetSqlType)
http://docs.oracle.com/javase/8/docs/api/java/sql/ResultSet.html#updateObject-int-java.lang.Object-java.sql.SQLType-

An example of indirect API incompatibility: requirement to support
java.time types (added in Java 8) in get/set/updateObject, although that
can be worked around.

Mark




Re: Pre-processing during build

From
"Markus KARG"
Date:

Even plain Java can do this, once the example is corrected: https://gist.github.com/mkarg/88a89ae0dbffcfb7543e

 

 

From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Dave Cramer
Sent: Mittwoch, 17. Juni 2015 12:16
To: Mark Rotteveel
Cc: List
Subject: Re: [JDBC] Pre-processing during build

 

 

On 17 June 2015 at 03:07, Mark Rotteveel <mark@lawinegevaar.nl> wrote:

On Wed, 17 Jun 2015 00:02:40 +0300, Vladimir Sitnikov
<sitnikov.vladimir@gmail.com> wrote:
> 2015-06-16 22:30 GMT+03:00 Markus KARG <markus@headcrashing.eu>:
>> The answer is pretty simple: Try it out. :-)
>>
>> Just compile a JRE 8 class down to byte code level 6 and load it on
Java
>> level 7. That's what I proposed. Nothing else. It really bet will work
>> unless you try to INSTANTIATE JRE-only classes, but it should LOAD. And
>> nothing more we need.
>
> Markus, can you please be more explicit in your suggestion?
>
> I did try a simple "Hello, world" and it does not run in stock JDKs of
> MacOS: https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213
> Even if you manage to make that fly, that would be built on sand.

Good example: it demonstrates at least that using reflection (eg
getDeclaredMethods; or in this case privateGetDeclaredMethods) will lead to
a NoClassDefFoundError.

 

I'm not sure this is a great example as Optional itself is a java 8 construct.

 

Either way Spring is able to do this, as are others?

 


Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

 

Re: Pre-processing during build

From
"Markus KARG"
Date:

Can you proof your assumptions using the corrected example: https://gist.github.com/mkarg/88a89ae0dbffcfb7543e ?

 

From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Sehrope Sarkuni
Sent: Mittwoch, 17. Juni 2015 13:05
To: Dave Cramer
Cc: Mark Rotteveel; List
Subject: Re: [JDBC] Pre-processing during build

 

On Wed, Jun 17, 2015 at 6:15 AM, Dave Cramer <pg@fastcrypt.com> wrote:

I'm not sure this is a great example as Optional itself is a java 8 construct.

 

Either way Spring is able to do this, as are others?

 

The approach used by Spring won't work for the JDBC driver. The crux of the issue is that the newest version of the JDBC spec includes Java 8 types in the method signatures of public interfaces that already exist in older Java versions. Spring doesn't do that.

 

The public interfaces and classes for the older JDK versions they support (i.e. 6 or 7) only expose types that exist in those JDK versions. For older classes they've added internal support for Java 8 types that is dynamically checked, but it's done by wrapping the integration in an inner class. Here's an example: https://github.com/spring-projects/spring-framework/blob/f41de12cf62aebf1be9b30be590c12eb2c030853/spring-beans/src/main/java/org/springframework/beans/AbstractNestablePropertyAccessor.java#L1041

 

There's no way to make that work when a public interface exposes classes that won't exist at run time. It may have been possible with older upgrades to the JDBC spec (ex: 4 to 4.1) as there weren't any JDK 1.7-only classes used in method signatures of existing public interfaces. Compiling with an older bytecode target would allow an older JDK to simply ignore those methods as they would not be part of the public signature.

 

In JDBC 4.2 that's not true though. For example the JDBC 4.2 PreparedStatement interface has a new setObject(...) that uses a Java 8 only class:

java.sql.PreparedStatement.setObject(int parameterIndex, Object x, java.sql.SQLType targetSqlType)

That method signature can't appear in a driver that is going to be used in JDK 6 or 7. There's no way to hide it internally as it's part of the public signature.

 

We're going to need some kind of preprocessing step to handle things like this.

 

Regards,

-- Sehrope Sarkuni

Founder & CEO | JackDB, Inc. | https://www.jackdb.com/

 

Re: Pre-processing during build

From
"Markus KARG"
Date:
On Wed, 17 Jun 2015 13:52:52 +0200, Mark Rotteveel <mark@lawinegevaar.nl>
wrote:
> That is almost what PostgreSQL uses now, but this is not going to work
if
> you compile with Java 8 and assume that
> org.postgresql.jdbc4.PreparedStatement would then also be new-able when
the
> same jar is used under Java 6 or 7, because
> org.postgresql.jdbc4.PreparedStatement would need to be abstract at
compile
> time as it doesn't contain the methods required by the Java 8 (JDBC 4.2)
> API during compilation.
>
> So getting this to work would need some form of reflection (to get the
> right type at runtime based on the Java version), some preprocessing (as
> done currently) to get around the compilation problem or some byte code
> generation/modification to "unabstract"
> org.postgresql.jdbc4.PreparedStatement after compilation, or some form
of
> tiered compilation (where the org.postgresql.jdbc4.PreparedStatement is
> compiled with Java 7, and org.postgresql.jdbc42.PreparedStatement with
Java
> 8; this might be more complex than the existing solution.

>I just realized it might actually work: some (maybe all) methods added in
the JDBC API for Java 8 were added as default interface methods (with an
implementation that throws UnsupportedOperationException), so compilation
would succeed for org.postgresql.jdbc4.PreparedStatement without having an
implementation for the new methods.

Yay, seems you finally noticed we guys at the Java EE EGs are not so dumb as people might think! Honestly, what do you think we actually invented default methods for, if not for this case...? ;-)
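To make the default-method point concrete, here is a minimal, self-contained illustration; MiniStatement and MiniJdbc4Statement are made-up stand-ins (the real java.sql.PreparedStatement has far too many methods for a short example), and it assumes compilation with a Java 8 compiler:

import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.sql.SQLType;

// Stand-in for a JDBC interface: the "new" 4.2-level method is a default method.
interface MiniStatement {
    void setInt(int parameterIndex, int x) throws SQLException;

    // Added later in the spec; older implementations need not override it.
    default void setObject(int parameterIndex, Object x, SQLType targetSqlType) throws SQLException {
        throw new SQLFeatureNotSupportedException("setObject with SQLType");
    }
}

// A JDBC-4-level implementation: compiles cleanly although it never mentions SQLType.
class MiniJdbc4Statement implements MiniStatement {
    public void setInt(int parameterIndex, int x) throws SQLException {
        // ... bind the parameter ...
    }
}

The old-level implementation compiles without ever mentioning SQLType; a caller that does invoke the new method simply gets the default SQLFeatureNotSupportedException.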

>You'd still need reflection or an other trick to decide based on the Java
version which classes (Statement, PreparedStatement, ResultSet, etc) to
instantiate.

Only in the case of one single driver name and not wanting to provide a property telling the JDBC version. Also not in the case of separate driver names like "Jdbc3Driver". In fact I do not see any difference in providing a list of separate downloads compared to a list of separate driver names.




Re: Pre-processing during build

From
"Markus KARG"
Date:
I think you do not even need reflection, but could simply rely on proxies. Should be faster and do the job faster.
http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/reflect/Proxy.html :-)

An official Java SE standard since v1.3, BTW, to prevent further discussions about versions. ;-)

-Markus
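As a rough sketch of the Class.forName-based selection Vladimir describes below (the implementation class names used here are hypothetical, and the result could just as well be wrapped in a Proxy):

final class DriverImplSelector {

    static java.sql.Driver loadBestImpl() {
        // Try the newest implementation first; fall back if the class (or anything
        // it needs, e.g. java.sql.SQLType) is missing on this JRE.
        String[] candidates = {
            "org.postgresql.jdbc42.Jdbc42Driver", // needs Java 8
            "org.postgresql.jdbc41.Jdbc41Driver", // needs Java 7
            "org.postgresql.jdbc4.Jdbc4Driver"    // Java 6 fallback
        };
        for (String name : candidates) {
            try {
                return (java.sql.Driver) Class.forName(name).newInstance();
            } catch (Throwable notAvailableOnThisJre) {
                // the class, or something it references, is not loadable here; try the next one
            }
        }
        throw new IllegalStateException("no driver implementation could be loaded");
    }
}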


-----Original Message-----
From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Vladimir Sitnikov
Sent: Mittwoch, 17. Juni 2015 14:04
To: Mark Rotteveel
Cc: List
Subject: Re: [JDBC] Pre-processing during build

>So getting this to work would need some form of reflection (to get the
right type at runtime based on the Java version)

Reflection is sufficient, isn't it?

I think reflection would be rather clean solution here.
It is more IDE-friendly and developer-friendly than pre-processing.
Reflection is much easier to debug than byte-code generation.

So, do you see drawbacks with using reflection to select the specific
implementation?

From implementation point of view it just replaces "preprocessing"
with some "Class.forName" and that is it.
It even allows to ship the same jar file and select implementation on the fly.

Vladimir


--
Sent via pgsql-jdbc mailing list (pgsql-jdbc@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-jdbc



Re: Pre-processing during build

From
"Markus KARG"
Date:

+1 good posting

 

From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Christopher BROWN
Sent: Mittwoch, 17. Juni 2015 14:24
To: pgsql-jdbc@postgresql.org
Subject: Re: [JDBC] Pre-processing during build

 

As has been said by Markus KARG and others, you CAN produce a driver using bytecode for Java-6 / JDBC-4 including Java-8 / JDBC-4.2 (and Java-7 / JDBC 4.1) types and method signatures.  I've used this technique in many production applications, and have tried out this specific case just now, as a check.

 

So, for example, if you use "javac" from Java-8 and even include the Java-8 (JDBC-4.2) definition of PreparedStatement, compiling with source=1.6 and target=1.6 options, your JDBC-4.2 implementation of PreparedStatement WILL load into Java-6 with JDBC-4.  You can even annotate your implementations with @Override, no problem.  You will NOT have a problem with clients that expect a Java-6 / JDBC-4 API because there is no way you can compile such a client to invoke a JDBC-4.2 method (it would be a compiler error).  Java only tries to resolve classes on-demand, that is when it runs a code branch in a method body that refers to a type or invokes a method with such a type as part of its signature, and NOT when loading or instantiating your class.  If you never call it, you'll never have a problem.

 

You WILL have problems however in the following (avoidable) cases:

 

- if your implementation of a JDBC-4 driver calls code that in turn refers to types, fields, or methods that depend on a more recent method of the Java API, for example :

  - static initialization

  - constructor calls to code that depends on a more recent API version

  - an implementation of a JDBC-4 method that calls a JDBC-4.1 or -4.2 method (typically method overloading with the noble intention of avoiding copying-and-pasting code)

  - as has been suggested, the safest workaround is to just use "extends" where appropriate, instead of generating code from templates

- use of reflection (or proxies) to examine classes or invoke methods

- use of the beans Introspector (java.beans.Introspector)

 

The problem is not in compiling, it's about ensuring that once client code invokes a JDBC-4 method, your implementation of that method doesn't in turn call any code that it shouldn't.
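A hypothetical fragment illustrating that pitfall; the class and helper names are made up, and only java.sql.JDBCType and java.sql.SQLType are real Java 8 types:

import java.sql.SQLException;

abstract class ExamplePreparedStatement /* implements java.sql.PreparedStatement */ {

    // Safe on Java 6/7: the body touches nothing newer than Java 6.
    public void setObject(int parameterIndex, Object x) throws SQLException {
        bindParameter(parameterIndex, x);
    }

    // Unsafe: an old-API method delegating to a new-API overload. The body refers to
    // java.sql.JDBCType (Java 8 only), so calling setString on a Java 6/7 JRE fails
    // with NoClassDefFoundError, at the latest when this method first executes.
    public void setString(int parameterIndex, String value) throws SQLException {
        setObject(parameterIndex, value, java.sql.JDBCType.VARCHAR);
    }

    // JDBC 4.2-level overload; harmless as long as nothing reachable from the old-API
    // methods ever calls it.
    public void setObject(int parameterIndex, Object x, java.sql.SQLType targetSqlType)
            throws SQLException {
        bindParameter(parameterIndex, x);
    }

    abstract void bindParameter(int parameterIndex, Object x) throws SQLException;
}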

 

Code coverage metrics are an additional guarantee but you'd have to be very sure you've got correct coverage for all version-dependent code paths.  Compiler constraints are probably safer; I'll discuss that in a moment.

 

First, a few remarks concerning some of the previous posts :

 

https://docs.oracle.com/javase/8/docs/api/java/sql/PreparedStatement.html#setObject-int-java.lang.Object-java.sql.SQLType- is actually implemented as a Java-8 "default" method.  You don't need to implement it directly in the driver.

 

https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213 is a doomed-to-fail example, not due to bytecode versions but because Java uses reflection (see above list of problems) to find your "main" method, and so trips up on the method using Java-8 types.  Restructured as follows (two classes with separate source files), it works:

 

----8<---- Jre7Test.java ----8<----

public class Jre7Test {
    public static void main(String args[]) {
        System.out.println(Jre7TestCompanion.greeting());
    }
}

----8<---- Jre7TestCompanion.java ----8<----

import java.time.Duration;
import java.util.Optional;

public class Jre7TestCompanion {
    public static Optional<Duration> optional(java.time.Duration duration) {
        return Optional.of(duration);
    }

    public static String greeting() {
        return "Hello, world";
    }
}

----8<--------8<----

 

(the above compiled and run with the exact same commands on Mac OS X too).

 

The safest way is to use incremental compilation (all integrated into a single automated build, with no preference for build tool).  Using fictional package and class names to demonstrate the idea, here's how it could be done.

 

For example, produce an intermediate "pgjdbc_4.0.jar" using a "JDBC-4" package (1.6 API as bootstrap classpath for "javac", with 1.8 compiler and source/target 1.6):

 

jdbc_4_0.PGDriver_4_0
jdbc_4_0.PGPreparedStatement_4_0
jdbc_4_0.PGResultSet_4_0
...etc

 

Add the resulting "jar" to the classpath for the next step, with classes that extend the above, producing "pgjdbc_4.1.jar" (useless unless "pgjdbc_4.0.jar" is also in the classpath).  (1.7 API as bootstrap classpath for "javac", with 1.8 compiler and source/target 1.6)

 

jdbc_4_1.PGDriver_4_1 extends jdbc_4_0.PGDriver_4_0
jdbc_4_1.PGPreparedStatement_4_1 extends jdbc_4_0.PGPreparedStatement_4_0
jdbc_4_1.PGResultSet_4_1 extends jdbc_4_0.PGResultSet_4_0
...etc

 

Then add both of the resulting jars to the classpath for the next step, with classes that extend the above, producing "pgjdbc_4.2.jar" (useless unless "pgjdbc_4.0.jar" is also in the classpath, along with "pgjdbc_4.1.jar"). (1.8 API as bootstrap classpath for "javac", with 1.8 compiler and source/target 1.6)

 

jdbc_4_2.PGDriver_4_2 extends jdbc_4_1.PGDriver_4_1
jdbc_4_2.PGPreparedStatement_4_2 extends jdbc_4_1.PGPreparedStatement_4_1
jdbc_4_2.PGResultSet_4_2 extends jdbc_4_1.PGResultSet_4_1
...etc

 

Then, merge all JARs into a single JAR.  Clients could then refer to the specific driver version they require in code, or use a generic Driver class that (in the constructor) detects the appropriate JDBC version and fixes a "final" int or Enum field, used thereafter in "switch" blocks to call the appropriate driver version, acting as a lightweight proxy when the specific driver version can't be referred to (for backwards compatibility).  More adventurous developers might even suggest usage of method handles from Java 7 onwards to eliminate the negligible overhead of a switch statement, but I'd personally rely on the JVM to optimise that away. Note that this is only necessary for the Driver implementation, as no-one (apart from the driver implementors) should ever call "new PreparedStatement" or whatever.

 

Hope that helps ; hope it's not redundant with regards to messages sent since I started typing away my 2 cents...  In any case, I regularly use these techniques in production code with no accidents.

 

--
Christopher

 

 

 

On 17 June 2015 at 13:05, Sehrope Sarkuni <sehrope@jackdb.com> wrote:

On Wed, Jun 17, 2015 at 6:15 AM, Dave Cramer <pg@fastcrypt.com> wrote:

I'm not sure this is a great example as Optional itself is a java 8 construct.

 

Either way Spring is able to do this, as are others?

 

The approach used by Spring won't work for the JDBC driver. The crux of the issue is that the newest version of the JDBC spec includes Java 8 types in the method signatures of public interfaces that already exist in older Java versions. Spring doesn't do that.

 

The public interfaces and classes for the older JDK versions they support (i.e. 6 or 7) only expose types that exist in those JDK versions. For older classes they've added internal support for Java 8 types that is dynamically checked, but it's done by wrapping the integration in an inner class. Here's an example: https://github.com/spring-projects/spring-framework/blob/f41de12cf62aebf1be9b30be590c12eb2c030853/spring-beans/src/main/java/org/springframework/beans/AbstractNestablePropertyAccessor.java#L1041

 

There's no way to make that work when a public interface exposes classes that won't exist at run time. It may have been possible with older upgrades to the JDBC spec (ex: 4 to 4.1) as there weren't any JDK 1.7-only classes used in method signatures of existing public interfaces. Compiling with an older bytecode target would allow an older JDK to simply ignore those methods as they would not be part of the public signature.

 

In JDBC 4.2 that's not true though. For example the JDBC 4.2 PreparedStatement interface has a new setObject(...) that uses a Java 8 only class:

java.sql.PreparedStatement.setObject(int parameterIndex, Object x, java.sql.SQLType targetSqlType)

That method signature can't appear in a driver that is going to be used in JDK 6 or 7. There's no way to hide it internally as it's part of the public signature.

 

We're going to need some kind of preprocessing step to handle things like this.

 

Regards,

-- Sehrope Sarkuni

Founder & CEO | JackDB, Inc. | https://www.jackdb.com/

 

 

Re: Pre-processing during build

From
"Markus KARG"
Date:

Stephen,

 

don't fear the complexity; it will soon be shown that it is pretty simple once everybody understands that https://gist.github.com/mkarg/88a89ae0dbffcfb7543e addresses most of the concerns and that proxies will select the right JDBC level. In the end, the solution will be dead simple and rock solid, I hope. Let's just finish the discussion.

 

I do not see that an annotation processor will improve anything here, and the code change actually is not massive at all, but a necessity to come to a feasible design.

 

-Markus

 

From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Stephen Nelson
Sent: Mittwoch, 17. Juni 2015 22:43
To: PostgreSQL JDBC
Subject: Re: [JDBC] Pre-processing during build

 

The discussion has veered off topic somewhat into build systems and their respective qualities.

My pull request is proposing to change the build system from Ant to Maven. All existing functionality in the Ant build has been/will be replicated in the Maven build - including the pre-processing step. There are many advantages to using Maven as opposed to Ant. However, this is orthogonal to the discussion I'd like to have about the pre-processing used during the build process.

Focusing on the aim of the current pre-processing step - which is to select the appropriate spec implementation at compile time - the options seem quite complicated to me as a relative novice in the finer points of the language spec.

Taking a step back, could this not be achieved using annotations and a custom annotation processor to emit some code to select the correct implementation of the spec? This seems more Java-like and not a massive change from the existing code.

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
>This is difference is clearly documented in chapters JLS 12.1 (initial class) and JLS 12.2 (other classes).

There is no difference.

12.1.1 clearly reads: "This loading process is described further in §12.2."

Just in case, I've updated my gist:
https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213, so you can see it
is reflection that makes the thing fail.

I thought it was obvious from the start: reflection has to represent
method arguments as Class somehow, so it fails as anyone would expect.

Vladimir


Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
Stephen>However, this is orthogonal to the discussion I'd like to have
about the pre-processing used during the build process.

If I understand well, we all agree that pre-processing should be
dodged in the future.

I think it might be easier (from development and testing points of
view) to do the following:
1) remove source pre-processing in the current Ant build. That should
be rather trivial.
2) mavenize

That would eliminate "complex preprocessing" in the maven
configuration. Even if that processing is "easy" it would still take
time to review and test, and we would drop that pre-processing anyway.

Any objections?

PS. I could file a PR for removing pre-processing, however I am afraid
you'll kill me for that many PRs to review.
PPS. I've added my review comments to 3 of 4 non-mine PRs.

Vladimir


Re: Pre-processing during build

From
"Markus KARG"
Date:
Wrong. 12.1.1 contains more restrictions on the initial class than 12.2, which describes the general process for ANY class. Hence it is a difference. That is the difference you see between your initial example and mine on Gist. :-)

Sorry, I was not aware that you only posted the example to demonstrate REFLECTION failing. Maybe I missed that. I always talked about normal instantiation, not about reflection. Also, AFAIK the JDBC spec does not say reflection MUST be possible, does it?

-----Original Message-----
From: Vladimir Sitnikov [mailto:sitnikov.vladimir@gmail.com]
Sent: Mittwoch, 17. Juni 2015 23:15
To: Markus KARG
Cc: List
Subject: Re: [JDBC] Pre-processing during build

>This is difference is clearly documented in chapters JLS 12.1 (initial class) and JLS 12.2 (other classes).

There is no difference.

12.1.1 clearly reads: "This loading process is described further in §12.2."

Just in case, I've updated my gist:
https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213, so you can see it is reflection that makes the thing fail.

I thought it was obvious from the start: reflection has to represent method arguments as Class somehow, so it fails as
anyonewould expect. 

Vladimir



Re: Pre-processing during build

From
"Markus KARG"
Date:
Well, there IS a difference: We can pack our own copy of java.sql.Type with our Driver, but we should not dare to do that with non-java.sql.* types. Hence, if that is the only package, we're safe. No problem.

This is a typical pattern used by several Java EE specifications, for example JPA. If the JRE finds the class on the bootstrap classpath, it uses it. If not, it uses the one on the application classpath. Finely documented by the JVM specification, BTW, and performed daily by Java EE specs.

-----Original Message-----
From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Mark Rotteveel
Sent: Mittwoch, 17. Juni 2015 13:13
To: List
Subject: Re: [JDBC] Pre-processing during build

On Wed, 17 Jun 2015 06:15:47 -0400, Dave Cramer <pg@fastcrypt.com> wrote:
> On 17 June 2015 at 03:07, Mark Rotteveel <mark@lawinegevaar.nl> wrote:
>
>> On Wed, 17 Jun 2015 00:02:40 +0300, Vladimir Sitnikov
>> <sitnikov.vladimir@gmail.com> wrote:
>> > 2015-06-16 22:30 GMT+03:00 Markus KARG <markus@headcrashing.eu>:
>> >> The answer is pretty simple: Try it out. :-)
>> >>
>> >> Just compile a JRE 8 class down to byte code level 6 and load it on
>> Java
>> >> level 7. That's what I proposed. Nothing else. It really bet will
work
>> >> unless you try to INSTANTIATE JRE-only classes, but it should LOAD.
>> >> And
>> >> nothing more we need.
>> >
>> > Markus, can you please be more explicit in your suggestion?
>> >
>> > I did try a simple "Hello, world" and it does not run in stock JDKs
of
>> > MacOS: https://gist.github.com/vlsi/aeeb4a61d9c2b67ad213
>> > Even if you manage to make that fly, that would be built on sand.
>>
>> Good example: it demonstrates at least that using reflection (eg
>> getDeclaredMethods; or in this case privateGetDeclaredMethods) will
lead
>> to
>> a NoClassDefFoundError.
>>
>>
> I'm not sure this is a great example as Optional itself is a java 8
> construct.

Yes, and so is java.sql.SQLType. So if this doesn't work for Optional, it
also won't work for SQLType.

> Either way Spring is able to do this, as are others?

Spring uses a lot of reflection, proxies, byte code generation, etc to get
things done. I am not sure if you want to go that way.

Mark


--
Sent via pgsql-jdbc mailing list (pgsql-jdbc@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-jdbc



Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
> Also AFAIK JDBC spec does not say reflection MUST be possible, does it?

Does the spec say anything about reflection possibly not being available?

I can easily imagine an application that uses reflection, cglib, etc. to call/manage/pool connections and statements.
Can you list a couple of libraries that forbid reflection usage?

I think it is uncommon in Java for a class that implements some public API to fail on "getMethods".
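For readers following along, this is the failure mode in a nutshell; Versioned is a made-up class whose only Java 8 dependency sits in the signature of an otherwise-unused method, compiled with JDK 8 to an old bytecode target and run on Java 7:

import java.time.Duration; // Java 8 only

public class Versioned {
    public String name() {
        return "ok";
    }

    // Never called on an old JRE, but its signature still mentions a Java 8 type.
    public Duration timeout() {
        return Duration.ofSeconds(30);
    }
}

class ReflectionProbe {
    public static void main(String[] args) {
        Versioned v = new Versioned();
        System.out.println(v.name()); // may work on Java 7 if resolution is lazy
        v.getClass().getMethods();    // NoClassDefFoundError: java/time/Duration
    }
}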

> Wrong. 12.1.1 contains more restriction on the initial class

Can you please cite that "restriction"?

> That is what you see in the difference of you initial example to mine on Gist. :-)

False. It was obvious from the start (see stacktrace) that it was
java.lang.Class.privateGetDeclaredMethods that failed.
If the restriction exists indeed, another example is required to
highlight that. My examples are doomed to die of reflection.

Vladimir


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 22:09:09 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> Mark,
>
> the chapters you describe list explicitly the exceptions that can
happen,
> and the causes. None of them covers "a class is referenced that has a
> non-referenced method that has a non-existent class".

I think you fail to see the meaning of "resolution of symbolic references"
and the implications of JLS 12.3. A referenced class (be it in a method
signature or body) in the java code, has a symbolic reference (ie: a String
with type information, for simplicity: the fully qualified name) in the
bytecode. A JVM implementation can choose to resolve those to actual
classes at classloading time, or when it really needs it (eg if the method
is called). Depending on the specific JVM implementation, this means that a
class with a method that has a type in its signature that is not available
at runtime might either fail at classloading time or at runtime when - as
demonstrated by Vladimir - you enumerate the methods of the class using
reflection.

> Even with early resolution, the JLS does not cover the case we have,
which
> is containing non-existent classes in non-referenced methods of
referenced
> classes!

Yes it does.

> The devil's in the details, right? ;-)
>
> Hence we're still safe.
>
> If that won't work, nobody could write a Java EE program and compile and
> test it agains the official javaee.jar, as that one not even contains
ANY
> byte code but SOLELY declarations. And that one's an official JAR from
the
> makers of Java!

Sorry, but that doesn't make any sense at all. The "official" JavaEE jar
does contain bytecode: interfaces and some supporting classes like
exceptions.

Mark


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 22:47:23 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> Sorry but you're wrong here.
>
> Vladimir's example was invalid. See
> https://gist.github.com/mkarg/88a89ae0dbffcfb7543e instead.
>
> Resolution will not fail. Even early static resolution won't. Check
again
> the chapter of JLS about the critera to throw the listed exceptions.
None
> of them is met with the corrected example
> https://gist.github.com/mkarg/88a89ae0dbffcfb7543e.
>
> The direct example will NOT fail on Java 7 as the method is not changed,
> but added. The original method is still in place. Both methods exist at
the
> same time and differ by signature, hence the linker of the old program
> finds the old method in the new class. No problem at all.
>
> The indirect example will NOT fail on Java 7 as a JRE 7 client will
never
> pass in an instance of a JRE 8 type (how should even know of its
> existence?), and the Java 6 machine executing the invoked method will
never
> INSTANTIATE that type so it will not fail. No problem at all.
>
> Still should work. You'd possibly like to set up a proof using the
> corrected example https://gist.github.com/mkarg/88a89ae0dbffcfb7543e.
:-)

Sorry, but you're the one that is wrong, it is not only about actually
calling methods with types, it is about the presence or absence of those
types when the JVM does decide to resolve the symbolic reference (eg when
you reflect the declared methods). I am about done with this discussion. I
think the onus is on you to prove this scheme will work, not for us to
prove it won't work (which we already did). Your proof should not only
include simple direct instance access, but also when using reflection
**which is very common with JDBC drivers** (eg connection pools,
tools/libraries that bridge differences in JDBC implementations, etc).

It sounds like you want to trade minor complexity in the build/IDE process
for a world of hurt for the users of your driver. I don't think that is a
good way forward.

Mark


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 22:59:18 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
>>I just realized it might actually work: some (maybe all) methods added
in
> the JDBC API for Java 8 were added as default interface methods (with an
> implementation that throws UnsupportedOperationException), so
compilation
> would succeed for org.postgresql.jdbc4.PreparedStatement without having
an
> implementation for the new methods.
>
> Yay, seems you finally noticed we guys at the Java EE EGs are not so
dumb
> as people might think! Honestly, what do you think what we actually
> invented default methods for if not for this case...? ;-)

It still doesn't solve the problem if you actually have an implementation
of that method in your class, which was the starting point of the
discussion as I saw it.

>>You'd still need reflection or an other trick to decide based on the
Java
> version which classes (Statement, PreparedStatement, ResultSet, etc) to
> instantiate.
>
> Only in case of one single driver name and not wanting to provide a
> property telling the JDBC version. Also not in case of separate driver
> names like "Jdbc3Driver". In fact I do not see any difference in
providing
> a list of separate downloads compared to a list of separate driver
names.

It is a problem (although solvable), because JDBC 4 driver loading
requires you to provide a META-INF/services/java.sql.Driver file that
declares the driver(s) provided by the jar. This means that the driver it
declares must be loadable in all JVMs. This means that you either need to
provide a single driver that handles the differences, or you need to have
multiple drivers and ensure that the JDBC4 driver refuses to create
connections if the JDBC 4.1 or 4.2 driver is also loaded.
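For reference, that service registration is just a one-line text file inside the jar, META-INF/services/java.sql.Driver, whose content is simply:

org.postgresql.Driver

so whichever class is named there has to load cleanly on every JVM the single jar claims to support.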

It will also make things harder for tools that want to load your driver
directly (instead of going through DriverManager).

Mark




Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 23:02:44 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> I think you not even need reflection, but could simply rely on proxies.
> Should be faster and do the job faster.
>
http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/reflect/Proxy.html
> :-)
>
> An official Java SE standard since v1.3, BTW, to prevent further
> discussions about versions. ;-)

Proxies use reflection (which is why they are part of the java.lang.reflect package).

Mark


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Wed, 17 Jun 2015 23:51:26 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> Wrong. 12.1.1 contains more restriction on the initial class than 12.2
> which describes the general process for ANY class. Hence it is a
> difference. That is what you see in the difference of you initial
example
> to mine on Gist. :-)
>
> Sorry I was not aware that you only posted the example to demonstrate
> REFLECTION to fail. Maybe I missed that. I always talked about normal
> instantiation, not about reflection. Also AFAIK JDBC spec does not say
> reflection MUST be possible, does it?

Of course you need to support reflection, in some cases the JVM itself
will use reflection and as I said in an earlier email: a lot of tools and
libraries depend on it. Not supporting reflection will simply make the
driver unusable in a lot of situations.

I can't believe I actually need to argue this point.

Mark


Re: Pre-processing during build

From
Andrej Golovnin
Date:
Hi Mark,

>>
>> An official Java SE standard since v1.3, BTW, to prevent further
>> discussions about versions. ;-)
>
> Proxies use reflection (which is why they are part of the
> java.lang.reflect package.
>

Just because the class is in the package java.lang.reflect, it does not mean it uses reflection. Proxies are classes generated on the fly and use generic byte code to invoke methods. No reflection is involved. Just take a look at the class sun.misc.ProxyClassFactory.

Best regards,
Andrej Golovnin


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Thu, 18 Jun 2015 08:58:14 +0200, Andrej Golovnin
<andrej.golovnin@gmail.com> wrote:
> Hi Mark,
>
>>>
>>> An official Java SE standard since v1.3, BTW, to prevent further
>>> discussions about versions. ;-)
>>
>> Proxies use reflection (which is why they are part of the
>> java.lang.reflect package.
>>
>
> Just because the class is in the package java.lang.reflect, it does
> not mean it uses the reflection. Proxies are on the fly generated
> classes and use generic byte code to invoke methods. No reflection is
> involved. Just take a look at the class sun.misc.ProxyClassFactory.

Instead I looked at java.lang.reflect.Proxy.ProxyClassFactory and it
certainly does use reflection (although on interfaces), and even if the
proxy generation itself wouldn't use reflection, most InvocationHandler
implementations I have seen do use reflection one way or the other.

Mark


Re: Pre-processing during build

From
Christopher BROWN
Date:
The entry point into a JDBC driver is, unsurprisingly, its implementation of java.sql.Driver (and javax.sql.DataSource if you want).  These are very straightforward and stable interfaces (there's only a new "getParentLogger()" method in 1.7), and they act as factories providing access to Connection objects.  Implementations of Driver, given that they are just factories for Connection objects, should never be on a critical performance path.

I see nothing wrong with implementing Driver using java.reflect.Proxy, and if we want to be paranoid about classloading, using reflection (within the implementation of the Proxy-based Driver) to instantiate and invoke methods on a version-specific implementation of Driver.  That way, even when using reflection, discovering an "unresolvable future type" is just impossible.  The Proxy-based Driver would delegate to some PGDriver6 / PGDriver7 / PGDriver8 which could be compiled in steps (as I described in an earlier message on this thread) and not to some PGSuperDriver, where PGDriver8 extends PGDriver7, and PGDriver7 extends PGDriver6, and where PGDriver8 returns (from the "connect" method) a PGConnection8 (extending PGConnection7), PGDriver7 returns PGConnection7, and so on.  So, even using reflection, it would be impossible to work with mismatched versions; the only exception being (for example in Java6 code) doing an explicit Class.forName("PGDriver8") but that sort of code just isn't possible except when the client is incompetent or malicious.  And, as I suggested in my previous message, it requires no code generation, just an extra compilation step per supported version.
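To make that concrete, a rough sketch of such a Proxy-based entry point; PGDriver6/PGDriver7/PGDriver8 and their package are placeholders, and the JRE-detection probes shown are just one possible choice:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.sql.Driver;

final class ProxyDriverFactory {

    // Returns a java.sql.Driver proxy; reflection on the proxy only ever sees the
    // java.sql.Driver interface, so no "future" type can show up in a signature.
    static Driver createDelegatingDriver() throws Exception {
        final Object impl = loadVersionSpecificImpl();
        InvocationHandler handler = new InvocationHandler() {
            public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
                try {
                    // method is always a java.sql.Driver method; impl implements Driver.
                    return method.invoke(impl, args);
                } catch (InvocationTargetException e) {
                    throw e.getCause(); // unwrap e.g. the SQLException thrown by connect()
                }
            }
        };
        return (Driver) Proxy.newProxyInstance(
                Driver.class.getClassLoader(), new Class<?>[] { Driver.class }, handler);
    }

    private static Object loadVersionSpecificImpl() throws Exception {
        try {
            Class.forName("java.sql.SQLType");           // only present on Java 8
            return Class.forName("org.postgresql.PGDriver8").newInstance();
        } catch (ClassNotFoundException notJava8) {
            // fall through
        }
        try {
            Class.forName("java.sql.PseudoColumnUsage");  // only present on Java 7
            return Class.forName("org.postgresql.PGDriver7").newInstance();
        } catch (ClassNotFoundException notJava7) {
            return Class.forName("org.postgresql.PGDriver6").newInstance();
        }
    }
}

Nothing outside loadVersionSpecificImpl ever names the version-specific classes, so neither the proxy itself nor client reflection on it can surface a type that is missing on an older JRE.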

As a side note, about the discussion on interpretations of the JLS, I've not yet encountered any JVM that attempts to fully-resolve all signatures as soon as it loads a class.  Does such a JVM exist?  I would assume that it would have poor performance for classloading (and a high memory overhead for class metadata, such as permgen/metaspace usage) because it would need to be recursive (for each type discovered in a method signature or body, it would need to load that type and all of its referenced types, recursively).  I'd be very interested to know how it deals with circular references (for example A refers to B and vice-versa ; it can't resolve A before B, and vice-versa), such as Object.toString() because String can't be loaded before Object, but Object would need to resolve String.

Getting back to the original discussion, I'm neutral on the Ant vs Maven debate, but I'm sure that driver developers would benefit (readability and reliability) from plain old source code, simple class extension and step-by-step compilation to handle API evolution, without relying on JVM behavior to have an all-in-one single implementation that may contain more than is advertised by specific interface versions.  This approach should also keep the build process deterministic and relatively straightforward and be able to produce a single driver artifact compatible with all JDBC versions, with no burden on driver users.

Hope that helps,
Christopher


On 18 June 2015 at 08:31, Mark Rotteveel <mark@lawinegevaar.nl> wrote:
On Wed, 17 Jun 2015 22:47:23 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> Sorry but you're wrong here.
>
> Vladimir's example was invalid. See
> https://gist.github.com/mkarg/88a89ae0dbffcfb7543e instead.
>
> Resolution will not fail. Even early static resolution won't. Check
again
> the chapter of JLS about the critera to throw the listed exceptions.
None
> of them is met with the corrected example
> https://gist.github.com/mkarg/88a89ae0dbffcfb7543e.
>
> The direct example will NOT fail on Java 7 as the method is not changed,
> but added. The original method is still in place. Both methods exist at
the
> same time and differ by signature, hence the linker of the old program
> finds the old method in the new class. No problem at all.
>
> The indirect example will NOT fail on Java 7 as a JRE 7 client will
never
> pass in an instance of a JRE 8 type (how should even know of its
> existence?), and the Java 6 machine executing the invoked method will
never
> INSTANTIATE that type so it will not fail. No problem at all.
>
> Still should work. You'd possibly like to set up a proof using the
> corrected example https://gist.github.com/mkarg/88a89ae0dbffcfb7543e.
:-)

Sorry, but you're the one that is wrong, it is not only about actually
calling methods with types, it is about the presence or absence of those
types when the JVM does decide to resolve the symbolic reference (eg when
you reflect the declared methods). I am about done with this discussion. I
think the onus is on you to prove this scheme will work, not for us to
prove it won't work (which we already did). Your prove should not only
include simple direct instance access, but also when using reflection
**which is very common with JDBC drivers** (eg connection pools,
tools/libraries that bridge differences in JDBC implementations, etc).

It sounds like you want to trade minor complexity in the build/IDE process
for a world of hurt for the users of your driver. I don't think that is a
good way forward.

Mark


--
Sent via pgsql-jdbc mailing list (pgsql-jdbc@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-jdbc

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
>I see nothing wrong with implementing Driver using java.reflect.Proxy,

+1.

>PGDriver6 / PGDriver7 / PGDriver8 which could be compiled in steps

We can just compile all the drivers using JDK8 and -source 1.6 -target
1.6, can't we?
Well, step-by-step might be more robust, however it would require
careful setup of development environment (developers would have to
install different JDK versions and that is a high entry bar).

>I've not yet encountered any JVM that attempts to fully-resolve all signatures as soon as it loads a class.  Does such a JVM exist?

JLS allows JVM to load classes at any point in time, however if JVM
decides to do early loading, it should not throw exceptions before
application code indeed tries to touch the "bad method".

http://www.excelsiorjet.com/ is a JVM that compiles java down to
native code. I guess they do a lot of resolutions at compile phase
(much more than OpenJDK does).

Here's the relevant JLS section:
https://docs.oracle.com/javase/specs/jls/se7/html/jls-12.html#jls-12.2.1
"... however, to reflect loading errors only at points in the program
where they could have arisen without prefetching or group loading"

Vladimir


Re: Pre-processing during build

From
Christopher BROWN
Date:
On 18 June 2015 at 14:38, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
>I see nothing wrong with implementing Driver using java.reflect.Proxy,

+1.

>PGDriver6 / PGDriver7 / PGDriver8 which could be compiled in steps

We can just compile all the drivers using JDK8 and -source 1.6 -target
1.6, can't we?
Well, step-by-step might be more robust, however it would require
careful setup of development environment (developers would have to
install different JDK versions and that is a high entry bar).

If you don't change the bootstrap classpath per driver, and you use the same JDK (8), then even with source/target set to 1.6, "javac" will complain that (for your pre-Java-8 drivers) you haven't implemented everything.  It does require installing versions of JDK 6, 7, and 8 and -- at least in Ant, haven't tried in Maven or Gradle -- you can define paths as build properties.  Furthermore, even if contributors find a simple workaround, the release manager would be able to be more confident about use (within implementations) of other JDK classes that may not be available on all target versions if at least s/he does have such a setup (read : it's not the end of the world if a contributor makes a mistake with the classpath, as long as any such mistakes are caught before a release is made).

Java 9 will use modules instead of the monolithic "rt.jar" so this is something that will need to be watched out for.

 

>I've not yet encountered any JVM that attempts to fully-resolve all signatures as soon as it loads a class.  Does such a JVM exist?

JLS allows JVM to load classes at any point in time, however if JVM
decides to do early loading, it should not throw exceptions before
application code indeed tries to touch the "bad method".

http://www.excelsiorjet.com/ is a JVM that compiles java down to
native code. I guess they do a lot of resolutions at compile phase
(much more than OpenJDK does).

Here's the relevant JLS section:
https://docs.oracle.com/javase/specs/jls/se7/html/jls-12.html#jls-12.2.1
"... however, to reflect loading errors only at points in the program
where they could have arisen without prefetching or group loading"

Vladimir

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
> even with source/target set to 1.6, "javac" will complain that (for your pre-Java-8 drivers) you haven't implemented
everything

I see.
This indeed makes "step-by-step" compilation a good solution.

We might want to drop JDK6 support to make it easier for us and for
the contributors (1 JDK less to install).

Vladimir


Re: Pre-processing during build

From
Christopher BROWN
Date:
If you decided to drop JDK6 support going forward, that wouldn't cause the current JDK6-compatible PostgreSQL driver versions to be deleted, so no-one would be stuck.  Even if new features are added to PostgreSQL (the database or the driver), the JDK6 JDBC API will remain constant and won't have any more or less features than before; there's probably enough expressivity in SQL and driver-specific classes to do pretty much anything anyway. 

These are just pros and cons to consider if you're thinking about this (I'm not recommending anything either way, especially as this tends to start flame wars).  Personally, I wouldn't be shocked by such a choice given that PostgreSQL and the driver are open-source and that JDK6 has been EOL for a long time, and that as such, I understand there are limited resources for supporting all possible configurations.

Anyway, like I said, even if there was a 9.5 or 9.6 version that was JDK7+, JDK6 users will still be able to use the 9.4 driver.



On 18 June 2015 at 15:52, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
> even with source/target set to 1.6, "javac" will complain that (for your pre-Java-8 drivers) you haven't implemented everything

I see.
This indeed makes "step-by-step" compilation a good solution.

We might want to drop JDK6 support to make it easier for us and for
the contributors (1 JDK less to install).

Vladimir

Re: Pre-processing during build

From
Dave Cramer
Date:


Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

On 18 June 2015 at 10:56, Christopher BROWN <brown@reflexe.fr> wrote:
If you decided to drop JDK6 support going forward, that wouldn't cause the current JDK6-compatible PostgreSQL driver versions to be deleted, so no-one would be stuck.  Even if new features are added to PostgreSQL (the database or the driver), the JDK6 JDBC API will remain constant and won't have any more or less features than before; there's probably enough expressivity in SQL and driver-specific classes to do pretty much anything anyway. 

These are just pros and cons to consider if you're thinking about this (I'm not recommending anything either way, especially as this tends to start flame wars).  Personally, I wouldn't be shocked by such a choice given that PostgreSQL and the driver are open-source and that JDK6 has been EOL for a long time, and that as such, I understand there are limited resources for supporting all possible configurations.

Anyway, like I said, even if there was a 9.5 or 9.6 version that was JDK7+, JDK6 users will still be able to use the 9.4 driver.



On 18 June 2015 at 15:52, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
> even with source/target set to 1.6, "javac" will complain that (for your pre-Java-8 drivers) you haven't implemented everything

I see.
This indeed makes "step-by-step" compilation a good solution.

We might want to drop JDK6 support to make it easier for us and for
the contributors (1 JDK less to install).

Vladimir


Removing JDK 1.6 is not being contemplated at the moment. While many "hackers" are keen to use the latest and greatest features of Java N+1, there is a VERY large population that still runs older JVMs for whatever reason. As I mentioned earlier in this thread, the focus of the driver is to provide access to all supported versions of PostgreSQL (and even older when possible) to Java users, not to provide the latest Java tools for PostgreSQL. 

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
>Removing JDK 1.6 is not being contemplated at the moment

When do you think it is safe to remove JDK 1.6 support?

Vladimir


Re: Pre-processing during build

From
"Markus KARG"
Date:
If you ever had a look at the Java EE specs for example, they always EXPLICITLY list the REQUIRED APIs (not the
NON-REQUIRED ones). It's never the case that the specs are listing dependencies in a negative way, but always in a
positive way. Just like Maven would do, too, and just like there is no "excludes" clause restricting "import java.*" in
the Java language. :-) 

Anyways, it seems to be common sense among us that reflection SHOULD work, so my solution is not the right way to go.

Regarding your questions and just for curiosity, even being irrelevant due to the reflection topic meanwhile: My
example also fails even without (!) reflection once you remove the inner class and simply have the Java 8 method
implemented directly by the main class. Hence it proves the special restriction for main classes. That restriction is
the difference between 12.1 (= how an initial class is loaded) and 12.2 (= how other classes are loaded). It is an
optional implementation choice of the Oracle JVM to do so: it demands that ALL references from the main class to other
classes MUST have existing byte code. This demand does not exist for classes NOT directly referenced but loaded lazily
at a later time (following 12.2 rules ONLY but not 12.1 rules). Just change my example and you'll see the effect - it
fails immediately as soon as there is no wrapping inner class anymore decoupling the missing classes from the main
class.

-----Original Message-----
From: Vladimir Sitnikov [mailto:sitnikov.vladimir@gmail.com]
Sent: Thursday, 18 June 2015 00:07
To: Markus KARG
Cc: List
Subject: Re: [JDBC] Pre-processing during build

> Also AFAIK JDBC spec does not say reflection MUST be possible, does it?

Does the spec say anything about reflection possibly not being available?

I can easily imagine an application that uses reflection, cglib, etc to
call/manage/pool connections and statements.
Can you list a couple of libraries that forbid reflection usage?

I think it is uncommon in Java for a class that implements some
public API to fail on "getMethods".
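
A minimal sketch of that failure mode (the class name is a placeholder):
reflecting over a class resolves the types in its method signatures, so
getMethods() can throw NoClassDefFoundError even though the offending method
is never called:

    import java.lang.reflect.Method;

    public class ReflectOverDriver {
        public static void main(String[] args) throws Exception {
            // Hypothetical class whose method signatures mention a type
            // that is absent on the running JRE.
            Class<?> cls = Class.forName("org.example.VersionedStatement");
            for (Method m : cls.getMethods()) { // may throw NoClassDefFoundError
                System.out.println(m);
            }
        }
    }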

> Wrong. 12.1.1 contains more restriction on the initial class

Can you please cite that "restriction"?

> That is what you see in the difference of your initial example to mine on Gist. :-)

False. It was obvious from the start (see stacktrace) that it was
java.lang.Class.privateGetDeclaredMethods that failed.
If the restriction exists indeed, another example is required to
highlight that. My examples are doomed to die of reflection.

Vladimir



Re: Pre-processing during build

From
Dave Cramer
Date:
Vladimir,

To be honest I don't really know. The difficult part about this is that we could put up a survey and ask "what version of JVM do you use"; I suspect that 1.8 will be overwhelmingly popular. The problem is that the subset of people responding may not be representative of the population that is using it. From my experience working with bigger companies, they view JDBC as a simple tool. The metaphor that comes to mind is a car with aftermarket wheels: you can't get any more for the car because you spend $2,000 on the wheels. Cars come with wheels; similarly, PostgreSQL comes with a JDBC driver. It occurs to me that Maven actually exacerbates this, as you don't even have to come to the site to get the driver now that you can just add the dependency to your pom and it will be automagically downloaded for you.

While writing this I am thinking that we might get away with saying we are only going to support 3 JVM versions so currently 1.6-1.8, once 1.9 comes out we can drop 1.6, however at the moment this is just a thought, not a policy. Thoughts ?

Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

On 18 June 2015 at 11:18, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
>Removing JDK 1.6 is not being contemplated at the moment

When do you think it is safe to remove JDK 1.6 support?

Vladimir

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
> The difficult part about this is that we could put up a survey and ask "what version of JVM do you use" I suspect
that 1.8 will be overwhelmingly popular

Can we get some access logs from https://jdbc.postgresql.org/download.html page?
It can give us an insight on how often people download "old" drivers.

Vladimir


Re: Pre-processing during build

From
John R Pierce
Date:
On 6/18/2015 11:59 AM, Vladimir Sitnikov wrote:
>> The difficult part about this is that we could put up a survey and ask "what version of JVM do you use" I suspect
that 1.8 will be overwhelmingly popular
> Can we get some access logs from https://jdbc.postgresql.org/download.html page?
> It can give us an insight on how often people download "old" drivers.

except for organizations like mine, where things like this were
downloaded a long time ago and bundled with internal releases.



--
john r pierce, recycling bits in santa cruz



Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
> except for organizations like mine, where things like this were downloaded a long time ago and bundled with internal releases.

I do understand, the numbers would not be perfect. However, that would
at least be something.

By the way, can you share if you are still using pgjdbc + java6? =)

Vladimir


Re: Pre-processing during build

From
Dave Cramer
Date:


Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

On 18 June 2015 at 15:09, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
> except for organizations like mine, where things like this were downloaded a long time ago and bundled with internal releases.

I do understand, the numbers would not be perfect. However that would
be at least something.

By the way, can you share if you are still using pgjdbc + java6? =)

I personally don't use any particular version. I provide professional services to companies that do.
 

Vladimir



Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
John> and that stuff NEVER gets updated until the equipment is replaced.

That is cool. However, that means you don't care if pgjdbc drops
updates for JDK6, do you?

My theory is as follows: if there is a nice "decay of download rate",
then we might have a somewhat more educated guess on "the number of
supported JDKs".

Dave> I provide professional services to companies that do

Ok, let's try supporting JDK6.


So, the next step is to start from
https://github.com/pgjdbc/pgjdbc/pull/322 and try to split it into
several maven submodules.

The interesting question is whether we want those submodules (jdbc4,
jdbc41, jdbc42) to be public (in other words, allow users to depend on
them) or whether we consider them a pure implementation detail.
In the latter case, we might want to have some fixed name of the
module that includes the latest driver.

I would prefer to have a single artifact in the "public API" that
includes all the jdbc versions.

Vladimir


Re: Pre-processing during build

From
"Markus KARG"
Date:
> If that won't work, nobody could write a Java EE program and compile
> and test it against the official javaee.jar, as that one not even
> contains ANY byte code but SOLELY declarations. And that one's an
> official JAR from the makers of Java!

>Sorry, but that doesn't make any sense at all. The "official" JavaEE jar does contain bytecode: interfaces and some
supporting classes like exceptions. 

Wrong. There are lots of classes in that JAR that actually are "broken" from the view of the JLS. When you try to
instantiate them you'll simply get a weird exception telling you that for some of them the byte code is missing. This is
done with several APIs of Java EE as, due to the way it is specified in the particular specifications, it is up to the
vendor to provide them. So you can compile against those explicitly bytecode-dropped classes without any problem, but you
can neither test nor otherwise instantiate them. 

Regards
-Markus



Re: Pre-processing during build

From
"Markus KARG"
Date:
>Sorry, but you're the one that is wrong, it is not only about actually calling methods with types, it is about the
presence or absence of those types when the JVM does decide to resolve the symbolic reference (eg when you reflect the
declared methods). I am about done with this discussion. I think the onus is on you to prove this scheme will work, not
for us to prove it won't work (which we already did). Your proof should not only include simple direct instance access,
but also when using reflection **which is very common with JDBC drivers** (eg connection pools, tools/libraries that
bridge differences in JDBC implementations, etc). 

I agree that this thread is done, because I already provided a proof that my hypothesis works (the link was published
yesterday), and my hypothesis never said that reflection would work. Whether or not reflection is MANDATORY for JDBC
also is a fruitless discussion, as all of you WANT it to be supported. 

>It sounds like you want to trade minor complexity in the build/IDE process for a world of hurt for the users of your
driver. I don't think that is a good way forward. 

I do not. You fear risks that do not exist if you go 100% with the JDBC specification's words, but I accept that you
like to be safe from several uncertainties, so it is OK if we skip my idea and go with a different approach -- even when
I am still convinced that it would be correct and working (but not for things you just WANT to support like reflection). 

-Markus



Re: Pre-processing during build

From
John R Pierce
Date:
On 6/18/2015 12:33 PM, Vladimir Sitnikov wrote:
> John> and that stuff NEVER gets updated until the equipment is replaced.
>
> That is cool. However, that means you don't care if pgjdbc drops
> updates for JDK6, do you?

as long as a legacy version remains available, sure, I dunno why not.
the older stuff is /very/ stable anyways, most of the work is towards
supporting new features of the new versions, is it not?



--
john r pierce, recycling bits in santa cruz



Re: Pre-processing during build

From
Dave Cramer
Date:
OK, I have access logs. They only go back as far as April 12, 2015. A very cursory look shows many downloads of jars I would not have expected. Do we have any tools to summarize them? Preferably command line.

Dave

Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

On 18 June 2015 at 15:33, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
John> and that stuff NEVER gets updated until the equipment is replaced.

That is cool. However, that means you don't care if pgjdbc drops
updates for JDK6, do you?

My theory is as follows: if there is a nice "decay of download rate",
then we might have a bit more educated guess on "the number of
supported JDKs"

Dave> I provide professional services to companies that do

Ok, let's try supporting JDK6.


So, the next step is to start from
https://github.com/pgjdbc/pgjdbc/pull/322 and try to split it into
several maven submodules.

The interesting question is if we want to those submodules (jdbc4,
jdbc41, jdbc42) to be public (in other words, allow users depend on
them) or if we consider them a pure implementation detail.
In the latter case, we might want to have some fixed name of the
module that includes the latest driver.

I would prefer to have a single artifact in the "public API" that
includes all the jdbc versions.

Vladimir

Re: Pre-processing during build

From
Dave Cramer
Date:


Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

On 18 June 2015 at 16:08, John R Pierce <pierce@hogranch.com> wrote:
On 6/18/2015 12:33 PM, Vladimir Sitnikov wrote:
John> and that stuff NEVER gets updated until the equipment is replaced.

That is cool. However, that means you don't care if pgjdbc drops
updates for JDK6, do you?

as long as a legacy version remains available, sure, I dunno why not.    the older stuff is /very/ stable anyways, most of the work is towards supporting new features of the new versions, is it not?


Somewhat... there is also work being done to support new features of Postgres, for instance json and bidirectional copy (not implemented yet, but new in pg). 




--
john r pierce, recycling bits in santa cruz




Re: Pre-processing during build

From
"Markus KARG"
Date:
>It will also make things harder for tools that want to load your driver directly (instead of going through
DriverManager).

The JDBC specification says that an application has to go through DriverManager or DataSource. Going a third way is not
covered by the specification, hence is not JDBC. I would say these tools are not JDBC compliant, but it would be up to
Lance to judge, not me. 
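
For reference, the two code paths being contrasted look roughly like this
(URL and credentials are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    public class LoadPaths {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.setProperty("user", "test");
            props.setProperty("password", "test");
            String url = "jdbc:postgresql://localhost:5432/test";

            // Spec-sanctioned path: DriverManager picks the driver via the
            // META-INF/services/java.sql.Driver entry in the jar.
            Connection viaManager = DriverManager.getConnection(url, props);
            System.out.println(viaManager.getMetaData().getDriverVersion());
            viaManager.close();

            // The "third way" some tools take: instantiate the driver class
            // directly and bypass DriverManager entirely.
            java.sql.Driver driver = new org.postgresql.Driver();
            Connection direct = driver.connect(url, props);
            System.out.println(direct.getMetaData().getDriverVersion());
            direct.close();
        }
    }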



Re: Pre-processing during build

From
Kevin Carr
Date:

In HikariCP they have a java6 version for 1.6 and 1.7, and the main version is for 1.8. This has worked really well for our OSGi installations.

On Thu, Jun 18, 2015, 3:09 PM Dave Cramer <pg@fastcrypt.com> wrote:
OK, I have access logs, They only go back as far as April 12 2015. A very cursory look shows many downloads of jars I would not have expected. Do we have any tools to summarize them ? Preferably command line

Dave

Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

On 18 June 2015 at 15:33, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
John> and that stuff NEVER gets updated until the equipment is replaced.

That is cool. However, that means you don't care if pgjdbc drops
updates for JDK6, do you?

My theory is as follows: if there is a nice "decay of download rate",
then we might have a bit more educated guess on "the number of
supported JDKs"

Dave> I provide professional services to companies that do

Ok, let's try supporting JDK6.


So, the next step is to start from
https://github.com/pgjdbc/pgjdbc/pull/322 and try to split it into
several maven submodules.

The interesting question is if we want to those submodules (jdbc4,
jdbc41, jdbc42) to be public (in other words, allow users depend on
them) or if we consider them a pure implementation detail.
In the latter case, we might want to have some fixed name of the
module that includes the latest driver.

I would prefer to have a single artifact in the "public API" that
includes all the jdbc versions.

Vladimir

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
>Do we have any tools to summarize them ? Preferably command line

I usually use R + data.table + ggplot for that kind of analysis.

If you mean "summarize on postgresql.org server", then "extract jar
name somehow" | sort | uniq -c

Vladimir


Re: Pre-processing during build

From
Dave Cramer
Date:
Well this should scare you ... most notably the not insignificant number of /download/pgjdbc2.jar downloads. This is just one of the access logs; I'll try to find time to aggregate all of them.

     17 /download/jdbc6.3.jar
     13 /download/jdbc6.4.jar
     15 /download/jdbc6.5-1.1.jar
     13 /download/jdbc6.5-1.2.jar
     15 /download/jdbc7.0-1.1.jar
     17 /download/jdbc7.0-1.2.jar
     12 /download/jdbc7.1-1.1.jar
     14 /download/jdbc7.1-1.2.jar
     16 /download/pg72jdbc1.jar
     18 /download/pg72jdbc2.jar
      1 /download/pg72jdbc2.jar%C2%A1%C2%A1%C2%A4%C2%AB%C2%A4%C3%A9%C2%A5%C3%80%C2%A5%C2%A6%C2%A5%C3%B3%C2%A5%C3%AD%C2%A1%C2%BC%C2%A5%C3%89%C2%A4%C2%B7
     20 /download/pg73jdbc1.jar
     16 /download/pg73jdbc2ee.jar
     20 /download/pg73jdbc2.jar
     24 /download/pg73jdbc3.jar
      4 /download/pg74.1jdbc3.jar
      2 /download/pg74.215.jdbc1.jar
      4 /download/pg74.215.jdbc2.jar
      1 /download/pg74.215.jdbc3.jar
     20 /download/pg74.216.jdbc1.jar
     15 /download/pg74.216.jdbc2ee.jar
     15 /download/pg74.216.jdbc2.jar
     31 /download/pg74.216.jdbc3.jar
      1 /download/pg74.216.jdbc3.jar&gt
      2 /download/pg80b1.308.jdbc3.jar
    468 /download/pgjdbc2.jar
      4 /download/postgresql-8.0-310.jdbc3.jar
      4 /download/postgresql-8.0-311.jdbc3.jar
      2 /download/postgresql-8.0-313.jdbc2.jar
      2 /download/postgresql-8.0-313.jdbc3.jar
      2 /download/postgresql-8.0-315.jdbc2ee.jar
      2 /download/postgresql-8.0-315.jdbc2.jar
      2 /download/postgresql-8.0-319.jdbc3.jar
     15 /download/postgresql-8.0-325.jdbc2ee.jar
     20 /download/postgresql-8.0-325.jdbc2.jar
     27 /download/postgresql-8.0-325.jdbc3.jar
      2 /download/postgresql-8.1-404.jdbc2.jar
      1 /download/postgresql-8.1-404.jdbc3.jar
      2 /download/postgresql-8.1-405.jdbc2.jar
      6 /download/postgresql-8.1-405.jdbc3.jar
      2 /download/postgresql-8.1-406.jdbc3.jar
      5 /download/postgresql-8.1-407.jdbc3.jar
      4 /download/postgresql-8.1-408.jdbc2.jar
      3 /download/postgresql-8.1-408.jdbc3.jar
      1 /download/postgresql-8.1-409.jdbc2ee.jar
      4 /download/postgresql-8.1-410.jdbc3.jar
      1 /download/postgresql-8.1-412.jdbc2.jar
      2 /download/postgresql-8.1-412.jdbc3.jar
      1 /download/postgresql-8.1-413.jdbc3.jar
      2 /download/postgresql-8.1-414.jdbc3.jar
     17 /download/postgresql-8.1-415.jdbc2ee.jar
     26 /download/postgresql-8.1-415.jdbc2.jar
     29 /download/postgresql-8.1-415.jdbc3.jar
      6 /download/postgresql-8.2-504.jdbc3.jar
      4 /download/postgresql-8.2-504.jdbc4.jar
      3 /download/postgresql-8.2-505.jdbc4.jar
      2 /download/postgresql-8.2-506.jdbc2.jar
      2 /download/postgresql-8.2-506.jdbc3.jar
      4 /download/postgresql-8.2-506.jdbc4.jar
      1 /download/postgresql-8.2-507.jdbc3g.jar
      2 /download/postgresql-8.2-507.jdbc3.jar
      2 /download/postgresql-8.2-507.jdbc4.jar
      2 /download/postgresql-8.2-508.jdbc3.jar
      4 /download/postgresql-8.2-508.jdbc3.jar%7CJDBC-PostgreSQL
      3 /download/postgresql-8.2-510.jdbc2ee.jar
      2 /download/postgresql-8.2-511.jdbc2.jar
     15 /download/postgresql-8.2-512.jdbc2ee.jar
     19 /download/postgresql-8.2-512.jdbc2.jar
     24 /download/postgresql-8.2-512.jdbc3.jar
     23 /download/postgresql-8.2-512.jdbc4.jar
      1 /download/postgresql-8.2dev-501.jdbc3.jar
      4 /download/postgresql-8.2dev-503.jdbc3.jar
      3 /download/postgresql-8.3-603.jdbc2.jar
     34 /download/postgresql-8.3-603.jdbc4.jar
      1 /download/postgresql-8.3-604.jdbc2ee.jar
     12 /download/postgresql-8.3-604.jdbc3.jar
     12 /download/postgresql-8.3-604.jdbc4.jar
      2 /download/postgresql-8.3-605.jdbc2ee.jar
      2 /download/postgresql-8.3-605.jdbc2.jar
     10 /download/postgresql-8.3-605.jdbc3.jar
      4 /download/postgresql-8.3-605.jdbc4.jar
      2 /download/postgresql-8.3-606.jdbc2ee.jar
      4 /download/postgresql-8.3-606.jdbc2.jar
      6 /download/postgresql-8.3-606.jdbc3.jar
      2 /download/postgresql-8.3-606.jdbc4.jar
     36 /download/postgresql-8.3-607.jdbc2ee.jar
     38 /download/postgresql-8.3-607.jdbc2.jar
     28 /download/postgresql-8.3-607.jdbc3.jar
     34 /download/postgresql-8.3-607.jdbc4.jar
      3 /download/postgresql-8.4-11.jdbc4.jar
     31 /download/postgresql-8.4-701.jdbc3.jar
     17 /download/postgresql-8.4-701.jdbc4.jar
      6 /download/postgresql-8.4-702.jdbc3.jar
     85 /download/postgresql-8.4-702.jdbc4.jar
     80 /download/postgresql-8.4-703.jdbc3.jar
    430 /download/postgresql-8.4-703.jdbc4.jar
      2 /download/postgresql-8.4-704.jdbc4.jar
      1 /download/postgresql-8.4dev-700.jdbc4.jar
      1 /download/postgresql-8.4.jdbc4.jar
     12 /download/postgresql-9.0-801.jdbc3.jar
     24 /download/postgresql-9.0-801.jdbc4.jar
      1 /download/postgresql-9.0-801.jdbc4.jar.
     52 /download/postgresql-9.0-802.jdbc3.jar
     95 /download/postgresql-9.0-802.jdbc4.jar
      7 /download/postgresql-9.1-901.jdbc3.jar
     67 /download/postgresql-9.1-901.jdbc4.jar
      2 /download/postgresql-9.1-902.jdbc3.jar
     25 /download/postgresql-9.1-902.jdbc4.jar
    149 /download/postgresql-9.1-903.jdbc3.jar
    252 /download/postgresql-9.1-903.jdbc4.jar
      6 /download/postgresql-9.2-1000.jdbc4.jar
      9 /download/postgresql-9.2-1001.jdbc4.jar
      2 /download/postgresql-9.2_1002/_/-%7D.jdbc4.jar
      5 /download/postgresql-9.2-1002.jdbc3.jar
   2973 /download/postgresql-9.2-1002.jdbc4.jar
      2 /download/postgresql-9.2-1002.jdbc4.jar&lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
      2 /download/postgresql-9.2-1002.jdbc4.jar&quot;;
      1 /download/postgresql-9.2-1002.jdbc4.jar.sha1
   2102 /download/postgresql-9.2-1003.jdbc4.jar
     77 /download/postgresql-9.2-1004.jdbc3.jar
    162 /download/postgresql-9.2-1004.jdbc41.jar
    526 /download/postgresql-9.2-1004.jdbc4.jar
      1 /download/postgresql-9.3-1002.jdbc41.jar
      4 /download/postgresql-9.3-1100.jdbc3.jar
     59 /download/postgresql-9.3-1100.jdbc41.jar
     56 /download/postgresql-9.3-1100.jdbc4.jar
      8 /download/postgresql-9.3-1101.jdbc3.jar
    494 /download/postgresql-9.3-1101.jdbc41.jar
    284 /download/postgresql-9.3-1101.jdbc4.jar
     68 /download/postgresql-9.3-1102.jdbc3.jar
    851 /download/postgresql-9.3-1102.jdbc41.jar
      1 /download/postgresql-9.3-1102.jdbc41.jar%5C
   4165 /download/postgresql-9.3-1102.jdbc4.jar
      1 /download/postgresql-9.3-1102.jdbc4.jar/com/trend/iwss/jscan/runtime/Report?Reason=invoke+the+java%2Flang%2FSystem.exit%28%29+operation&URI=http%3A%2F%2Fjdbc.postgresql.org%2Fdownload%2Fpostgresql-9.3-1102.jdbc4.jar&Policyname=MMC+Default+Policy
      2 /download/postgresql-9.3-1102.jdbs4.jar
    872 /download/postgresql-9.3-1103.jdbc3.jar
      1 /download/postgresql-9.3-1103.jdbc3.jar/
    944 /download/postgresql-9.3-1103.jdbc41.jar
    330 /download/postgresql-9.3-1103.jdbc4.jar
      3 /download/postgresql-9.4-1000.jdbc4.jar
     80 /download/postgresql-9.4-1200.jdbc41.jar
    163 /download/postgresql-9.4-1200.jdbc4.jar
      2 /download/postgresql-9.4-1201.jbdc4.jar
      3 /download/postgresql-9.4-1201-jdbc41.jar
   2889 /download/postgresql-9.4-1201.jdbc41.jar
      1 /download//postgresql-9.4-1201.jdbc4.jar
   4180 /download/postgresql-9.4-1201.jdbc4.jar
     76 /download/postgresql-9.4-1201.jdbc4.jar/content.jar
     75 /download/postgresql-9.4-1201.jdbc4.jar/p2.index
      3 /download/postgresql-9.4-1201.jdbc.jar
      2 /download/postgresql-.jdbc3.jar
     73 /download/postgresql-jdbc-9.4-1201.src.tar.gz/content.jar
      2 /downloads/postgresql-9.3-1102.jdbc41.jar

Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

On 18 June 2015 at 16:18, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
>Do we have any tools to summarize them ? Preferably command line

I usually use r + data.table + ggplot for that kind of analysis.

If you mean "summarize on postgresql.org server", then "extract jar
name somehow" | sort | uniq -c

Vladimir

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
>not insignificant number of /download/pgjdbc2.jar

Let's pretend they are either crawlers, or they are just not aware of
the fact that jdbc41 works fine for 8.1 as well.
Vladimir


Re: Pre-processing during build

From
"Markus KARG"
Date:
Look at this https://gist.github.com/mkarg/0da7f7dce8d9025511bb please.


-Markus

-----Original Message-----
From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Mark Rotteveel
Sent: Thursday, 18 June 2015 08:38
To: List
Subject: Re: [JDBC] Pre-processing during build

On Wed, 17 Jun 2015 23:02:44 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> I think you not even need reflection, but could simply rely on proxies.
> Should be faster and do the job faster.
>
http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/reflect/Proxy.html
> :-)
>
> An official Java SE standard since v1.3, BTW, to prevent further
> discussions about versions. ;-)

Proxies use reflection (which is why they are part of the
java.lang.reflect package).

Mark





Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
> Look at this https://gist.github.com/mkarg/0da7f7dce8d9025511bb please.

Markus,

Your example here does not apply to a JDBC driver, for the reason
highlighted by Christopher: <<"javac" will complain that (for your
pre-Java-8 drivers) you haven't implemented everything>>.

The problem is _both_ I7 and I8 should be the same interface (they
both should be `java.sql.PreparedStatement`).

We do want to have "PS7 implements java.sql.PreparedStatement" _and_
"PS8 implements java.sql.PreparedStatement".
Can you please show how are you going to cover that?

If we leave "PS7 implements java.sql.PreparedStatement", then javac
would blame us for "not implementing enough methods".

Are you suggesting to use Proxy for all the JDBC interfaces
(Statement, ResultSet, etc, etc)?
I'm afraid that would hit the performance wall.

Well, I thought we had already settled that `java.lang.reflect.Proxy`
is good enough for `Driver`.
However, I do not like the idea of using j.l.r for ResultSet.
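
To make the trade-off concrete, a Proxy-based Driver facade could look
roughly like the sketch below. PGDriver8 is the hypothetical version-specific
class name used earlier in the thread; it is loaded by name so no newer JDBC
types ever appear in the facade's own signatures:

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Method;
    import java.lang.reflect.Proxy;
    import java.sql.Driver;

    public class DriverFacade {
        public static Driver create() throws Exception {
            // Hypothetical version-specific implementation, chosen at runtime.
            final Object delegate =
                    Class.forName("org.postgresql.PGDriver8").newInstance();
            InvocationHandler handler = new InvocationHandler() {
                public Object invoke(Object proxy, Method method, Object[] args)
                        throws Throwable {
                    // Resolve the called method on the delegate and forward it.
                    Method target = delegate.getClass()
                            .getMethod(method.getName(), method.getParameterTypes());
                    return target.invoke(delegate, args);
                }
            };
            return (Driver) Proxy.newProxyInstance(
                    DriverFacade.class.getClassLoader(),
                    new Class<?>[]{Driver.class}, handler);
        }
    }

This keeps reflection on the rarely-called Driver path only; wrapping
ResultSet the same way would put a reflective call on every row and column
access, which is exactly the performance concern above.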

Vladimir


Re: Pre-processing during build

From
"Markus KARG"
Date:

+1 :-)

 

From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Christopher BROWN
Sent: Thursday, 18 June 2015 14:27
To: pgsql-jdbc@postgresql.org
Subject: Re: [JDBC] Pre-processing during build

 

The entry point into a JDBC driver is, unsurprisingly, its implementation of java.sql.Driver (and javax.sql.DataSource if you want).  These are very straightforward and stable interfaces (there's only a new "getParentLogger()" method in 1.7), and they act as factories providing access to Connection objects.  Implementations of Driver, given that they are just factories for Connection objects, should never be on a critical performance path.

 

I see nothing wrong with implementing Driver using java.lang.reflect.Proxy, and if we want to be paranoid about classloading, using reflection (within the implementation of the Proxy-based Driver) to instantiate and invoke methods on a version-specific implementation of Driver.  That way, even when using reflection, discovering an "unresolvable future type" is just impossible.  The Proxy-based Driver would delegate to some PGDriver6 / PGDriver7 / PGDriver8 which could be compiled in steps (as I described in an earlier message on this thread) and not to some PGSuperDriver, where PGDriver8 extends PGDriver7, and PGDriver7 extends PGDriver6, and where PGDriver8 returns (from the "connect" method) a PGConnection8 (extending PGConnection7), PGDriver7 returns PGConnection7, and so on.  So, even using reflection, it would be impossible to work with mismatched versions; the only exception being (for example in Java6 code) doing an explicit Class.forName("PGDriver8") but that sort of code just isn't possible except when the client is incompetent or malicious.  And, as I suggested in my previous message, it requires no code generation, just an extra compilation step per supported version.
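
A hedged sketch of that step-by-step hierarchy (the PGDriver / PGConnection
names are the hypothetical ones above, not real pgjdbc classes; each layer
would live in its own source tree and be compiled with the matching JDK, so
javac verifies completeness per JDBC version):

    import java.util.Properties;

    // Compiled first, with -source/-target 1.6 against the JDK 6 libraries.
    class PGConnection6 { /* JDBC 4.0-level methods */ }
    class PGDriver6 {
        public PGConnection6 connect(String url, Properties info) {
            return new PGConnection6();
        }
    }

    // Compiled next, with JDK 7: adds the JDBC 4.1 surface (e.g. setSchema).
    class PGConnection7 extends PGConnection6 { }
    class PGDriver7 extends PGDriver6 {
        @Override
        public PGConnection7 connect(String url, Properties info) {
            return new PGConnection7();
        }
    }

    // Compiled last, with JDK 8: adds the JDBC 4.2 surface (e.g. SQLType overloads).
    class PGConnection8 extends PGConnection7 { }
    class PGDriver8 extends PGDriver7 {
        @Override
        public PGConnection8 connect(String url, Properties info) {
            return new PGConnection8();
        }
    }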

 

As a side note, about the discussion on interpretations of the JLS, I've not yet encountered any JVM that attempts to fully-resolve all signatures as soon as it loads a class.  Does such a JVM exist?  I would assume that it would have poor performance for classloading (and a high memory overhead for class metadata, such as permgen/metaspace usage) because it would need to be recursive (for each type discovered in a method signature or body, it would need to load that type and all of its referenced types, recursively).  I'd be very interested to know how it deals with circular references (for example A refers to B and vice-versa ; it can't resolve A before B, and vice-versa), such as Object.toString() because String can't be loaded before Object, but Object would need to resolve String.

 

Getting back to the original discussion, I'm neutral on the Ant vs Maven debate, but I'm sure that driver developers would benefit (readability and reliability) from plain old source code, simple class extension and step-by-step compilation to handle API evolution, without relying on JVM behavior to have an all-in-one single implementation that may contain more than is advertised by specific interface versions.  This approach should also keep the build process deterministic and relatively straightforward and be able to produce a single driver artifact compatible with all JDBC versions, with no burden on driver users.


Hope that helps,
Christopher

 

 

On 18 June 2015 at 08:31, Mark Rotteveel <mark@lawinegevaar.nl> wrote:

On Wed, 17 Jun 2015 22:47:23 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> Sorry but you're wrong here.
>
> Vladimir's example was invalid. See
> https://gist.github.com/mkarg/88a89ae0dbffcfb7543e instead.
>
> Resolution will not fail. Even early static resolution won't. Check again
> the chapter of the JLS about the criteria to throw the listed exceptions.
> None of them is met with the corrected example
> https://gist.github.com/mkarg/88a89ae0dbffcfb7543e.
>
> The direct example will NOT fail on Java 7 as the method is not changed,
> but added. The original method is still in place. Both methods exist at
> the same time and differ by signature, hence the linker of the old program
> finds the old method in the new class. No problem at all.
>
> The indirect example will NOT fail on Java 7 as a JRE 7 client will never
> pass in an instance of a JRE 8 type (how should it even know of its
> existence?), and the Java 6 machine executing the invoked method will
> never INSTANTIATE that type so it will not fail. No problem at all.
>
> Still should work. You'd possibly like to set up a proof using the
> corrected example https://gist.github.com/mkarg/88a89ae0dbffcfb7543e. :-)

Sorry, but you're the one that is wrong, it is not only about actually
calling methods with types, it is about the presence or absence of those
types when the JVM does decide to resolve the symbolic reference (eg when
you reflect the declared methods). I am about done with this discussion. I
think the onus is on you to prove this scheme will work, not for us to
prove it won't work (which we already did). Your proof should not only
include simple direct instance access, but also when using reflection
**which is very common with JDBC drivers** (eg connection pools,
tools/libraries that bridge differences in JDBC implementations, etc).

It sounds like you want to trade minor complexity in the build/IDE process
for a world of hurt for the users of your driver. I don't think that is a
good way forward.


Mark



 

Re: Pre-processing during build

From
"Markus KARG"
Date:

Dave,

 

please remember that we're talking about dropping JDK6, which was publicly EOL'ed two years ago, whose Oracle "premier" support ends this year, and whose Oracle "extended" support ends in two and a half years! Even Debian and zOS come with newer JREs. So actually I'd really love to see public statistics on how many actively maintained JDBC applications CANNOT migrate easily to at least JRE 7! Do you have any statistics available? :-)

 

-Markus

 

 

From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Dave Cramer
Sent: Thursday, 18 June 2015 17:11
To: Christopher BROWN
Cc: List
Subject: Re: [JDBC] Pre-processing during build

 

 


Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

 

On 18 June 2015 at 10:56, Christopher BROWN <brown@reflexe.fr> wrote:

If you decided to drop JDK6 support going forward, that wouldn't cause the current JDK6-compatible PostgreSQL driver versions to be deleted, so no-one would be stuck.  Even if new features are added to PostgreSQL (the database or the driver), the JDK6 JDBC API will remain constant and won't have any more or less features than before; there's probably enough expressivity in SQL and driver-specific classes to do pretty much anything anyway. 

 

These are just pros and cons to consider if you're thinking about this (I'm not recommending anything either way, especially as this tends to start flame wars).  Personally, I wouldn't be shocked by such a choice given that PostgreSQL and the driver are open-source and that JDK6 has been EOL for a long time, and that as such, I understand there are limited resources for supporting all possible configurations.

 

Anyway, like I said, even if there was a 9.5 or 9.6 version that was JDK7+, JDK6 users will still be able to use the 9.4 driver.

 

 

 

On 18 June 2015 at 15:52, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:

> even with source/target set to 1.6, "javac" will complain that (for your pre-Java-8 drivers) you haven't implemented everything

I see.
This indeed makes "step-by-step" compilation a good solution.

We might want to drop JDK6 support to make it easier for us and for
the contributors (1 JDK less to install).

Vladimir

 

 

Removing JDK 1.6 is not being contemplated at the moment. While many "hackers" are keen to use the latest greatest features of Java N+1, there is a VERY large population that still runs older JVM's for whatever reason. As I mentioned earlier in this thread the focus of the driver is to provide access to the all supported versions of PostgreSQL( and even older when possible ) to Java users, not to provide the latest Java tools for PostgreSQL. 

 

Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
>JDBC applications CANNOT migrate easily to JRE 7 at least! Do you have any statistics available? :-)

Dave has already shown the statistics.
The number of "jdbc4.jar" downloads is impressive.

We still have to support JRE7 anyway, so supporting JRE6 as well is
not that big a deal. It is a bit painful for developers/contributors,
but if we support multiple JREs, then two or three does not make much
difference.

I suggest we just stop spending our time on that discussion.

We do not depend on other libraries that plan to drop JRE6 support, so
it looks like there is no hurry, other than simplifying the build
system a bit.

Vladimir


Re: Pre-processing during build

From
"Markus KARG"
Date:

I'm fine with dropping 1.6 when 1.9 is out, but why not simply look at last month's JRE6-variant download numbers on the PGJDBC download host? That should clearly tell us whether the number is in any way statistically relevant.

 

BTW, we're maintaining software on 12,000+ machines running at 1,200+ enterprises worldwide. We switched ALL of them from Java 6 to Java 7 last year (even Oracle did remote auto-removes of Java 6, remember?). Nobody complained. No, sorry. A lot of them complained: they wanted to go directly to Java 8! These are real-world numbers taken from an industrial vendor. Just to give some statistical facts. :-)

 

-Markus

 

From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Dave Cramer
Sent: Thursday, 18 June 2015 20:33
To: Vladimir Sitnikov
Cc: Christopher BROWN; List
Subject: Re: [JDBC] Pre-processing during build

 

Vladimir,

 

To be honest I don't really know. The difficult part about this is that we could put up a survey and ask "what version of JVM do you use" I suspect that 1.8 will be overwhelmingly popular. The problem is the subset of people responding may not be representative of the population that is using it. From my experience working with bigger companies they view JDBC as a simple tool. The metaphor that comes to mind is a car with aftermarket wheels. You can't get any more for the car because you spend 2000 on the wheels. Cars come with wheels; similarly PostgreSQL comes with a JDBC driver. It occurs to me that maven actually exacerbates this as you don't even have to come to the site to get the driver now that you can just add the dependency to your pom and it will be automagically downloaded for you.

 

While writing this I am thinking that we might get away with saying we are only going to support 3 JVM versions so currently 1.6-1.8, once 1.9 comes out we can drop 1.6, however at the moment this is just a thought, not a policy. Thoughts ?


Dave Cramer

dave.cramer(at)credativ(dot)ca
http://www.credativ.ca

 

On 18 June 2015 at 11:18, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:

>Removing JDK 1.6 is not being contemplated at the moment

When do you think it is safe to remove JDK 1.6 support?

Vladimir

 

Re: Pre-processing during build

From
"Markus KARG"
Date:
If you downloaded a long time ago then you will never notice us dropping JDK6 support, obviously, as we solely talk about
future feature releases of PGJDBC, not about your "old and already downloaded" software. 

-----Original Message-----
From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of John R Pierce
Sent: Thursday, 18 June 2015 21:06
To: pgsql-jdbc@postgresql.org
Subject: Re: [JDBC] Pre-processing during build

On 6/18/2015 11:59 AM, Vladimir Sitnikov wrote:
>> The difficult part about this is that we could put up a survey and ask "what version of JVM do you use" I suspect
that 1.8 will be overwhelmingly popular
> Can we get some access logs from https://jdbc.postgresql.org/download.html page?
> It can give us an insight on how often people download "old" drivers.

except for organizations like mine, where things like this were
downloaded a long time ago and bundled with internal releases.



--
john r pierce, recycling bits in santa cruz






Re: Pre-processing during build

From
"Markus KARG"
Date:
+1 ok for me

-----Original Message-----
From: pgsql-jdbc-owner@postgresql.org [mailto:pgsql-jdbc-owner@postgresql.org] On Behalf Of Vladimir Sitnikov
Sent: Thursday, 18 June 2015 23:42
To: Markus KARG
Cc: List
Subject: Re: [JDBC] Pre-processing during build

>JDBC applications CANNOT migrate easily to JRE 7 at least! Do you have any statistics available? :-)

Dave has already shown the statistics.
The number of "jdbc4.jar" downloads is impressive.

We still have to support JRE7 anyway, so supporting JRE6 as well is
not that big deal. It is a bit painful for developers/contributors,
but if we support multiple JREs, then two or three does not make much
difference.

I suggest we just stop spending our time on that discussion.

We do not depend on other libraries that plan to drop JRE6 support, so
it looks like there is no hurry other than "simplification of a build
system a bit".

Vladimir





Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Thu, 18 Jun 2015 14:32:51 -0400, Dave Cramer <pg@fastcrypt.com> wrote:
> Vladimir,
>
> To be honest I don't really know. The difficult part about this is that
> we could put up a survey and ask "what version of JVM do you use" I
> suspect that 1.8 will be overwhelmingly popular. The problem is the
> subset of people responding may not be representative of the population
> that is using it. From my experience working with bigger companies they
> view JDBC as a simple tool. The metaphor that comes to mind is a car
> with aftermarket wheels. You can't get any more for the car because you
> spend 2000 on the wheels. Cars come with wheels; similarly PostgreSQL
> comes with a JDBC driver. It occurs to me that maven actually
> exacerbates this as you don't even have to come to the site to get the
> driver now that you can just add the dependency to your pom and it will
> be automagically downloaded for you.

If you use the Sonatype OSS repository to upload the binaries to Maven Central,
then you can query the Maven Central download statistics on the dashboard; see
http://blog.sonatype.com/2010/12/now-available-central-download-statistics-for-oss-projects/

Mark


Re: Pre-processing during build

From
Mark Rotteveel
Date:
On Thu, 18 Jun 2015 18:53:49 +0200, "Markus KARG" <markus@headcrashing.eu>
wrote:
> Regarding your questions and just for curiosity, even being irrelevant
> due to the reflection topic meanwhile: My example also fails even
> without (!) reflection once you remove the inner class and simply have
> the Java 8 method implemented directly by the main class. Hence it
> proves the special restriction for main classes. That restriction is
> the difference between 12.1 (= how an initial class is loaded) and 12.2
> (= how other classes are loaded). It is an optional implementation
> choice of the Oracle JVM to do so: it demands that ALL references from
> the main class to other classes MUST have existing byte code. This
> demand does not exist for classes NOT directly referenced but loaded
> lazily at a later time (following 12.2 rules ONLY but not 12.1 rules).
> Just change my example and you'll see the effect - it fails immediately
> as soon as there is no wrapping inner class anymore decoupling the
> missing classes from the main class.

JLS 12.1 is defined in terms of the normal class loading and
initialization that is described in later sections. They even say "We now
outline the steps the Java Virtual Machine may take to execute Test, as an
example of the loading, linking, and initialization processes that are
described further in later sections.". The only difference I see is section
12.1.4: invoking the main method. If you see other differences, I'd like you
to point them out.

If you look at the stacktrace when the method is directly in the main class,
you'll notice that the JVM itself uses reflection to obtain and execute the
main method. And that use of reflection is what causes it to fail
immediately, as opposed to the case with a separate class.

Mark


Re: Pre-processing during build

From
Vladimir Sitnikov
Date:
I have one more question: what if we use true "pre-processing" during
build? (e.g. https://github.com/raydac/java-comment-preprocessor)

The suggestion is as follows:
1) "AbstractStatement implements java.sql.Statement"
2) JDK8 is used for development
3) During build, AbstractStatement code is pre-processed as per the
"current jdk" (e.g. if compiling under java6, the java7- and java8-only
parts would be stripped out)


The only problem was that we could not literally spell "AbstractStatement
implements java.sql.Statement", as different JDK versions have a
different number of methods. Using a pre-processor solves that.
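
A hedged sketch of what such a pre-processed source could look like. The
//#if ... //#endif directive style is only illustrative of a comment
preprocessor, and the condition name is made up, not taken from pgjdbc:

    import java.sql.SQLException;
    import java.sql.Statement;

    public abstract class AbstractStatement implements Statement {

        // Common implementation, compiled for every target JDK.
        public void setFetchSize(int rows) throws SQLException {
            // ... shared logic ...
        }

        //#if jdk_ge_7
        // JDBC 4.1 methods: kept when building for Java 7+, stripped out by
        // the preprocessor when building for Java 6 so that javac never
        // sees them there.
        public void closeOnCompletion() throws SQLException {
            throw new SQLException("not supported");
        }

        public boolean isCloseOnCompletion() throws SQLException {
            return false;
        }
        //#endif
    }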

It would eliminate the current jdbc3, jdbc3g, jdbc4, ... stuff.
It would make sane class hierarchies possible as well.
Currently we cannot have AbstractPreparedStatement extend
AbstractStatement, since we have multiple different
AbstractJdbc2Statement and AbstractJdbc3Statement classes.

From my point of view the "jdbcXX" packages serve no purpose except allowing
multiple-JRE support. I think having a single "AbstractStatement
implements java.sql.Statement" would be a huge win in terms of ease of
development.

Vladimir


Re: Pre-processing during build

From
Dave Cramer
Date:
I think this is worth pursuing. What can I do to help?


On 5 November 2015 at 19:33, Vladimir Sitnikov <sitnikov.vladimir@gmail.com> wrote:
I have one more question: what if we use true "pre-processing" during
build? (e.g. https://github.com/raydac/java-comment-preprocessor)

The suggestion is as follows:
1) "AbstractStatement implements java.sql.Statement"
2) JDK8 is used for development
3) During build, AbstractStatement code is pre-processed as per
"current jdk" (e.g. if compiling under java6, then java7 and java8
would be skipped out)


The only problem was we could not spell literally "AbstractStatement
implements java.sql.Statement" as different JDK versions have
different number of methods. Using a pre-procesor solves that out.

It would eliminate current jdbc3, jdbc3g, jdbc4, ... stuff.
It will make sane class hierarchies possible as well.
Currently we cannot have AbstractPreparedStatement extends
AbstractStatement since we have multiple different
AbstractJdbc2Statement and AbstractJdbc3Statement.

From my point of view "jdbcXX" packages serve no good except allowing
multiple-jre support. I think having a single "AbstractStatement
implements java.sql.Statement" would be a huge win in terms of ease of
development.

Vladimir