fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
Ahh I see, so basically just merge the two jars together and then let it deploy just the main artifact as normal?

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe
In that case yes, but what I typically do is execute the ant stuff/wsdl/code generators in the build phase and have the output directory included in the packaging phase so I don't have to dick around with a bunch of jars. In fact, if it's just a simple Web Service you can use the org.codehaus.mojo jaxws-maven-plugin to do all that for you.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
For some reason when I try to use jaxws-maven-plugin or cxf-codegen-plugin to generate java classes from the wsdl, the net result is not the same as when I use the custom wsdl to java program. So I think I'm stuck using the ant task to invoke their custom conversion utility thing.

I tried the maven-shade-plugin to combine the jars but it doesn't seem to be including the target/jar-from-wsdl.jar from my generate-sources phase. It looks like I can include it using the artifactSet but I'm not sure what the groupId & artifactId of my generated jar would be?

Sedro
Dec 31, 2008
Try putting your jar on the classpath using addjars.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Sedro posted:

Try putting your jar on the classpath using addjars.

Seems like such a hack though; is there no way to accomplish this with more widely used plugins?

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
I think I've got it working without using any additional plugins. The custom program that converts the wsdl to java can take an additional argument where you can specify the temp directory it sticks the .class & .java files before it creates the jar. I simply specified this to be ${project.build.outputDirectory}, and it appears to then get included in the build artifact from the default jar packaging.
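A minimal sketch of that setup, assuming the vendor tool can be invoked through exec-maven-plugin; the main class, WSDL path, and argument order here are illustrative stand-ins for the actual custom utility, not its real interface. The key idea is just pointing its temp/output directory at ${project.build.outputDirectory} so the generated classes ride along in the default jar:

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.2.1</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals><goal>java</goal></goals>
      <configuration>
        <!-- hypothetical entry point of the vendor wsdl-to-java tool -->
        <mainClass>com.example.WsdlToJava</mainClass>
        <arguments>
          <argument>src/main/wsdl/service.wsdl</argument>
          <!-- emit the generated .class files straight into the normal output dir -->
          <argument>${project.build.outputDirectory}</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>
```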

Sedro
Dec 31, 2008
You must be new to maven...

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
Yup, don't have a lot of experience writing my own pom files. Probably would have found this a whole lot sooner had I known I could specify the temp directory of that drat third party utility...

Sedro
Dec 31, 2008
Oh, I was responding to your first post.

Modifying (or passing a property to) the ant script was the proper solution. But you didn't give any info about the ant script so I only had maven to work with.

Fly
Nov 3, 2002

moral compass
Why does anyone use maven by choice? (serious question) There must be something it's good at doing.

trex eaterofcadrs
Jun 17, 2005
My lack of understanding is only exceeded by my lack of concern.

Fly posted:

Why does anyone use maven by choice? (serious question) There must be something it's good at doing.

I use it because I also use OSGi and Maven makes deployment into the OSGi container vastly easier than just dumping dependencies in a hotfolder.

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe
This isn't YOSPOS, so I can't just point at your post and laugh.

Here's just one thing it's good at doing: It finds and downloads loving transitive dependencies for you from the internet.

Fly
Nov 3, 2002

moral compass

Hard NOP Life posted:

This isn't YOSPOS so I just can't point at your post and laugh.

Here's just one thing it's good at doing: It finds and downloads loving transitive dependencies for you from the internet.

Why do people think that's a good thing though? In most environments I don't want my transitive dependencies being updated automatically. In fact, in many corporate environments, you have to get permission to update those dependencies, and you often don't have access to the Internet from your build script.

If the latest version of the "loving" dependency introduces bugs, then what? Who has vetted the change? It seems like a big mess rather than a help to me. That's why it's a serious question.

edit: The dependency downloading is the one thing that gives me the most grief. What else does it do that might be useful?

Fly fucked around with this message at 04:09 on Jun 25, 2013

trex eaterofcadrs
Jun 17, 2005
My lack of understanding is only exceeded by my lack of concern.

Fly posted:

edit: The dependency downloading is the one thing that gives me the most grief. What else does it do that might be useful?

It gives you a very widely adopted standard to version your own artifacts. You can run your own nexus and personally vet any artifacts it provides if you want. I used to use it for large-scale internal development projects (SOA systems mostly) and it was a godsend. Just always use Maven 3. Maven 2 was a piece of poo poo from hell. Also deploying FROM a nexus is usually pretty drat simple, which is again what I use it for. I can say "install artifact com.example/poo poo-maker/2.0.1" and it'll go do it.

It also has tons of plugins for doing lots of tedious poo poo. The only issue I have with them is that they are usually poorly documented (read the source) and the xml config for them can get wonky fast.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

trex eaterofcadrs posted:

It gives you a very widely adopted standard to version your own artifacts. You can run your own nexus and personally vet any artifacts it provides if you want.

So how does it interact with your actual version control solution? Is it some separate entity that you have to keep in sync?

Sedro
Dec 31, 2008
Maven doesn't automatically update your dependencies to a newer version unless you tell it to.

It just downloads your dependencies as needed, as opposed to checking them out of source control. The only semi-realistic fear is that the maven repositories you use will go offline, and you can set up your own proxy server to mitigate that.

Maven's real problem is as a build system. Everything goes through plugins, so it's a huge ordeal to accomplish simple tasks. Want to copy a file? Read up on the copy-a-file plugin! Oh, there are two. Oh, they're not documented and the configuration isn't consistent with any other plugin.
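The proxy setup mentioned above is usually a `<mirror>` entry in ~/.m2/settings.xml that routes every repository through an internal repository manager; the URL below is a placeholder for your own Nexus or Artifactory instance:

```xml
<settings>
  <mirrors>
    <mirror>
      <id>internal-mirror</id>
      <!-- placeholder URL for your internal repository manager -->
      <url>http://nexus.example.com/repository/maven-public/</url>
      <!-- mirror every remote repository, including central -->
      <mirrorOf>*</mirrorOf>
    </mirror>
  </mirrors>
</settings>
```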

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe

Fly posted:

Why do people think that's a good thing though? In most environments I don't want my transitive dependencies being updated automatically. In fact, in many corporate environments, you have to get permission to update those dependencies, and you often don't have access to the Internet from your build script.

If the latest version of the "loving" dependency introduces bugs, then what? Who has vetted the change? It seems like a big mess rather than a help to me. That's why it's a serious question.
:psyduck: This is such a flawed argument and I'll explain why. All of the grievances and doubts you bring up are valid and need to be addressed but they need to be solved by your process and not your tools. Maven simplifies the mechanical and tedious process of downloading dependencies, checking that the hashes match, and including them in your final build output.

If you are updating foo 1.0 to 1.1 and it now declares a new hard dependency on bar 1.0, then you cannot choose to omit bar 1.0 without seriously affecting the foo library at runtime (most probably with a ClassNotFoundException). In general, when a library is updated and it also bumps up its dependencies, it's for a good reason and it probably won't work with the previous versions (unless it was just a minor revision bump, but even then you'd have to verify that yourself). So you can either trust that the author of the lib did the proper regression testing, or do it yourself in order to vet the change.

Also, as Sedro said, Maven doesn't automatically update foo 1.0 to 1.1 on its own. There is a misused feature (it was never the default) that lets you set a minimum and no maximum version so Maven will always use the latest version, but that's idiotic for obvious reasons. When you declare a dependency you MUST declare the version that you want.

Finally, on the subject of internal build systems, Maven by itself in a corporate setting isn't enough. You typically want to use another system (Nexus, Artifactory, etc.) to handle all the artifacts that you create and consume. This avoids the doomsday scenario of the maven repos going down, and adds another layer of security: your build server would only require access to the artifact repository, and both can sit behind your firewall.

The output of your build system (that ran all of your automated tests, you do have those right?) would be the versioned jars that are then fed into your artifact repo, which can now be referenced anywhere in your organization as a dependency, or if it's the final product then you're done and can publish it for your customers. After that the build process would bump the internal version of your projects to the next number and check that into your source control and you can start on the next big feature.
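To make the pinned-version point above concrete: a normal dependency declaration names one exact version, while the discouraged range syntax is what lets Maven float to newer releases. The com.example coordinates below are placeholders:

```xml
<!-- pinned: Maven will always resolve exactly this version -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>14.0.1</version>
</dependency>

<!-- range (discouraged): any version >= 1.0 may be picked up at resolution time -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>some-lib</artifactId>
  <version>[1.0,)</version>
</dependency>
```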

Fly
Nov 3, 2002

moral compass
I have no problem with a tool that will download and verify my dependencies. However, I don't want that to happen automatically (which you didn't bold above, and which is the most important modifier in my statement) in many environments. For one thing, I don't want any dependencies changing on me if I need to make a historical build. That is, the old versions of the dependency jars absolutely must be available if I want to make a build to match the one I created a year ago. That's why it's important to include jars in source control.

Another thing is that I don't want my build going out to the network unless I tell it to do so. This is another reason not to have automatic updates since those would need to check some network resource.

Maybe I should look at Maven 3. It could be nice to grab dependency updates, but only when I ask for them explicitly, such as when I make a new release tag/version in the source control system.

TheresaJayne
Jul 1, 2011

Fly posted:

Another thing is that I don't want my build going out to the network unless I tell it to do so. This is another reason not to have automatic updates since those would need to check some network resource.

Maybe I should look at Maven 3. It could be nice to grab dependency updates, but only when I ask for them explicitly, such as when I make a new release tag/version in the source control system.

Well, Maven will get the latest if you do not specify the version you want.

If you define a version for your dependency then it will never update. But the nice thing is that if you want to use, say, MyBatis with Spring, any dependencies needed by the Spring version you are using will be installed automatically, and you have locked your version to the one you want.

Also, if you have it set up correctly you will only hit the network on the first add; after it's downloaded once, it resides in the M2 cache.
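The cache behavior is easy to exercise from the command line: dependency:go-offline pre-fetches everything a build needs, and the -o flag then forces Maven to resolve purely from the local cache.

```shell
# pre-download all dependencies and plugins into ~/.m2/repository
mvn dependency:go-offline

# subsequent builds can run without touching the network
mvn -o package
```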

Fly
Nov 3, 2002

moral compass

TheresaJayne posted:

Well, Maven will get the latest if you do not specify the version you want.

If you define a version for your dependency then it will never update. But the nice thing is that if you want to use, say, MyBatis with Spring, any dependencies needed by the Spring version you are using will be installed automatically, and you have locked your version to the one you want.

Also, if you have it set up correctly you will only hit the network on the first add; after it's downloaded once, it resides in the M2 cache.

Cool. Presumably one could prepopulate the cache with versions from source control, yes?

TheresaJayne
Jul 1, 2011

Fly posted:

Cool. Presumably one could prepopulate the cache with versions from source control, yes?

You can do a maven install which will install the jar/war file into the M2 cache for use in another project.

quote:

The Install Plugin is used during the install phase to add artifact(s) to the local repository. The Install Plugin uses the information in the POM (groupId, artifactId, version) to determine the proper location for the artifact within the local repository.

The local repository is the local cache where all artifacts needed for the build are stored. By default, it is located within the user's home directory (~/.m2/repository) but the location can be configured in ~/.m2/settings.xml using the <localRepository> element.
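For a jar that didn't come out of a Maven build, install:install-file is the goal that seeds the local repository; you choose the coordinates yourself, and the ones below are placeholders:

```shell
mvn install:install-file \
    -Dfile=third-party-lib.jar \
    -DgroupId=com.example \
    -DartifactId=third-party-lib \
    -Dversion=1.0 \
    -Dpackaging=jar
```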

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe

Fly posted:

I have no problem with a tool that will download and verify my dependencies. However, I don't want that to happen automatically (which you didn't bold above, and which is the most important modifier in my statement) in many environments. For one thing, I don't want any dependencies changing on me if I need to make a historical build. That is, the old versions of the dependency jars absolutely must be available if I want to make a build to match the one I created a year ago. That's why it's important to include jars in source control.
drat it, how thick headed are you? You're fixated on some imaginary scenario that can only happen by following bad practices. If you declare a dependency on version 1 of some lib today and then two years in the future you revert to today's revision, the POM will still declare the same lib version 1, and Maven will download that and its declared dependencies as they were at that point in time, not the current ones. It's not necessary to include your jars in source control.

Also as TheresaJayne said, it will only hit the network once to download everything and from then on if nothing changes it will never reach out to the network again. But even that isn't a hard requirement and you can seed the initial repository with your jars manually.

Fly posted:

Cool. Presumably one could prepopulate the cache with versions from source control, yes?

That's actually another reason for using Nexus or Artifactory: they let you install dependencies that weren't created with Maven and create fake POMs for them so that you can refer to them from your projects like any other maven dependency, and it's all transparent. This is super useful for older libs that don't use Maven, commercial libs that don't have their artifacts published on the internet, and internal projects that you don't want to share with the outside world.

Fly
Nov 3, 2002

moral compass

Hard NOP Life posted:

drat it, how thick headed are you?
[edit: sorry] I'm glad some people find Maven helpful when they can set up all the best practices. Some of the features look nice.

I really would like to understand how it's useful and what can be done to have it bridge the practices of something that might have been an old Ant build, which itself was a conversion from an old gmake build of an old code base that certainly doesn't follow all best practices that would make Maven a natural fit. And all of that in an environment with strict proxies to the Internet or no Internet access for the build.

Fly fucked around with this message at 16:09 on Jun 25, 2013

trex eaterofcadrs
Jun 17, 2005
My lack of understanding is only exceeded by my lack of concern.

Jabor posted:

So how does it interact with your actual version control solution? Is it some separate entity that you have to keep in sync?

Well, for the most part they're orthogonal. One is the versioning and control of source (which includes the pom.xml maven config file) and the other is versioning and control of build artifacts. Once you tag a version in source control, you can use your continuous integration tool to automatically deploy the build artifacts to maven.

so my toolchain is like this

hg tag $version -> hudson sees new tag and generates build $build_no -> build $build_no passes tests, artifact $version.jar generated -> artifact $version.jar deployed to internal maven nexus

That said, Maven can also host source artifacts so you can have source up in the nexus with your jars and wars and whatever else.

Max Facetime
Apr 18, 2009

trex eaterofcadrs posted:

Also deploying FROM a nexus is usually pretty drat simple, which is again what I use it for. I can say "install artifact com.example/poo poo-maker/2.0.1" and it'll go do it.

This deserves to be emphasized. Having Maven integrated in your IDE and using it just to download libraries has lots of advantages over downloading the libraries yourself:

- No unzipping and copying, all .jars go straight to where they should go.
- No need to make changes to the classpath, it's automatic.
- Updating a library won't leave you with extreme-xml-0.7.3.jar and extreme-xml-0.7.4.jar in your /lib folder.
- No cleanup at all needed when removing a library.

And the big one in the IDE itself:

- Having the source code to the library and its Javadoc show up in the IDE is usually as simple as clicking a checkbox.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
So I guess the one big question I have left is around transitive dependencies. Are you relying on everyone who writes your libraries specifying exact versions of their dependencies in order to get hermetic, repeatable builds, or is there some mechanism that ensures you'd always get the same version of whatever transitive dependency even if you're building on a completely different machine a few years down the road?

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe
In that case you can inspect your direct dependencies' POMs to make sure they declared a single version of their dependencies. Again, Artifactory and Nexus can help because you can overwrite their POM if it's wrong and they don't want to fix it.

Another thing I've seen is simply to declare the library's transitive dependencies as direct dependencies, which ensures a stable build no matter what. The downside is that now if you upgrade your dependency, Maven won't tell you if any of its transitive dependencies were updated.

A short workaround could be the following: an IDE like NetBeans makes it a snap, since you can remove the transitive dependencies from your POM, let Maven pull in the new ones if they changed, and add them back using a handy context menu without having to edit your POM by hand each time.
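Pinning a transitive dependency without promoting it to a direct one is usually done in <dependencyManagement>: in Maven 3, a version managed there takes precedence over whatever version would otherwise arrive transitively. The coordinates below are placeholders:

```xml
<dependencyManagement>
  <dependencies>
    <!-- forces this version even when the lib only arrives transitively -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>transitive-lib</artifactId>
      <version>2.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```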

trex eaterofcadrs
Jun 17, 2005
My lack of understanding is only exceeded by my lack of concern.

Hard NOP Life posted:

In that case you can inspect your direct dependencies' POMs to make sure they declared a single version of their dependencies. Again, Artifactory and Nexus can help because you can overwrite their POM if it's wrong and they don't want to fix it.

Another thing I've seen is simply to declare the library's transitive dependencies as direct dependencies, which ensures a stable build no matter what. The downside is that now if you upgrade your dependency, Maven won't tell you if any of its transitive dependencies were updated.

A short workaround could be the following: an IDE like NetBeans makes it a snap, since you can remove the transitive dependencies from your POM, let Maven pull in the new ones if they changed, and add them back using a handy context menu without having to edit your POM by hand each time.


This is all true and also there's nothing stopping you from rebundling jars. I do this all the time because people don't make OSGi compatible bundles a lot of the time.

DholmbladRU
May 4, 2006
Hello, I am having some trouble obtaining an li element when it has multiple classes. I have verified with the jsoup CSS validator that my syntax is correct when I write li.class or li.class.class2.


code:
<ul class="user-info">
  <li class="class-one classtwo">stuff</li>
</ul>
stuff I tried:
code:
doc.getElementsByClass(".class-one");
doc.getElementsByClass("li.class-one");
doc.getElementsByClass("li.class-one.classtwo");

doc.select("ul") <--- returns too many elements and I can't nest the selector with `.select(".user-info")`
Never mind, for some reason the html is completely different when I access it via java code than when I access it in FF/Chrome.

DholmbladRU fucked around with this message at 18:56 on Jun 26, 2013
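The selector mix-up above is worth untangling: jsoup's getElementsByClass takes a bare class name (no dot, no tag prefix), while select takes a full CSS selector, so the dotted compound form only works in the latter. A minimal sketch against an inline copy of the markup, assuming jsoup is on the classpath:

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class SelectorDemo {
    public static void main(String[] args) {
        Document doc = Jsoup.parse(
            "<ul class=\"user-info\"><li class=\"class-one classtwo\">stuff</li></ul>");

        // getElementsByClass wants the bare class name, no leading dot
        System.out.println(doc.getElementsByClass("class-one").text());  // stuff

        // select takes a CSS selector, so the dotted compound form works here
        System.out.println(doc.select("li.class-one.classtwo").text());  // stuff

        // selectors can also be scoped to the enclosing ul
        System.out.println(doc.select("ul.user-info > li").size());      // 1
    }
}
```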

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe
lol you want the Javascript thread!

DholmbladRU
May 4, 2006

Hard NOP Life posted:

lol you want the Javascript thread!

Yeah maybe... Jsoup is a Java package that uses CSS and jQuery-like selectors. I figured out my problem: the html being generated when I pulled it programmatically was different from what I saw in the browser source.

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe
Hahaha, I feel like such an rear end now for not reading your post closely enough, I swear I didn't see you mention jsoup. Without that, it looks like a normal javascript question. But I'm glad you figured out that it was grabbing a different version of the html.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
My eyes didn't see the word jsoup when I read that post either :)

DholmbladRU
May 4, 2006
Yeah, my bad; it was hard to see it had anything to do with Java. Are you telling me you guys haven't memorized every Java package's syntax?...


Anyways, if anyone is familiar with the jsoup package and wants to make some recommendations I'd appreciate it. Currently I am connecting to some website and pulling down data; eventually I will automate this so no one will be watching. At the moment I am not handling any connection, cookies, or referrer (no idea what this is) from the jsoup package. I am simply performing:

Document docRem = Jsoup.connect("http://www.google.com/stuff" + var).timeout(10*10000).get();

Is there a better way to handle this connection? Should I 'ping' it first with connect().execute() then check for a status 200?

DholmbladRU fucked around with this message at 22:04 on Jun 26, 2013
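On the status-check question: execute() returns a Response whose status code you can inspect, and you can then parse that same response rather than making a second request. A hedged sketch, assuming jsoup is on the classpath; the URL is a placeholder and error handling is minimal:

```java
import org.jsoup.Connection;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class FetchSketch {
    public static void main(String[] args) throws Exception {
        Connection.Response res = Jsoup.connect("http://www.example.com/stuff")
                .timeout(10000)            // 10 seconds; note 10*10000 would be 100 seconds
                .userAgent("Mozilla/5.0")  // some sites serve different html to unknown clients
                .ignoreHttpErrors(true)    // don't throw on 4xx/5xx so we can check the code
                .execute();

        if (res.statusCode() == 200) {
            // parse the body we already fetched; no second request needed
            Document doc = res.parse();
            System.out.println(doc.title());
        } else {
            System.out.println("got status " + res.statusCode());
        }
    }
}
```

Setting an explicit user agent is also one of the usual suspects when the html fetched from code differs from what a browser shows.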

Sereri
Sep 30, 2008

awwwrigami

DholmbladRU posted:

Yeah, my bad; it was hard to see it had anything to do with Java. Are you telling me you guys haven't memorized every Java package's syntax?...


Anyways, if anyone is familiar with the jsoup package and wants to make some recommendations I'd appreciate it. Currently I am connecting to some website and pulling down data; eventually I will automate this so no one will be watching. At the moment I am not handling any connection, cookies, or referrer (no idea what this is) from the jsoup package. I am simply performing:

Document docRem = Jsoup.connect("http://www.google.com/stuff" + var).timeout(10*10000).get();

Is there a better way to handle this connection? Should I 'ping' it first with connect().execute() then check for a status 200?

Awful uses jSoup to parse the forum's html code but we handle all the network stuff (including getting the html) with the Apache HttpConnection library. On the other hand, I have no idea whether it's better for your case than just using jSoup.

DholmbladRU
May 4, 2006

Sereri posted:

Awful uses jSoup to parse the forum's html code but we handle all the network stuff (including getting the html) with the Apache HttpConnection library. On the other hand, I have no idea whether it's better for your case than just using jSoup.

Thanks for the information Ill do some tests to see which is better for my application.

Doctor w-rw-rw-
Jun 24, 2008

Sereri posted:

Awful uses jSoup to parse the forum's html code but we handle all the network stuff (including getting the html) with the Apache HttpConnection library. On the other hand, I have no idea whether it's better for your case than just using jSoup.

Tip - you should take a look at Volley for making http requests. For one, it properly manages swapping between Apache HttpClient on older OS versions and HttpURLConnection on newer ones.

Sereri
Sep 30, 2008

awwwrigami

Doctor w-rw-rw- posted:

Tip - you should take a look at Volley for making http requests. For one, it properly manages swapping between Apache HttpClient on older OS versions and HttpURLConnection on newer ones.

It's on the list, like so many things. I guess I'll create a ticket for it.

Opinion Haver
Apr 9, 2007

I'm trying to write a simple QR code reader using ZXing; I want to use the BufferedImageLuminanceSource that they already wrote for their client, so I built the javase jar, added it to my -classpath, and tried import com.google.zxing.client.j2se.BufferedImageLuminanceSource;, but it tells me that the package com.google.zxing.client.j2se doesn't exist when I try to compile my code. If I run jar tf javase-2.3-SNAPSHOT.jar, I can see com/google/zxing/client/j2se/BufferedImageLuminanceSource.class. What am I doing wrong?
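One thing worth double-checking for the symptom above is that the jar actually reaches both the compile and runtime classpaths, and that ZXing's core jar is present too, since the javase module builds on classes from core. These commands are a sketch, with filenames taken from the post and the core jar name assumed; the separator is : on Unix and ; on Windows:

```shell
# compile with both zxing jars on the classpath (core is assumed to be needed alongside javase)
javac -cp core-2.3-SNAPSHOT.jar:javase-2.3-SNAPSHOT.jar Main.java

# run with the jars plus the current directory on the classpath
java -cp core-2.3-SNAPSHOT.jar:javase-2.3-SNAPSHOT.jar:. Main
```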

Tesseraction
Apr 5, 2009

What operating system / IDE (if any) are you using?
