Ahh I see, so basically just merge the two jars together and then let it deploy just the main artifact as normal?
|
|
# ? Jun 24, 2013 23:08 |
|
In that case yes, but what I typically do is execute the Ant stuff/WSDL/code generators in the build phase and have the output directory included in the packaging phase, so I don't have to dick around with a bunch of jars. In fact, if it's just a simple web service you can use the org.codehaus.mojo jaxws-maven-plugin to do all that for you.
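A minimal sketch of that setup, assuming the plugin's wsimport goal and a WSDL under src/main/wsdl (the plugin version and paths are illustrative, not from the thread):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>jaxws-maven-plugin</artifactId>
  <version>1.12</version>
  <executions>
    <execution>
      <goals>
        <goal>wsimport</goal>
      </goals>
      <configuration>
        <!-- every .wsdl in this directory gets classes generated for it -->
        <wsdlDirectory>src/main/wsdl</wsdlDirectory>
        <!-- generated sources are added to the compile source roots,
             so they end up in the normal jar with no extra packaging work -->
        <sourceDestDir>${project.build.directory}/generated-sources/wsimport</sourceDestDir>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The goal binds to generate-sources by default, so the generated code is compiled and packaged along with everything else.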
|
# ? Jun 24, 2013 23:13 |
For some reason when I try to use jaxws-maven-plugin or cxf-codegen-plugin to generate java classes from the wsdl, the net result is not the same as when I use the custom wsdl to java program. So I think I'm stuck using the ant task to invoke their custom conversion utility thing. I tried the maven-shade-plugin to combine the jars but it doesn't seem to be including the target/jar-from-wsdl.jar from my generate-sources phase. It looks like I can include it using the artifactSet but I'm not sure what the groupId & artifactId of my generated jar would be?
|
|
# ? Jun 24, 2013 23:59 |
|
Try putting your jar on the classpath using addjars.
|
# ? Jun 25, 2013 00:10 |
Sedro posted:Try putting your jar on the classpath using addjars. Seems like such a hack, though; is there no way to accomplish this with more widely used plugins?
|
|
# ? Jun 25, 2013 02:37 |
I think I've got it working without using any additional plugins. The custom program that converts the wsdl to java can take an additional argument where you can specify the temp directory it sticks the .class & .java files before it creates the jar. I simply specified this to be ${project.build.outputDirectory}, and it appears to then get included in the build artifact from the default jar packaging.
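That approach can be sketched with maven-antrun-plugin; note the tool jar name and its -tempDir flag below are placeholders for the vendor's conversion utility, which the thread never names:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- "wsdl2java-tool.jar" and "-tempDir" stand in for the
               unnamed third-party utility and its temp-dir argument -->
          <java jar="tools/wsdl2java-tool.jar" fork="true" failonerror="true">
            <arg value="-tempDir"/>
            <!-- point the tool's scratch dir at target/classes so the
                 generated .class files ride along in the default jar -->
            <arg value="${project.build.outputDirectory}"/>
          </java>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>
```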
|
|
# ? Jun 25, 2013 02:49 |
|
You must be new to maven...
|
# ? Jun 25, 2013 02:51 |
Yup, don't have a lot of experience writing my own pom files. Probably would have found this a whole lot sooner had I known I could specify the temp directory of that drat third party utility...
|
|
# ? Jun 25, 2013 03:04 |
|
Oh, I was responding to your first post. Modifying (or passing a property to) the ant script was the proper solution. But you didn't give any info about the ant script so I only had maven to work with.
|
# ? Jun 25, 2013 03:36 |
|
Why does anyone use maven by choice? (serious question) There must be something it's good at doing.
|
# ? Jun 25, 2013 03:51 |
|
Fly posted:Why does anyone use maven by choice? (serious question) There must be something it's good at doing. I use it because I also use OSGi and Maven makes deployment into the OSGi container vastly easier than just dumping dependencies in a hotfolder.
|
# ? Jun 25, 2013 04:00 |
|
This isn't YOSPOS so I just can't point at your post and laugh. Here's just one thing it's good at doing: It finds and downloads loving transitive dependencies for you from the internet.
|
# ? Jun 25, 2013 04:03 |
|
Hard NOP Life posted:This isn't YOSPOS so I just can't point at your post and laugh. Why do people think that's a good thing though? In most environments I don't want my transitive dependencies being updated automatically. In fact, in many corporate environments, you have to get permission to update those dependencies, and you often don't have access to the Internet from your build script. If the latest version of the "loving" dependency introduces bugs, then what? Who has vetted the change? It seems like a big mess rather than a help to me. That's why it's a serious question. edit: The dependency downloading is the one thing that gives me the most grief. What else does it do that might be useful? Fly fucked around with this message at 04:09 on Jun 25, 2013 |
# ? Jun 25, 2013 04:07 |
|
Fly posted:edit: The dependency downloading is the one thing that gives me the most grief. What else does it do that might be useful? It gives you a very widely adopted standard to version your own artifacts. You can run your own nexus and personally vet any artifacts it provides if you want. I used to use it for large-scale internal development projects (SOA systems mostly) and it was a godsend. Just always use Maven 3. Maven 2 was a piece of poo poo from hell. Also deploying FROM a nexus is usually pretty drat simple, which is again what I use it for. I can say "install artifact com.example/poo poo-maker/2.0.1" and it'll go do it. It also has tons of plugins for doing lots of tedious poo poo. The only issue I have with them is that they are usually poorly documented (read the source) and the xml config for them can get wonky fast.
|
# ? Jun 25, 2013 04:22 |
|
trex eaterofcadrs posted:It gives you a very widely adopted standard to version your own artifacts. You can run your own nexus and personally vet any artifacts it provides if you want. So how does it interact with your actual version control solution? Is it some separate entity that you have to keep in sync?
|
# ? Jun 25, 2013 05:04 |
|
Maven doesn't automatically update your dependencies to a newer version unless you tell it to. It just downloads your dependencies as needed, as opposed to checking them out of source control. The only semi-realistic fear is that the Maven repositories you use will go offline, and you can set up your own proxy server to mitigate that. Maven's real problem is its build system. It only interfaces through plugins, so it's a huge ordeal to accomplish simple tasks. Want to copy a file? Read up on the copy-a-file plugin! Oh, there's two. Oh, they're not documented and the configuration isn't consistent with any other plugin.
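For example, a dependency declared with an exact version is resolved to that version forever; the coordinates here are just a period-appropriate illustration:

```xml
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <!-- an exact version: Maven downloads it once into the local cache
       and will never silently swap in a newer release -->
  <version>14.0.1</version>
</dependency>
```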
|
# ? Jun 25, 2013 05:13 |
|
Fly posted:Why do people think that's a good thing though? In most environments I don't want my transitive dependencies being updated automatically. In fact, in many corporate environments, you have to get permission to update those dependencies, and you often don't have access to the Internet from your build script. If you are updating foo 1.0 to 1.1 and it now declares a new hard dependency on bar 1.0, then you cannot choose to omit bar 1.0 without seriously affecting the foo library at runtime (most probably with a ClassNotFoundException). In general, when a library is updated and it also bumps up its dependencies, it's for a good reason and probably won't work with the previous versions (unless it was just a minor revision number, but even then you'd have to verify that yourself). So you can either trust that the author of the lib did the proper regression testing, or do it yourself in order to vet the change. Also, as Sedro said, Maven doesn't automatically update foo 1.0 to 1.1 on its own. There is a misused feature (it was never the default) that lets you set a minimum version and no maximum so Maven will always use the latest version, but that's idiotic for obvious reasons. When you declare a dependency you MUST declare the version that you want. Finally, on the subject of internal build systems, Maven by itself in a corporate setting isn't enough. You typically want to use another system (Nexus, Artifactory, etc.) to handle all the artifacts that you create and consume. This avoids the doomsday scenario of the Maven repos going down, and adds another layer of security: your build server would only require access to the artifact repository, and both can sit behind your firewall. The output of your build system (which ran all of your automated tests, you do have those right?) would be the versioned jars that are then fed into your artifact repo, which can now be referenced anywhere in your organization as a dependency, or if it's the final product then you're done and can publish it for your customers. After that the build process would bump the internal version of your projects to the next number, check that into your source control, and you can start on the next big feature.
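As a sketch, feeding an internal artifact repository is just a distributionManagement entry in the POM plus `mvn deploy`; the repository id and URL below are made up for the example:

```xml
<distributionManagement>
  <repository>
    <!-- id must match a <server> entry with credentials in settings.xml -->
    <id>internal-releases</id>
    <url>https://nexus.example.com/nexus/content/repositories/releases</url>
  </repository>
  <snapshotRepository>
    <id>internal-snapshots</id>
    <url>https://nexus.example.com/nexus/content/repositories/snapshots</url>
  </snapshotRepository>
</distributionManagement>
```

After that, `mvn deploy` on the build server publishes the versioned jar where the rest of the organization can depend on it.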
|
# ? Jun 25, 2013 05:45 |
|
I have no problem with a tool that will download and verify my dependencies. However, I don't want that to happen automatically (which you didn't bold above, and which is the most important modifier in my statement) in many environments. For one thing, I don't want any dependencies changing on me if I need to make a historical build. That is, the old versions of the dependency jars absolutely must be available if I want to make a build to match the one I created a year ago. That's why it's important to include jars in source control. Another thing is that I don't want my build going out to the network unless I tell it to do so. This is another reason not to have automatic updates since those would need to check some network resource. Maybe I should look at Maven 3. It could be nice to grab dependency updates, but only when I ask for them explicitly, such as when I make a new release tag/version in the source control system.
|
# ? Jun 25, 2013 14:02 |
|
Fly posted:Another thing is that I don't want my build going out to the network unless I tell it to do so. This is another reason not to have automatic updates since those would need to check some network resource. Well, Maven will get the latest if you do not specify the version you want. If you define a version for your dependency then it will never update. The nice thing is that if you, say, want to use MyBatis with Spring, any dependencies needed by the Spring version you are using will be installed automatically, and you have locked your version to the one you want. Also, if you have it set up correctly, you will only hit the network on the first add; after it's downloaded once, it resides in the M2 cache.
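The two behaviors being contrasted look like this in a POM; the artifact and versions are illustrative:

```xml
<!-- pinned: always resolves to exactly this version, never updates -->
<dependency>
  <groupId>org.mybatis</groupId>
  <artifactId>mybatis</artifactId>
  <version>3.2.2</version>
</dependency>

<!-- open-ended range: resolves to the newest available release at
     build time, which is the behavior being warned against above -->
<dependency>
  <groupId>org.mybatis</groupId>
  <artifactId>mybatis</artifactId>
  <version>[3.2,)</version>
</dependency>
```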
|
# ? Jun 25, 2013 14:07 |
|
TheresaJayne posted:Well Maven will get the latest if you do not specify the version you want. Cool. Presumably one could prepopulate the cache with versions from source control, yes?
|
# ? Jun 25, 2013 14:54 |
|
Fly posted:Cool. Presumably one could prepopulate the cache with versions from source control, yes? You can do a maven install which will install the jar/war file into the M2 cache for use in another project. quote:The Install Plugin is used during the install phase to add artifact(s) to the local repository. The Install Plugin uses the information in the POM (groupId, artifactId, version) to determine the proper location for the artifact within the local repository.
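On the command line that's the install-file goal; all the coordinates here are made up for the example:

```shell
# install a jar that wasn't built with Maven into the local ~/.m2 cache,
# so other local projects can reference it like any other dependency
mvn install:install-file \
    -Dfile=lib/vendor-tool.jar \
    -DgroupId=com.example \
    -DartifactId=vendor-tool \
    -Dversion=1.0 \
    -Dpackaging=jar
```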
|
# ? Jun 25, 2013 15:00 |
|
Fly posted:I have no problem with a tool that will download and verify my dependencies. However, I don't want that to happen automatically (which you didn't bold above, and which is the most important modifier in my statement) in many environments. For one thing, I don't want any dependencies changing on me if I need to make a historical build. That is, the old versions of the dependency jars absolutely must be available if I want to make a build to match the one I created a year ago. That's why it's important to include jars in source control. Also as TheresaJayne said, it will only hit the network once to download everything and from then on if nothing changes it will never reach out to the network again. But even that isn't a hard requirement and you can seed the initial repository with your jars manually. Fly posted:Cool. Presumably one could prepopulate the cache with versions from source control, yes? That's actually another reason for using Nexus or Artifactory, they let you install dependencies that weren't created with Maven and creates fake POMs for them so that you can refer to them from your projects like any other maven dependency, and it's all transparent. This is super useful for older libs that don't use Maven, commercial libs that don't have their artifacts published on the internet, and internal projects that you don't want to share with the outside world.
|
# ? Jun 25, 2013 15:33 |
|
Hard NOP Life posted:drat it, how thick headed are you? I really would like to understand how it's useful and what can be done to have it bridge the practices of something that might have been an old Ant build, which itself was a conversion from an old gmake build of an old code base that certainly doesn't follow all best practices that would make Maven a natural fit. And all of that in an environment with strict proxies to the Internet or no Internet access for the build. Fly fucked around with this message at 16:09 on Jun 25, 2013 |
# ? Jun 25, 2013 15:40 |
|
Jabor posted:So how does it interact with your actual version control solution? Is it some separate entity that you have to keep in sync? Well, for the most part they're orthogonal. One is the versioning and control of source (which includes the pom.xml maven config file) and the other is versioning and control of build artifacts. Once you tag a version in source control, you can use your continuous integration tool to automatically deploy the build artifacts to maven. So my toolchain is like this:
hg tag $version -> hudson sees new tag and generates build $build_no -> build $build_no passes tests, artifact $version.jar generated -> artifact $version.jar deployed to internal maven nexus
That said, Maven can also host source artifacts so you can have source up in the nexus with your jars and wars and whatever else.
|
# ? Jun 25, 2013 15:43 |
|
trex eaterofcadrs posted:Also deploying FROM a nexus is usually pretty drat simple, which is again what I use it for. I can say "install artifact com.example/poo poo-maker/2.0.1" and it'll go do it. This deserves to be emphasized. Having Maven integrated in your IDE and using it just to download libraries has lots of advantages over downloading the libraries yourself:
- No unzipping and copying; all .jars go straight to where they should go.
- No need to make changes to the classpath; it's automatic.
- Updating a library won't leave you with extreme-xml-0.7.3.jar and extreme-xml-0.7.4.jar in your /lib folder.
- No cleanup at all needed when removing a library.
And the big one in the IDE itself:
- Having the source code to the library and its Javadoc show up in the IDE is usually as simple as clicking a checkbox.
|
# ? Jun 25, 2013 19:01 |
|
So I guess the one big question I have left is around transitive dependencies. Are you relying on everyone who writes your libraries specifying exact versions of their dependencies in order to get hermetic, repeatable builds, or is there some mechanism that ensures you'd always get the same version of whatever transitive dependency even if you're building on a completely different machine a few years down the road?
|
# ? Jun 25, 2013 23:42 |
|
In that case you can inspect your direct dependencies' POMs to make sure they declared a single version of their dependencies. Again, Artifactory and Nexus can help because you can overwrite a POM if it's wrong and the maintainers don't want to fix it. Another thing I've seen is simply to declare the library's transitive dependencies as direct dependencies, which ensures a stable build no matter what. The downside is that now, if you upgrade your dependency, Maven won't tell you if any of its transitive dependencies were updated. A short workaround could be the following: an IDE like NetBeans makes it a snap, since you can remove the transitive dependencies from your POM, let Maven pull in the new ones if they changed, and add them back using a handy context menu without having to edit your POM by hand each time.
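One standard way to pin a transitive version explicitly is a dependencyManagement block, which overrides whatever version a library's own POM asks for; the coordinates are illustrative:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
      <!-- this version wins over any version pulled in transitively,
           so the build resolves the same jar on every machine -->
      <version>1.1.3</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```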
|
# ? Jun 26, 2013 00:48 |
|
Hard NOP Life posted:In that case you can inspect your direct dependencies POMs to make sure they declared a single version of their dependencies. Again Artifactory and Nexus can help because you can overwrite their POM if it's wrong and they don't want to fix it. This is all true and also there's nothing stopping you from rebundling jars. I do this all the time because people don't make OSGi compatible bundles a lot of the time.
|
# ? Jun 26, 2013 01:52 |
|
Hello I am having some trouble obtaining an li item when it has multiple classes. I have verified with the jsoup css validator that my syntax is correct when I write li.class or li.class.class2. code:
code:
DholmbladRU fucked around with this message at 18:56 on Jun 26, 2013 |
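The original code blocks in that post didn't survive, so here is a self-contained sketch of what multi-class selection with jsoup looks like (the HTML and class names are invented; requires the jsoup jar on the classpath):

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.select.Elements;

public class MultiClassSelect {
    public static void main(String[] args) {
        String html = "<ul>"
                    + "<li class='item active'>both classes</li>"
                    + "<li class='item'>one class</li>"
                    + "</ul>";
        Document doc = Jsoup.parse(html);
        // "li.item.active" matches only <li> elements carrying BOTH classes
        Elements hits = doc.select("li.item.active");
        System.out.println(hits.size());   // 1
        System.out.println(hits.text());   // both classes
    }
}
```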
# ? Jun 26, 2013 18:42 |
|
lol you want the Javascript thread!
|
# ? Jun 26, 2013 19:18 |
|
Hard NOP Life posted:lol you want the Javascript thread! Yeah, maybe... Jsoup is a Java package that uses CSS and jQuery-like selectors. I figured out my problem: the HTML being served when I pulled it programmatically was different from what I saw in the browser source.
|
# ? Jun 26, 2013 20:04 |
|
Hahaha I feel like such an rear end now for not reading your post closely enough, I swear I didn't see you mention jsoup. Without that, it looks like a normal JavaScript question. But I'm glad you figured out that it was grabbing a different version of the HTML.
|
# ? Jun 26, 2013 20:33 |
My eyes didn't see the word jsoup when I read that post either
|
|
# ? Jun 26, 2013 21:04 |
|
yeah my bad, hard to see it had anything to do with Java... Are you telling me you guys haven't memorized the syntax of every available Java package?... Anyways, if anyone is familiar with the jsoup package and wants to make some recommendations I'd appreciate it. Currently I am connecting to some website and pulling down data; eventually I will automate this so no one will be watching. At the moment I am not handling any connection, cookies, or referrer (no idea what this is) from the jsoup package. I am simply performing: Document docRem = Jsoup.connect("http://www.google.com/stuff" + var).timeout(10*10000).get(); Is there a better way to handle this connection? Should I 'ping' it first with connect().execute() then check for a status 200? DholmbladRU fucked around with this message at 22:04 on Jun 26, 2013 |
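A sketch of that execute-then-check pattern with jsoup (the URL and user agent string are placeholders; requires the jsoup jar and network access):

```java
import org.jsoup.Connection;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class FetchPage {
    public static void main(String[] args) throws Exception {
        // execute() returns the raw response before parsing,
        // so the status code can be inspected first
        Connection.Response res = Jsoup.connect("http://www.example.com/stuff")
                .userAgent("Mozilla/5.0")  // some sites serve different HTML to unknown clients,
                                           // which may explain browser-vs-program differences
                .timeout(10_000)           // 10 seconds; note the original 10*10000 is 100 seconds
                .execute();
        if (res.statusCode() == 200) {
            Document doc = res.parse();
            System.out.println(doc.title());
        } else {
            System.err.println("unexpected status: " + res.statusCode());
        }
    }
}
```

Setting a user agent is also one plausible fix for HTML that looks different when fetched programmatically.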
# ? Jun 26, 2013 21:12 |
|
DholmbladRU posted:yeah my bad, hard to see it had anything to do with java.. Are you telling me you guys haven't memorized all the available java packages syntax?... Awful uses jSoup to parse the forum's HTML but we handle all the network stuff (including getting the HTML) with the Apache httpConnection library. On the other hand, I have no idea whether it's better for your case than just using jSoup.
|
# ? Jun 27, 2013 12:47 |
|
Sereri posted:Awful uses jSoup to parse the forum's html code but we handle all the network stuff (including getting the html) with the apache httpConnection library. On the other hand I have no ideas whether it's better for your case than just using jSoup. Thanks for the information, I'll do some tests to see which is better for my application.
|
# ? Jun 27, 2013 14:46 |
|
Sereri posted:Awful uses jSoup to parse the forum's html code but we handle all the network stuff (including getting the html) with the apache httpConnection library. On the other hand I have no ideas whether it's better for your case than just using jSoup. Tip - you should take a look at Volley for making http requests. For one, it properly manages swapping between apache on the old and httpurlconnection on the new OSes.
|
# ? Jun 28, 2013 10:07 |
|
Doctor w-rw-rw- posted:Tip - you should take a look at Volley for making http requests. For one, it properly manages swapping between apache on the old and httpurlconnection on the new OSes. It's on the list, like so many things. I guess I'll create a ticket for it.
|
# ? Jun 28, 2013 12:32 |
|
I'm trying to write a simple QR code reader using ZXing; I want to use the BufferedImageLuminanceSource that they already wrote for their client, so I built the javase jar, added it to my -classpath, and tried import com.google.zxing.client.j2se.BufferedImageLuminanceSource;, but it tells me that the package com.google.zxing.client.j2se doesn't exist when I try to compile my code. If I run jar tf javase-2.3-SNAPSHOT.jar, I can see com/google/zxing/client/j2se/BufferedImageLuminanceSource.class. What am I doing wrong?
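One likely cause is that only the javase jar is on the classpath while the core jar (which the javase classes depend on) is missing, or the compile and runtime classpaths differ. A sketch of compiling against both, assuming Unix-style paths and the : separator (use ; on Windows):

```shell
# both the zxing core and javase jars need to be on the classpath;
# jar names here mirror the ones mentioned in the post
javac -cp core-2.3-SNAPSHOT.jar:javase-2.3-SNAPSHOT.jar QrReader.java
java  -cp .:core-2.3-SNAPSHOT.jar:javase-2.3-SNAPSHOT.jar QrReader
```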
|
# ? Jun 30, 2013 00:08 |
|
|
What operating system / IDE (if any) are you using?
|
# ? Jun 30, 2013 00:10 |