baquerd
Jul 2, 2007

by FactsAreUseless

Exigent posted:

Is the OP still accurate? Should I watch that video series to begin focusing more on Java?

If you want to start off as a godless, Eclipse-using heathen, they are probably still OK. That way, you will be able to better appreciate the ways IntelliJ is better when you eventually switch.

Really, though, they're a fine introduction to things, and the official Java tutorials aren't bad either: https://docs.oracle.com/javase/tutorial/


FateFree
Nov 14, 2003

Spring STS really made Eclipse much less of a headache. It's basically Eclipse with all the necessary Spring/Git plugins and AspectJ support. Works wonderfully with Spring Boot projects.

a few DRUNK BONERS
Mar 25, 2016

Let's say you're making a game, and you have an abstract Item class, and a bunch of item classes that extend Item:

code:
public class Apple extends Item {...}

public class Wood extends Item {...}
Now you want to display a list of all items to the user, in a dropdown menu or something (plus be able to create objects from that menu). I thought of putting them all in an enum:

code:
import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;

public enum ItemEnum {
    APPLE("Apple", Apple.class), WOOD("Wood", Wood.class);

    private final String name;
    // "class" is a reserved word, so the field needs a different name
    private final Class<? extends Item> itemClass;

    private ItemEnum(String name, Class<? extends Item> itemClass) {
        this.name = name;
        this.itemClass = itemClass;
    }

    public Item constructObject() {
        try {
            // Look up and invoke the public no-arg constructor reflectively
            Constructor<? extends Item> ctor = itemClass.getConstructor();
            return ctor.newInstance();
        } catch (NoSuchMethodException | SecurityException | InstantiationException
                | IllegalAccessException | IllegalArgumentException | InvocationTargetException e) {
            e.printStackTrace();
        }
        return null;
    }
}
Now I have to remember to add an item to the enum whenever I create a new class. This seems obviously wrong to me. What is the correct way to do this?

FateFree
Nov 14, 2003

If all the items are in the same package, you can use reflection to get a list of all the classes in that package. If not, you can have all your items implement an interface (or just search for classes that extend the Item base class) and scan multiple packages for classes that implement it. Reflection in general would probably make this easier for you.
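
A rough sketch of that classpath-scanning approach, assuming the org.reflections library is on the classpath (it is not part of the JDK) and a made-up package name of com.example.items:

Java code:
import java.lang.reflect.Modifier;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import org.reflections.Reflections;

public class ItemScanner {
    public static List<Item> createAllItems() throws ReflectiveOperationException {
        // Scan the (hypothetical) package for every subtype of Item
        Reflections reflections = new Reflections("com.example.items");
        Set<Class<? extends Item>> itemClasses = reflections.getSubTypesOf(Item.class);

        List<Item> items = new ArrayList<>();
        for (Class<? extends Item> itemClass : itemClasses) {
            if (Modifier.isAbstract(itemClass.getModifiers())) {
                continue; // skip abstract intermediate classes
            }
            // Assumes each concrete item has a public no-arg constructor
            items.add(itemClass.getConstructor().newInstance());
        }
        return items;
    }
}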

Fergus Mac Roich
Nov 5, 2008

Soiled Meat

a few DRUNK BONERS posted:

Let's say you're making a game, and you have an abstract Item class, and a bunch of item classes that extend Item... [enum code snipped; quoted in full above]

Now I have to remember to add an item to the enum whenever I create a new class. This seems obviously wrong to me. What is the correct way to do this?

Sounds like your items are actually data.

smackfu
Jun 7, 2004

Does anyone know why the heck property files aren't UTF-8 encoded by default? Just seems weird considering the rest of Java.

more like dICK
Feb 15, 2010

This is inevitable.
I poked around a bit without finding a good answer. Legacy or compatibility concerns maybe.

I did find tickets about the issue dating back to 1997, though, which is funny: https://bugs.openjdk.java.net/browse/JDK-4084757

The eventual solution (landing in Java 9): https://bugs.openjdk.java.net/browse/JDK-8043553
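
A pre-Java 9 workaround is to load the file through a UTF-8 Reader yourself instead of relying on Properties' default ISO-8859-1 stream handling. A minimal sketch (the file path is made up):

Java code:
import java.io.IOException;
import java.io.Reader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;

public class Utf8Properties {
    public static Properties load(String path) throws IOException {
        Properties props = new Properties();
        // Properties.load(Reader) uses whatever charset the Reader was opened with
        try (Reader reader = Files.newBufferedReader(Paths.get(path), StandardCharsets.UTF_8)) {
            props.load(reader);
        }
        return props;
    }
}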

smackfu
Jun 7, 2004

I have a bit of a general best practices question...

I have two model objects, a Department and an Employee. Each one is persisted in a different database table. The employee table has a department id.

How should the Employee object represent the department? It seems like it should have a Department object, but instantiating a Department requires me to also query the department table. This seems like it could easily get out of hand if there are a lot of related tables. OTOH, I could have the department id as an attribute of the Employee, and a getDepartment() method to retrieve the full object (with some caching), but that seems to be introducing implementation details into my models.

How does Hibernate/JPA deal with this?

Volguus
Mar 3, 2009

smackfu posted:

I have a bit of a general best practices question...

I have two model objects, a Department and an Employee. Each one is persisted in a different database table. The employee table has a department id.

How should the Employee object represent the department? It seems like it should have a Department object, but instantiating a Department requires me to also query the department table. This seems like it could easily get out of hand if there are a lot of related tables. OTOH, I could have the department id as an attribute of the Employee, and a getDepartment() method to retrieve the full object (with some caching), but that seems to be introducing implementation details into my models.

How does Hibernate/JPA deal with this?

Lazy fetching. Whenever you call getDepartment(), the Department table will be queried automatically, but not before. Careful though, the transaction still needs to be open at that point. If you know in advance that getDepartment() will also be needed for a given employee, you can eager-fetch the department in the JPQL/HQL query (a join fetch).
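
Roughly what that looks like with JPA annotations; the entity and column names below just mirror the example in the question, so treat it as a sketch rather than a complete mapping:

Java code:
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;

@Entity
public class Employee {
    @Id
    private Long id;

    // Lazy association: Department is not queried until getDepartment()
    // is called, and only while the session/transaction is still open.
    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "department_id")
    private Department department;

    public Department getDepartment() {
        return department;
    }
}

// Eager-fetching in the query when you know you'll need the department anyway:
// List<Employee> employees = entityManager
//     .createQuery("select e from Employee e join fetch e.department", Employee.class)
//     .getResultList();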

Zorro KingOfEngland
May 7, 2008

Lazy Loading, which can be specified in the hbm.xml file or via annotations. Basically, if you query for an Employee, Hibernate will only run a query that looks like this:

code:
SELECT * FROM Employee WHERE EmployeeID=1
Then later if you call employee.getDepartment(), it'll run a separate query that looks like this:

code:
SELECT * FROM Department WHERE DepartmentID=5
But if you never invoke that method, the second query will never be run. This saves your database a bit of work initially, but could result in multiple round trips depending on how your program is using the lazily loaded fields.

smackfu
Jun 7, 2004

So Employee has both department_id (always present) and department (null until accessed) fields? I guess that's what is tripping me up.

Hargrimm
Sep 22, 2011

W A R R E N

smackfu posted:

So Employee has both department_id (always present) and department (null until accessed) fields? I guess that's what is tripping me up.

With Hibernate, the entity just has a department field, which is annotated with the name of the DB column containing the id (the foreign key) to use when fetching.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

smackfu posted:

So Employee has both department_id (always present) and department (null until accessed) fields? I guess that's what is tripping me up.

There are two ways you can do it. The first is to have both department and department_id fields, yes. The problem is that Hibernate can only have one set of getter/setter functions actually mapped through to any particular database column. If you have both department and department_id fields, then one of them needs insert="false" update="false" attributes in the .hbm.xml mapping, or the equivalent annotations. And then you need to make sure that you are not trying to do inserts/updates on the field that's been set as read-only, and be sure that the two don't get out of sync.

The solution I've arrived at in the past is to have the property (the ID) be the "master" and have the entity ("department") be the read-only one, since when you call setDepartment it definitely has the appropriate Department object already and you can get the id from that, whereas if you call setDepartmentId then it doesn't have the Department and you would have to call the database. But I don't know if this is actually the right way; it's entirely possible Hibernate is smart enough to mark the entity for lazy-loading if you change the underlying foreign key.

The other way you can do it is to just have the entity ("department"). You know how Hibernate can lazy-load fields? It also knows when you call a getter to get the primary key on an entity to which you have a foreign-key. So if you call "getDepartment().getId()" it will just return the "department_id" foreign key instead of making a database call. If you're writing from scratch, I really think this is the better way.
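
For reference, a rough annotation-based sketch of that first approach (both fields mapped, with the entity side read-only); the names mirror the running Employee/Department example and aren't meant as the one true mapping:

Java code:
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;

@Entity
public class Employee {
    @Id
    private Long id;

    // The "master": the raw foreign-key value, writable.
    @Column(name = "department_id")
    private Long departmentId;

    // The entity view over the same column, marked read-only so the two
    // mappings don't fight over who writes department_id.
    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "department_id", insertable = false, updatable = false)
    private Department department;
}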

The key takeaway here, by the way, is that Hibernate is hooking your getters/setters, and what they actually do is totally different from how the raw code looks. I wouldn't advise putting a bunch of logic in getters/setters unless you set up some tests first to be sure they behave how you want. You should also probably default to going through the getters/setters rather than touching the underlying fields directly, so that Hibernate has its chance to hook things appropriately (it may be OK to deviate from this, but again, test).

I typically have very anemic data classes; I generally try to follow the MVC pattern even for non-web applications. The model will just be data holders that closely mirror the database structure. Then I have some "controller" that works with them and presents a "view" of some kind (perhaps an HTTP response for an API-like application). Teachers don't like it, but I think that's much more sane than true "object oriented" programming where everything has to know how to interact with everything else and you end up with an exponential blowout of the amount of code you need to maintain. The benefits/limitations here tilt pretty heavily towards the "anemic" model for most modern applications. After all, if your code is small then who cares, and if you are large then scalability becomes important. Apart from architectural purism, why would you care whether the main business logic lives in a controller or right in the class file, if it works and is maintainable?

quote:

Benefits

Clear separation between logic and data[3] (procedural programming).
Works well for simple applications.
Results in stateless logic, which facilitates scaling out.
Avoids the need for a complex OO-Database mapping layer.

Liabilities

Logic cannot be implemented in a truly object-oriented way.
Violation of the encapsulation and information hiding principles.
Needs a separate business layer to contain the logic otherwise located in a domain model. It also means that domain model's objects cannot guarantee their correctness at any moment, because their validation and mutation logic is placed somewhere outside (most likely in multiple places).
Needs a service layer when sharing domain logic across differing consumers of an object model.
Makes a model less expressive.
https://en.wikipedia.org/wiki/Anemic_domain_model

Paul MaudDib fucked around with this message at 03:45 on Jan 11, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Somehow I think I may be the only person who prefers XML configuration over annotations for Hibernate? Has nobody else tried to map the same entity to two different formats before, say one is JSON that you take in over the wire and the other is Hibernate that you store in your database?

Carbon dioxide
Oct 9, 2012

I have a question about Java web templating engines.

I know of four of them: JSP, Apache Velocity, Freemarker, and Thymeleaf.

Having somewhere between a little bit and a lot of experience with each of them, as far as I can see, they're all incredibly similar, with the main difference being the syntax. Each has some equivalent of 'if' and 'for each' tags and each one allows you to insert variables from your Java code.

So why is it people recommend moving to the younger technologies such as Thymeleaf? People will say things like: "lol if you're still using JSP in 2017". One reason I've heard is that JSP uses a bit more server memory than the others but that doesn't seem too big of a deal either.

So, what makes the younger template engines such as Thymeleaf better than the older ones?

E: fixed grammar

Carbon dioxide fucked around with this message at 11:28 on Jan 21, 2017

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe
I believe that you can unit test the newer technologies, which you can't really do with JSPs, but I still use JSP :shrug:

CPColin
Sep 9, 2003

Big ol' smile.
The part I really liked about Thymeleaf was that the template can be valid HTML, so our designers could return to them and make changes, even after all the processing hooks were added. You can't do that with JSP.

ToxicSlurpee
Nov 5, 2003

-=SEND HELP=-


Pillbug

CPColin posted:

The part I really liked about Thymeleaf was that the template can be valid HTML, so our designers could return to them and make changes, even after all the processing hooks were added. You can't do that with JSP.

This is a big one. The other snag is that JSP has some real XSS issues that I don't think will ever go away. JSP is a pretty old technology that hasn't aged well.

Apache Velocity hasn't been updated since 2010 so it's safe to say that one is dead.

Data Graham
Dec 28, 2009

📈📊🍪😋



In my experience, and I may be completely misunderstanding how it works, but I've spent years trying to make my way through the impenetrable fog of 15-year-old documentation and incoherent developer forum posts, and would love to be corrected, but...

JSP tells you to do database operations and queries in the templating layer (JSTL/EL), e.g. <sql:query>. Which is fine if you just want to do simple queries and loop through the results and spit them out into HTML. But the moment you try to do anything more complex, like I dunno doing some math operations on your query results, or formatting strings, or doing file manipulations, you have to pass that data into the scriptlet space (<% %>). Which is a completely separate world from the EL layer, with its own namespace and its own context and its own everything. And never the twain shall meet.

You can't pass complex variables (like query result rows) to or from scriptlets. The best you can do is read attributes from the pageContext and set them back as attributes into the pageContext after you're done with your scriptlet, which makes for insanely clunky and repetitious code. Or at least I've found no simple way to do it; you have to cast and prep and attribute-pass every individual value separately.

People writing how-tos always say you're never supposed to do any serious complex code in scriptlets; you're supposed to write beans and build DAOs and stuff to feed the templates from the back-end. Which is great, that's how I'd much rather do it, if it weren't a massive amount of code to write in itself. But JSP is very insistent that you do your <sql:query> stuff in the EL layer, not in the back-end. So basically there are two competing philosophies battling it out: simple query magic in the EL template, and full-fledged DAO modeling in the Java layer. And JSP doesn't help you out at all if you're doing the latter; you're basically starting from scratch and may as well use some other MVC framework like Struts or something and forget about the EL templating stuff entirely.

Again, I could be completely missing some huge major piece of the puzzle. But I've been fighting with JSP for one project of mine for years now and it makes me weep every time I want to do something as simple as write my gently massaged query results to a logfile, and I cannot believe anyone would want anyone to go through it the way I have, though for the life of me that's the only conclusion I can draw from the docs/community.

geeves
Sep 16, 2004

Paul MaudDib posted:

Somehow I think I may be the only person who prefers XML configuration over annotations for Hibernate? Has nobody else tried to map the same entity to two different formats before, say one is JSON that you take in over the wire and the other is Hibernate that you store in your database?

I don't mind it. But it's pretty much all I've used. Also we're stuck on Hibernate 3. We basically just have a small script that will generate the XML and the Bean from the table.

But we started using more JSON via REST to deliver content, so I'm using more SQL and Hibernate's result transformers, which I personally find much more useful for what I've been working on because it's all self-contained in one method.
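
A rough sketch of that SQL-plus-transformer pattern with Hibernate 3's ResultTransformer API; the EmployeeSummary DTO (a plain bean with name/departmentName setters), the table names, and the column aliases are all made up:

Java code:
import java.util.List;
import org.hibernate.Session;
import org.hibernate.transform.Transformers;

public class EmployeeSummaryDao {

    @SuppressWarnings("unchecked")
    public List<EmployeeSummary> findSummaries(Session session) {
        return session
                .createSQLQuery("select e.name as name, d.name as departmentName "
                        + "from employee e join department d on d.id = e.department_id")
                // Maps each result row onto the DTO via its alias-named setters
                .setResultTransformer(Transformers.aliasToBean(EmployeeSummary.class))
                .list();
    }
}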

Carbon dioxide
Oct 9, 2012

Data Graham posted:

In my experience, and I may be completely misunderstanding how it works, but I've spent years trying to make my way through the impenetrable fog of 15-year-old documentation and incoherent developer forum posts, and would love to be corrected, but...

[rest of post snipped; quoted in full above]

Thanks for the effort post.

The context in which I used JSPs was a CMS framework, so it's relatively simple to get a new page working. Then again, more often than not I have to do a lot of manipulating in the page controller class to make the right data available to the JSP.

venutolo
Jun 4, 2003

Dinosaur Gum
This isn't a Java question, but involves Jenkins so I figure this is the best place to put it.

We had been using Mercurial and Kiln for as long as I've been at this job. The development workflow did not involve using branching, so all changes were made to the "default" branch (which basically is the equivalent of master for a Git repo, if you are not familiar with Mercurial). Every time a change was pushed to the Kiln repo, something (a script, I think) triggered a new Maven build in Jenkins. Then when a developer was done with the work for a new version of the project, the developer would do a release build through Jenkins by clicking on the "Perform Maven Release" link in the Jenkins project. Since all changes were made to one branch, we never had to deal with building different branches in Jenkins.

We have recently migrated to Git and GitHub Enterprise and adopted a development workflow that includes branching. For a given GitHub repo and Jenkins project, Jenkins is currently configured to build all branches, and the Jenkins GitHub plugin registers a webhook for the GitHub repo that is triggered on pushes. (I believe that webhook just tells Jenkins that something has been pushed and then Jenkins polls the repo.) So every time a change is pushed to any branch in the repo it triggers a build in Jenkins, which is good.

The issue we have run into is that when you manually start a build from Jenkins, via Jenkins' "Build Now" or "Perform Maven Release" buttons, it just builds from the most recent push to the repo and not necessarily the most recent push to the master branch. This has not been a problem, since we rarely use "Build Now" (non-release builds are triggered automatically), and when we use the "Perform Maven Release" button, it is after we've merged whatever development branch(es) into master, so that the merge into master is the most recent commit across all branches in the repo.

We only came across this problem when someone merged a development branch into master to prepare for a release, made a commit to another development branch, and then did a Maven release in Jenkins. When that Jenkins release build kicked off, the commit into the other development branch was the most recent commit, so Jenkins made a release from that branch and not master.

So, ideally what I'd like is that Jenkins builds after pushes to all branches (edit: to be clear, I mean to build only the branch which has been pushed to), but when using "Build Now" or "Perform Maven Release" the user has to specify a branch to build. I can set the Jenkins project to be "parameterized" with a String parameter for the branch to build that defaults to "master" and then use that parameter in the Git "Branches to build" option. Then when the user does a build manually in Jenkins, they must set the branch, and the default value is "master". However, with this setting, the automatic builds of non-master branches stop, as nothing supplies the branch parameter and it defaults to master.

I feel like this scenario of wanting to automatically build all branches but wanting to specify which branch to use when doing a manual build can't be too rare. My Google-fu has not led me to a solution. So does anyone have a similar setup that works, or any suggestions for something that accomplishes the same goals?

venutolo fucked around with this message at 17:57 on Jan 24, 2017

BabyFur Denny
Mar 18, 2003
I actually don't see the point in building all branches on every push. That only works if your build time is low, you don't have many branches or don't push that often. Why not have one job that only builds the branch that was pushed to / specified manually, and another one that does a nightly build of all branches?

venutolo
Jun 4, 2003

Dinosaur Gum

BabyFur Denny posted:

I actually don't see the point in building all branches on every push. That only works if your build time is low, you don't have many branches or don't push that often. Why not have one job that only builds the branch that was pushed to / specified manually, and another one that does a nightly build of all branches?

My apologies if it wasn't clear and I've edited my post. In terms of automatic builds, I only want to build the branch that was just pushed, not all branches on any push to any branch. As it is now, that happens and I'm happy with it. GitHub notifies Jenkins that there was a push. Jenkins polls the repo and sees that it has already built the most recent change to all but the just-pushed branch, and then does a build for that branch. Our build times are low and we typically don't have more than a very few branches for any repo.

I'm specifically interested if anyone has experience with Jenkins and GitHub Enterprise and the associated plugins that has set up something similar as I have been unable to figure out a configuration that works as I want.

poemdexter
Feb 18, 2005

Hooray Indie Games!

College Slice
You should have separate jobs per branch, and you can set the branch in the Jenkins job config. Use this for automatically doing builds. We only watch the development branch and the release branch for automatic builds, and each has its own job.



Here's the Continuous Integration/build engineering/devops thread: https://forums.somethingawful.com/showthread.php?threadid=3695559

venutolo
Jun 4, 2003

Dinosaur Gum

poemdexter posted:

You should have separate jobs per branch and you can set the branch in Jenkins job config. Use this for automatically doing builds. We only watch the development branch and the release branch for automatic builds and each has it's own job.



Here's the Continuous Integration/build engineering/devops thread: https://forums.somethingawful.com/showthread.php?threadid=3695559

Thanks. I was hoping not to have to do separate Jenkins jobs for different branches.

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe

venutolo posted:

Thanks. I was hoping not to have to do separate Jenkins jobs for different branches.

lol I have like 50 Jenkins jobs, one for each release branch.

poemdexter
Feb 18, 2005

Hooray Indie Games!

College Slice

venutolo posted:

Thanks. I was hoping not to have to do separate Jenkins jobs for different branches.

When you create a new job, there's an option to clone a previous job. Use that, change one string, poof you're done.

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope
I can't find a library that does 3D delaunay triangulation in Java and I refuse to believe one doesn't exist.

Look man, I just want to input a List<3DPoint> or something to a method and get a List<Tetrahedron> back.

I also want the library to have a maven dependency I can just poop into my pom.xml.

Lastly, I want a pony.

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

Wheany posted:

I can't find a library that does 3D delaunay triangulation in Java and I refuse to believe one doesn't exist.

Look man, I just want to input a List<3DPoint> or something to a method and get a List<Tetrahedron> back.

I also want the library to have a maven dependency I can just poop into my pom.xml.

Lastly, I want a pony.

Patches welcome!

poemdexter
Feb 18, 2005

Hooray Indie Games!

College Slice

Wheany posted:

I can't find a library that does 3D delaunay triangulation in Java and I refuse to believe one doesn't exist.

Look man, I just want to input a List<3DPoint> or something to a method and get a List<Tetrahedron> back.

I also want the library to have a maven dependency I can just poop into my pom.xml.

Lastly, I want a pony.

I don't know much about 3d delaunay triangulation, but maybe this will help? http://download.java.net/media/java3d/javadoc/1.5.1/com/sun/j3d/utils/geometry/Triangulator.html

Maven dependency: https://mvnrepository.com/artifact/java3d/j3d-core-utils/1.5.1

pony:

poemdexter fucked around with this message at 21:25 on Jan 31, 2017

Carbon dioxide
Oct 9, 2012


I think your pony inherits from the wrong class.

baka kaba
Jul 19, 2003

PLEASE ASK ME, THE SELF-PROFESSED NO #1 PAUL CATTERMOLE FAN IN THE SOMETHING AWFUL S-CLUB 7 MEGATHREAD, TO NAME A SINGLE SONG BY HIS EXCELLENT NU-METAL SIDE PROJECT, SKUA, AND IF I CAN'T PLEASE TELL ME TO
EAT SHIT

But pony extends vehicle, I'm not seeing a problem here??

Data Graham
Dec 28, 2009

📈📊🍪😋



Looks like a mixin to me.

Mindisgone
May 18, 2011

Yeah, well you know...
That's just like, your opinion man.

a few DRUNK BONERS posted:

Let's say you're making a game, and you have an abstract Item class, and a bunch of item classes that extend Item... [enum code snipped; quoted in full earlier in the thread]

Now I have to remember to add an item to the enum whenever I create a new class. This seems obviously wrong to me. What is the correct way to do this?

You actually want to use an Item interface instead of extending the base class. Generally you should try to program to an interface, not an implementation. Then in your logic you can always refer to the Item interface instead of a concrete class.
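
A bare-bones sketch of what that looks like, reusing the Apple example from the question (each type would live in its own file):

Java code:
// Item.java
public interface Item {
    String getName();
}

// Apple.java
public class Apple implements Item {
    @Override
    public String getName() {
        return "Apple";
    }
}

// Elsewhere, the rest of the code only ever depends on the Item interface:
// List<Item> inventory = Arrays.asList(new Apple(), new Wood());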

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope

poemdexter posted:

I don't know much about 3d delaunay triangulation, but maybe this will help? http://download.java.net/media/java3d/javadoc/1.5.1/com/sun/j3d/utils/geometry/Triangulator.html

That looks like the wrong kind of triangulator. I guess tetrahedralization is actually the correct term, I am dumb.

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope
I have decided to use https://github.com/visad/visad/blob/master/core/src/visad/Delaunay.java, with its lovely 1998 vintage code.

Not surprisingly, it doesn't seem to be available on Maven.

I just.. really don't want to write this code myself.

Jo
Jan 24, 2005

:allears:
Soiled Meat
I've got a method which operates on a parallel Java 8 stream.

If I do this:

code:
// data is a double[] field elsewhere in the class
public void elementOp_i(UnaryOperator<Double> op) {
    data = Arrays.stream(data).map(x -> op.apply(x)).toArray();
}
It seems to work, but I'm wondering if there's a way to avoid that re-assignment to data, since it should be possible to just modify all the values in-place.

Something like Arrays.stream(data).map(op).toArray(data)?

I see stuff like collect(Collectors.toList()), but that seems to apply when mapping to a new collection. I just want to operate in place, and using only .map() doesn't seem to do it.

CPColin
Sep 9, 2003

Big ol' smile.
For that, you probably either can't use an array or you can't use a stream.


The Laplace Demon
Jul 23, 2009

"Oh dear! Oh dear! Heisenberg is a douche!"
I don't think it's possible to get around needing the index. This is the kind of thing I've written or seen for these problems:
Java code:
IntStream.range(0, data.length).forEach(i -> data[i] = op.apply(data[i]));
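
For what it's worth, java.util.Arrays also has setAll/parallelSetAll helpers that express the same index-based in-place update. A small sketch, assuming data is a double[] field as in the original post:

Java code:
import java.util.Arrays;
import java.util.function.UnaryOperator;

public class InPlaceOps {
    private double[] data;

    public void elementOpInPlace(UnaryOperator<Double> op) {
        // Overwrites each slot of data in place; Arrays.parallelSetAll is the
        // parallel variant if that matters here.
        Arrays.setAll(data, i -> op.apply(data[i]));
    }
}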
