Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed
I don't see how this has anything at all to do with what git-annex is designed for.

aerique
Jul 16, 2008
To weigh in on the Git + Dropbox combination: at first glance it seems like a good idea, then you ask for opinions on a forum, people give you well-founded arguments that it's a stupid thing to do, so you don't do it and keep looking for alternatives.

And they're right. You shouldn't use it for your company's code base, BUT I've been using this combo for years for personal projects and holy poo poo is it convenient (and it hasn't failed me yet).

At first I initialized a bare repo on Dropbox and pushed and pulled from that, but that is actually a bad idea. However, having a repo somewhere else (e.g. GitHub) and cloning that into a Dropbox folder and working from there is very convenient.

(I do this while developing for Linux, OS X and Windows.)
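Concretely, the clone-into-Dropbox setup is just this in the shell. Everything below is a hypothetical sketch: a local bare repo stands in for the hosted remote (in reality something like git@github.com:me/myproject.git), and a temp directory stands in for the Dropbox folder.

```shell
# Stand-ins: $REMOTE plays the hosted repo, $SYNCED plays ~/Dropbox.
SYNCED=$(mktemp -d)
REMOTE=$(mktemp -d)/origin.git
git init --bare "$REMOTE"

# Clone the remote *into* the synced folder and work from there.
git clone "$REMOTE" "$SYNCED/myproject"
cd "$SYNCED/myproject"
git config user.name "Demo"
git config user.email "demo@example.com"
git commit --allow-empty -m "initial commit"

# Pushes still go to the real remote; the sync service just mirrors
# the working tree and the .git directory between machines.
git push origin HEAD
```

The point is that Dropbox carries the uncommitted state between machines, while history still lives on the real remote.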

wolffenstein
Aug 2, 2002
 
Pork Pro
as long as you don't use the sharing features of your preferred cloud sync software anywhere within your development directories, it's perfectly fine for personal development needs.

down with slavery
Dec 23, 2013
STOP QUOTING MY POSTS SO PEOPLE THAT AREN'T IDIOTS DON'T HAVE TO READ MY FUCKING TERRIBLE OPINIONS THANKS

wolffenstein posted:

as long as you don't use the sharing features of your preferred cloud sync software anywhere within your development directories, it's perfectly fine for personal development needs.

Why not? You can share an entire directory and any browser-side technologies (js, html, css) will work just fine.

wolffenstein
Aug 2, 2002
 
Pork Pro
Your VCS (and you) will go nuts once modifications are made in a shared folder. Imagine trying to work on separate branches. If you aren't using a VCS, it works fine until two or more people edit a file at the same time.

My terminology might be confusing: most cloud sync services allow you to mark a folder as shared so that others can edit it. Sharing a read-only link is perfectly fine; then you're the only one who can edit the files.

wolffenstein fucked around with this message at 22:22 on Jan 21, 2014

wwb
Aug 17, 2004

aerique posted:

To weigh in on the Git + Dropbox combination: at first glance it seems like a good idea, then you ask for opinions on a forum, people give you well-founded arguments that it's a stupid thing to do, so you don't do it and keep looking for alternatives.

And they're right. You shouldn't use it for your company's code base, BUT I've been using this combo for years for personal projects and holy poo poo is it convenient (and it hasn't failed me yet).

Point taken but I fail to see how git + dropbox is more convenient than, say, bitbucket using git.

aerique
Jul 16, 2008

wwb posted:

Point taken but I fail to see how git + dropbox is more convenient than, say, bitbucket using git.

I don't know, I've never used bitbucket. I was just putting down an opinion against people saying Git + Dropbox should never be done.

Huragok
Sep 14, 2011
Just spin up a $5/mo. Digital Ocean droplet to make somewhere to sync to if you don't want source hosted by Github, bitbucket etc.

evensevenone
May 12, 2001
Glass is a solid.

wwb posted:

Point taken but I fail to see how git + dropbox is more convenient than, say, bitbucket using git.

If you use multiple machines, git can get a little iffy. Say you're on machine A and you push some changes. Later, on machine B, you were working on that branch and forgot to fetch. You commit some changes and then decide to rebase. Then you push, but because you had rebased you had to do a git push --force (it wasn't a fast-forward). Now whatever you had on machine A is lost (or will be lost once you pull), and git won't have warned you because you used --force.

This really bites you if you set up your remotes with mirror=push, since then you don't even need --force to do non-fast-forward pushes.

Having a single working directory/repo that is synced through other means (e.g. Dropbox) prevents that problem, because git is only dealing with a single working directory/index. I don't think it's a good idea per se, but there is a gap there that is filled by syncing rather than using git 100%.
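The failure mode described above can be reproduced end-to-end. The following is a self-contained sketch with hypothetical paths and commit messages; a local bare repo stands in for the shared remote, and two clones stand in for the two machines.

```shell
set -e
top=$(mktemp -d)
git init --bare "$top/origin.git"
git -C "$top/origin.git" symbolic-ref HEAD refs/heads/master  # pin default branch

# Machine A: clone, commit twice, push.
git clone -q "$top/origin.git" "$top/machineA"
cd "$top/machineA"
git config user.name "A"; git config user.email "a@example.com"
git commit -q --allow-empty -m "base"
git branch -m master
git commit -q --allow-empty -m "A's fix"
git push -q origin master

# Machine B: its clone is stale; it never fetched "A's fix".
git clone -q "$top/origin.git" "$top/machineB"
cd "$top/machineB"
git config user.name "B"; git config user.email "b@example.com"
git reset -q --hard HEAD~1           # simulate the stale state: only "base"
git commit -q --allow-empty -m "B's rebased work"
git push -q --force origin master    # succeeds with no warning...

# ...and "A's fix" is gone from the shared history:
git --git-dir="$top/origin.git" log --oneline --all
```

After the forced push, the shared repo's history contains only "base" and "B's rebased work"; "A's fix" is unreferenced, exactly the silent loss described above.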

aerique
Jul 16, 2008
Yeah, it's nice that you don't have to push changes when you're in the middle of something and need to reach it from different locations (though not at once), say when going from home to work.

Like I said: convenient, nothing more.

wwb
Aug 17, 2004

evensevenone posted:

If you use multiple machines, git can get a little iffy. Say you're on machine A and you push some changes. Later, on machine B, you were working on that branch and forgot to fetch. You commit some changes and then decide to rebase. Then you push, but because you had rebased you had to do a git push --force (it wasn't a fast-forward). Now whatever you had on machine A is lost (or will be lost once you pull), and git won't have warned you because you used --force.

This really bites you if you set up your remotes with mirror=push, since then you don't even need --force to do non-fast-forward pushes.

Having a single working directory/repo that is synced through other means (e.g. Dropbox) prevents that problem, because git is only dealing with a single working directory/index. I don't think it's a good idea per se, but there is a gap there that is filled by syncing rather than using git 100%.

I'm probably just old school, and used to working on 3+ computers, but I have never had this problem outside of being too drunk to remember to push things.

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

evensevenone posted:

If you use multiple machines, git can get a little iffy. Say you're on machine A and you push some changes. Later, on machine B, you were working on that branch and forgot to fetch. You commit some changes and then decide to rebase. Then you push, but because you had rebased you had to do a git push --force (it wasn't a fast-forward). Now whatever you had on machine A is lost (or will be lost once you pull), and git won't have warned you because you used --force.

This really bites you if you set up your remotes with mirror=push, since then you don't even need --force to do non-fast-forward pushes.

Having a single working directory/repo that is synced through other means (e.g. Dropbox) prevents that problem, because git is only dealing with a single working directory/index. I don't think it's a good idea per se, but there is a gap there that is filled by syncing rather than using git 100%.

If you're force pushing by default you get exactly what you deserve. I'm not even sure how you're rebasing such that you would end up in this situation without knowing what was happening.

Never stop using git log --all --graph --decorate all day every day.

evensevenone
May 12, 2001
Glass is a solid.
It's not that it isn't totally preventable with a little bit of care; it's just annoying that it requires thinking about at all.

I use a rebase workflow instead of a merge workflow, so force pushes are needed. You'd think git could tell which pushes are pure rebases or rebases + fast-forwards and allow those, and only require --force when commits are actually being lost or changed, but I just don't think it's designed that way.

Anyway, the gist of it is that if you work on multiple machines you can't use --mirror=push for your remotes safely, and if you use certain workflows you have to use --force, and --force isn't safe either.

Edison was a dick
Apr 3, 2010

direct current :roboluv: only
A recent release of git added the --force-with-lease push option. It's like --force, but if your remote-tracking branch differs from the branch on the remote server (i.e. you haven't fetched your refs before pushing), the push will fail.

It's presumably not the default yet since it's still experimental. Strictly, they only say that --force-with-lease=$branch:$commit_id is stable behaviour, but --force-with-lease=$branch defaults to --force-with-lease=$branch:refs/remotes/$remote/$branch, and --force-with-lease without a branch defaults to all branches.

Also, don't ask me about the name; I have no idea why it's called that. It was called --lockref earlier in its development, and at the time I was more interested in its functionality for writing scripts that sync multiple repositories and roll back on unexpected changes than in a safer --force option.
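In the same two-machine scenario, --force-with-lease refuses the push instead of clobbering the other machine's commit. A self-contained sketch with hypothetical names (a local bare repo plays the shared remote):

```shell
set -e
top=$(mktemp -d)
git init --bare "$top/origin.git"
git -C "$top/origin.git" symbolic-ref HEAD refs/heads/master  # pin default branch

# Machine A publishes a base commit.
git clone -q "$top/origin.git" "$top/machineA"
cd "$top/machineA"
git config user.name "A"; git config user.email "a@example.com"
git commit -q --allow-empty -m "base"
git branch -m master
git push -q origin master

# Machine B clones now, so its remote-tracking ref points at "base".
git clone -q "$top/origin.git" "$top/machineB"

# Meanwhile machine A pushes another commit behind B's back.
git commit -q --allow-empty -m "A's fix"
git push -q origin master

# B diverges without fetching. Plain --force would clobber "A's fix";
# --force-with-lease sees that its origin/master is stale and rejects the push.
cd "$top/machineB"
git config user.name "B"; git config user.email "b@example.com"
git commit -q --allow-empty -m "B's rewritten history"
git push --force-with-lease origin master \
  && echo "pushed" || echo "rejected: fetch and re-check first"
```

Once machine B fetches, the lease matches again and the forced push goes through, but now the divergence happened with B's knowledge rather than silently.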

fartmanteau
Mar 15, 2007

I migrated the work repo and workflow to git/gerrit/cgit today, with tons of documentation and references.

Feels good man.

o.m. 94
Nov 23, 2009

evensevenone posted:

If you use multiple machines, git can get a little iffy. Say you're on machine A and you push some changes. Later, on machine B, you were working on that branch and forgot to fetch. You commit some changes and then decide to rebase. Then you push, but because you had rebased you had to do a git push --force (it wasn't a fast-forward). Now whatever you had on machine A is lost (or will be lost once you pull), and git won't have warned you because you used --force.

This really bites you if you set up your remotes with mirror=push, since then you don't even need --force to do non-fast-forward pushes.

Having a single working directory/repo that is synced through other means (e.g. Dropbox) prevents that problem, because git is only dealing with a single working directory/index. I don't think it's a good idea per se, but there is a gap there that is filled by syncing rather than using git 100%.

This isn't git's fault, imo - it will tell you that your remote branch has diverged and not allow the push. That's warning enough that something ain't right, surely?

Gazpacho
Jun 18, 2004

by Fluffdaddy
Slippery Tilde
I have to agree that falls clearly under the "then don't do that" heading. You should have pulled, but you forgot, so the warning is there to remind you to pull.

uh zip zoom
May 28, 2003

Sensitive Thugs Need Hugs

Hey, I want to get my feet wet in version control, but unfortunately the OP hasn't been edited since 2009. (Which I can't really criticize, because I haven't updated the post from my SQL thread in about that long.)

Can you recommend some basic software that would be good for a first timer, maybe a website or a book on the subject that I could read? Don't get me wrong, the concept of version control makes a great deal of sense to me, but when you start throwing around terms like "push" or "lease" or "branch," I start to get a little daunted and start thinking things like "well what if I just kept a copy of the website in a folder on my desktop and work from that?"

Anyway, where do I begin?

nielsm
Jun 1, 2009



http://git-scm.com/book

If you're using Mac or Windows I'd recommend also getting Atlassian SourceTree for a graphical interface.

o.m. 94
Nov 23, 2009

uh zip zoom posted:

Don't get me wrong, the concept of version control makes a great deal of sense to me, but when you start throwing around terms like "push" or "lease" or "branch," I start to get a little daunted and start thinking things like "well what if I just kept a copy of the website in a folder on my desktop and work from that?"

1. Why copy a folder to your desktop every single time you make a change? What if you want to look at an older copy? Can you remember if you added something yesterday? Which folder? It becomes unmanageable very quickly. Recording your changes (committing) is just one command and it handles all this and more for you.

2. With a system like git, you can type "git checkout -b <mybranchname>" and start hacking away on your code to try something new. Then, if your experiment is a failure you can type "git checkout master" to go back to where you were. In vulgar terms it's a savepoint system for programmers.

There are hundreds of reasons to use version control, but those are two of the benefits that I personally feel make it worthwhile.
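The "savepoint" idea from point 2, spelled out with a hypothetical branch name in a throwaway repo:

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.name "Demo"; git config user.email "demo@example.com"
git commit -q --allow-empty -m "stable state"
git branch -m master              # normalize the branch name for the demo

git checkout -q -b experiment     # the savepoint: branch off and hack away
git commit -q --allow-empty -m "risky refactor attempt"

git checkout -q master            # experiment failed? master is untouched
git branch -q -D experiment       # throw the failed attempt away
```

After the last two commands, master's history contains only "stable state", as if the experiment never happened.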

Jethro
Jun 1, 2000

I was raised on the dairy, Bitch!

oiseaux morts 1994 posted:

This isn't git's fault, imo - it will tell you that your remote branch has diverged and not allow the push. That's warning enough that something ain't right, surely?
Unless I'm misunderstanding how git works, isn't his point that a rebase will require --force regardless of whether he remembered to fetch at the start of the day? If he did a push without --force, would the error he'd get give enough information to distinguish between "you rebased that branch" and "you rebased that branch AND you forgot to fetch first"?

That being said, --force-with-lease would fix that problem. So would making sure that the repo on at least one machine has its own local branches (i.e. treating development on multiple machines the same way you would if it were multiple people).

And even if git makes it easy to screw that up, I'm still not convinced that using dropbox instead of github (or whatever) makes much sense. If you make changes on machine A, sync them to dropbox, and then can't get to dropbox from machine B for whatever reason, it seems like it would be a pain in the rear end to merge any changes you do on machine B once you can get to dropbox, whereas doing that with an external repo is what a DVCS is for.
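The "treat your machines like multiple people" approach above can be sketched end-to-end. Names and paths are hypothetical, and a local bare repo stands in for the hosted remote:

```shell
set -e
top=$(mktemp -d)
git init --bare "$top/origin.git"
git -C "$top/origin.git" symbolic-ref HEAD refs/heads/master  # pin default branch

# Laptop: base history goes on master, day-to-day work on a machine-local branch.
git clone -q "$top/origin.git" "$top/laptop"
cd "$top/laptop"
git config user.name "L"; git config user.email "l@example.com"
git commit -q --allow-empty -m "shared base"
git branch -m master
git push -q origin master
git checkout -q -b laptop-wip
git commit -q --allow-empty -m "wip: halfway through the refactor"
git push -q origin laptop-wip        # publishes the work; master is untouched

# Desktop: pick the work up and integrate it deliberately, no --force anywhere.
git clone -q "$top/origin.git" "$top/desktop"
cd "$top/desktop"
git merge -q origin/laptop-wip       # fast-forwards master onto the wip branch
```

Because each machine only ever pushes its own branch, no push is ever a non-fast-forward and the --force footgun never comes up.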

necrotic
Aug 2, 2005
I owe my brother big time for this!

Jethro posted:

And even if git makes it easy to screw that up, I'm still not convinced that using dropbox instead of github (or whatever) makes much sense. If you make changes on machine A, sync them to dropbox, and then can't get to dropbox from machine B for whatever reason, it seems like it would be a pain in the rear end to merge any changes you do on machine B once you can get to dropbox, whereas doing that with an external repo is what a DVCS is for.

Wasn't he talking about putting a repo in dropbox? A simple "private server" that only really works for one person.

B-Nasty
May 25, 2005

I've used Mercurial since 2009 for my version control due to the better tools on Windows (at the time), but I have to say, after being forced to use Git at a new job, I like it better. Git Extensions on Windows is as good as or better than the Hg workbench, and I prefer the pull --rebase workflow after getting used to it. Even with a couple of guys, all the branches in a standard Hg workflow can be tough to follow sometimes.

Point being, don't fear the Git world if you're a Windows (.Net/otherwise) dev. Git Extensions will make you fall in love, and stuff like rebase and stash are great.

edit: Another tip: change your merge tool to Perforce's merge tool and your diff tool to WinMerge for full greatness.

B-Nasty fucked around with this message at 03:26 on Jan 24, 2014

Woodsy Owl
Oct 27, 2004
I thought I'd just ask my question here since there's no build systems question thread. What are the common alternatives to Maven? What's the current flavor-of-the-month with regard to build systems? I've spent a day and a half trying to work my way through the Maven documentation but it is, to put it politely, inadequate. I'm approaching it as someone with no experience with build systems. Is there a build system that's more approachable? Or is there an alternative tutorial or guide for Maven rather than the one provided by Apache?

edit: I'm going to work through these Maven tutorials: http://books.sonatype.com/mvnex-book/reference/public-book.html , http://books.sonatype.com/mvnref-book/reference/public-book.html and hopefully they'll provide some more insight.

Woodsy Owl fucked around with this message at 16:22 on Jan 26, 2014

Doc Hawkins
Jun 15, 2010

Dashing? But I'm not even moving!


What language is your work in? Maven, Ant, and Gradle are the three that Java peeps use.

paberu
Jun 23, 2013

I'm trying to set up TortoiseSVN with Amazon S3 and I have run into some trouble. I've installed TntDrive to have my bucket listed as a network drive; however, I can't set up a working repository on there. As soon as you do a commit, Tortoise gives errors that the destination has been modified and it can't commit the change.

Is there a workaround for this? Or will I need to set up an EC2 instance to install Tortoise on? How do EC2 instances work price-wise? They're listed as per hour, but I only need the instance running when we need to get latest or do a commit. Can it be configured to turn on via an SVN command?

Or is there a simpler method to have Tortoise SVN hosted on a webhost and use Amazon S3 for storage space?

Thank you in advance!

SurgicalOntologist
Jun 17, 2004

I've been googling but I can't figure this out... is there a difference between the Ubuntu package "mercurial-git" and the PyPI package "hg-git"? The former requires the extension to be added as "hgext.git =" and the latter as "hggit =", so I think they might be different, but I can't figure out which I should use.

Jethro
Jun 1, 2000

I was raised on the dairy, Bitch!
You should use the canonical one from https://bitbucket.org/durin42/hg-git
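For reference, both wiring styles from the question go in your ~/.hgrc; which line you need depends on which package you installed. A sketch, assuming the canonical hg-git is on your Python path:

```ini
[extensions]
# Canonical hg-git package (the Bitbucket one linked above):
hggit =
# The Ubuntu "mercurial-git" package instead wires in as:
# hgext.git =
```

With the extension enabled, hg can clone and push Git repositories directly.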

Gazpacho
Jun 18, 2004

by Fluffdaddy
Slippery Tilde

paberu posted:

I'm trying to set up TortoiseSVN with Amazon S3 and I have run into some trouble. I've installed TntDrive to have my bucket listed as a network drive; however, I can't set up a working repository on there. As soon as you do a commit, Tortoise gives errors that the destination has been modified and it can't commit the change.

Is there a workaround for this? Or will I need to set up an EC2 instance to install Tortoise on? How do EC2 instances work price-wise? They're listed as per hour, but I only need the instance running when we need to get latest or do a commit. Can it be configured to turn on via an SVN command?

Or is there a simpler method to have Tortoise SVN hosted on a webhost and use Amazon S3 for storage space?
I don't know how TntDrive maps S3 objects into the filesystem. If it's just a simple "S3 object = file" mapping, that probably is not sophisticated enough to run a version control system on. I would suggest that you host a Subversion server in EC2. If you want the server to be available on some fixed schedule, you can put the repository on its own EBS disk and create an image that automatically mounts that disk and runs the Subversion server on it. Then you can create alarms to launch/terminate an instance using that image.

Gazpacho fucked around with this message at 00:38 on Jan 28, 2014

paberu
Jun 23, 2013

Thanks, I think I will go with having subversion on ec2.

What is the difference between using Linux commands to install SVN and something like Bitnami? I have never set up SVN before. We will be using Tortoise to do our commits.

Or I can sign up with Unfuddle - they offer unlimited? space for a reasonable price. Has anyone had experience using them?

paberu fucked around with this message at 14:33 on Jan 29, 2014

wwb
Aug 17, 2004

The easiest way I know of to do an SVN server is to run Windows and install http://www.visualsvn.com/server/ FWIW.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
Is there a reason you selected Subversion? I would probably pick git or Mercurial running on Bitbucket if you are starting fresh today. It's free for up to 5 people and really easy to get started.

wwb
Aug 17, 2004

^^^ that man speaks the truth.

Gazpacho
Jun 18, 2004

by Fluffdaddy
Slippery Tilde

paberu posted:

Thanks, I think I will go with having subversion on ec2.

What is the difference between using Linux commands to install SVN and something like Bitnami? I have never set up SVN before. We will be using Tortoise to do our commits.
You keep mentioning Tortoise, but your choice of client has no relevance to how you set up a Subversion server. Furthermore, a server can't start itself on demand. It's either up all the time, or it's under the control of some monitor service that is up all the time and probably isn't designed specifically for Subversion.

I haven't touched Bitnami. They apparently allow you to start and stop an app on a fixed schedule, and you can manage costs that way.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug
Visual Studio Online (formerly Team Foundation Service) isn't a bad choice, even if you just want to use it as a Git repo and ignore all of the other stuff it can do. Plus it's free.

paberu
Jun 23, 2013

gariig posted:

Is there a reason you selected Subversion? I would probably pick git or Mercurial running on Bitbucket if you are starting fresh today. It's free for up to 5 people and really easy to get started.

Sadly Mercurial doesn't like working with large files (anything over 10mb), so my choice is either Perforce or SVN for the assets.

minidracula
Dec 22, 2007

boo woo boo

paberu posted:

Sadly Mercurial doesn't like working with large files (anything over 10mb), so my choice is either Perforce or SVN for the assets.
Mercurial can work with large binary files if you use and enable the largefiles extension, but most hosted Mercurial services I'm aware of don't support it yet. To that end, please see also http://forums.somethingawful.com/showthread.php?threadid=3113983&perpage=40&pagenumber=47#post420349970
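Enabling largefiles is a one-line hgrc change plus flagging the big files when you add them (a sketch; the filename is hypothetical):

```ini
[extensions]
largefiles =
```

Then `hg add --large big-texture.psd` and commit as usual; Mercurial keeps a lightweight standin in history and stores the binary content separately, which is why the hosting service also has to support the extension.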

wwb
Aug 17, 2004

paberu posted:

Sadly Mercurial doesn't like working with large files (anything over 10mb), so my choice is either Perforce or SVN for the assets.

I've got a few dozen repos with loads of 10-50mb files in bitbucket here with no issues. We stop at about 50 because things tend to time out over HTTP, but we could go much larger if we went over SSH.

With SVN we did store loads of 100mb+ files in said VisualSVN setup. That was OK for those of us onsite on a 100-1000mb connection to the server, but for the remote guy, especially the guy living on the farm on a 1.5mb connection, it was pure hell. SVN working copies tend to get corrupted, and the "fix" is typically to blow them away and check the entire thing out again.

paberu
Jun 23, 2013

wwb posted:

With SVN we did store loads of 100mb+ files in said VisualSVN setup. That was OK for those of us onsite on a 100-1000mb connection to the server, but for the remote guy, especially the guy living on the farm on a 1.5mb connection, it was pure hell. SVN working copies tend to get corrupted, and the "fix" is typically to blow them away and check the entire thing out again.

This might be an issue for us as we are both remote; however, bitbucket doesn't recommend having a repository over 1GB. It doesn't seem like there is a neat solution for this anywhere. I do like the way bitbucket is set up otherwise. I was thinking of splitting the project between Mercurial and SVN anyway; it should work fine for a tiny team.

Most likely I will just go with unfuddle for the sake of minimizing time burned doing IT tasks.

paberu fucked around with this message at 15:18 on Jan 30, 2014

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...
There's also http://git-annex.branchable.com/ , depending on whether git would be a good fit for the rest of the files.
