|
Wow, Google Code has had Mercurial support since the end of April: http://googlecode.blogspot.com/2009/04/mercurial-support-for-project-hosting.html I know I'm late to the party, but it caught me by surprise, and I'm probably not the only one who didn't know. I wish they supported Git, but it's cool to see them supporting any DVCS.
|
# ¿ Sep 13, 2009 07:24 |
|
pseudorandom name posted:They ripped out Mercurial's native storage implementation and made it a wrapper around BigTable, which isn't something you can easily do with git. I had read through their analysis (http://code.google.com/p/support/wiki/DVCSAnalysis) and I totally understood why they went with Mercurial over Git. I'm not saying they *should* have gone with Git, just that I wish they had, since I like both Git and Google Code. Strangely enough, their analysis didn't seem to treat porting the file store to BigTable as any harder with one system than the other.
|
# ¿ Sep 15, 2009 03:22 |
|
Ferg posted:Despite its tendency to get bogged down from time to time, I actually prefer GitHub over Google Code. It has more of the atmosphere I'd like from a public code repository than Google Code does. Yeah, GitHub is pretty awesome. I just wish they would stop using so much Flash on their site (actually, I wish they would stop using Flash completely). For me, the big thing Google Code has going for it is that it's Google, and it's safe to say they're not going anywhere. I'm not so sure about GitHub yet, but who knows; SourceForge is still around and I have no idea what keeps them afloat.
|
# ¿ Sep 15, 2009 06:10 |
|
CHRISTS FOR SALE posted:Does anyone else here use git for large files? It does take a while for the repos to push to the backup server, but I've been using it to keep my Logic Pro tracks version-controlled. It's also AWESOME for collaboration, because we can branch the project and I can work off my own branch while my collaborator(s) work off theirs simultaneously. I chose git for its ability to make branches and merge them easily, because working collaboratively I have a feeling that functionality will be crucial to our workflow.

Where I work we have a beast of a repo, full of bad decisions, that we recently converted to Git. The bare repo is just under 5 GB, many files in it are over 100 MB, and Git still stays quick (previously it was in SVN, I was interacting with it via git-svn, and it was still fast). As long as you're using a good transport mechanism (HTTP only got "good" in recent versions of Git), transfers will be pretty fast.

edit: I should add that we use it mostly for code, and our big files are 3D models and various libs we need to compile our code. Our binary files don't change very often. It looks like you'll be working with binary files exclusively. Git will try to be smart about delta-compressing changes, but I wouldn't necessarily expect the same performance as people using it mainly for code. I'm pretty sure it'll work fine, though.

samiamwork fucked around with this message at 06:03 on Feb 25, 2011 |
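One knob worth knowing about for a repo like that: Git lets you tell it not to bother delta-compressing files that won't delta well. A minimal sketch in a scratch repo (the extensions and paths here are made up for illustration, not from anyone's actual setup):

```shell
# Scratch repo; mark big, rarely-changing binaries with "-delta" so Git
# skips delta compression on them during packing (paths are hypothetical).
git init -q scratch
cd scratch
printf '%s\n' '*.bin -delta' 'models/** -delta' > .gitattributes
# Confirm the attribute applies ("unset" means delta search is disabled):
git check-attr delta models/car.bin
```

With the attribute in place, `git gc`/`git repack` store those blobs whole instead of searching for deltas, which can make repacking a multi-GB repo noticeably faster, at some cost in pack size.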
# ¿ Feb 25, 2011 05:52 |
|
Dooey posted:I accidentally entered "git config --global core.autocrlf = false" You probably now have a wonky line in your global config file. Whenever I do stuff like that I just open the config file and edit the bad line by hand: code:
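If you'd rather not hand-edit the file, something like this should also clean it up; a sketch, assuming the botched command left a bogus value under `core.autocrlf`:

```shell
# Remove whatever value(s) the botched command wrote ("|| true" in case
# the key is already gone), then set it again -- note: no "=" sign.
git config --global --unset-all core.autocrlf || true
git config --global core.autocrlf false
git config --global core.autocrlf    # should now print: false
```

`git config --global --edit` is the hand-editing route in one step: it opens your global config file directly in your editor.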
|
# ¿ Jun 1, 2011 06:04 |