
Group items tagged #git

Daniel Jomphe

Setting up your Git repositories for open source projects at GitHub « Insoshi... - 0 views

  • In setting up the repositories for Insoshi, I’ve applied the version control experience I gained at Discover, where I was technical lead for the software configuration management (SCM) team.
  • Except for that interaction, everyone works within their own repository and on their own schedule. There’s no process waiting to be completed that blocks you from moving on to whatever you need/want to do next. And you’re not forcing anyone to drop what they’re doing right now to handle your request.
  • One of the major benefits of a distributed version control system like Git is that each repository is on an equal footing; in particular, we would like every fork to have the same master branch, so that if the “official” Insoshi repository should ever be lost there would be plenty of redundant backups.
  • it’s a bad idea in general to work on the master branch; experienced Git users typically work on separate development branches and then merge those branches into master when they’re done
  • Your local repository: the “right” way. Keeping the big picture in mind, here are the commands I’ve run to set up my local repository (using the GitHub id long):

      $ git clone git://github.com/insoshi/insoshi.git
      $ cd insoshi
      $ git branch --track edge origin/edge
      $ git branch long edge
      $ git checkout long
      $ git remote add long git@github.com:long/insoshi.git
      $ git fetch long
      $ git push long long:refs/heads/long
      $ git config branch.long.remote long
      $ git config branch.long.merge refs/heads/long
  • You should note that the Git URL for the clone references the official Insoshi repository and not the URL of my own fork
  • Insoshi also has an ‘edge’ branch for changes that we want to make public but may require a bit more polishing before we’d consider them production-ready (in the past this has included migrating to Rails 2.1 and Sphinx/Ultrasphinx).  Our typical development lifecycle looks something like development -> edge -> master
  • I’m resisting the temptation to immediately start working on the local ‘master’ and ‘edge’ branches. I want to keep those in sync with the official Insoshi repository. I’ll keep my changes separate by creating a new branch ‘long’ that’s based off edge and checking it out
  • I’m starting my changes off of ‘edge’ since that contains all the latest updates and any contribution I submit a pull request for will be merged first into the official Insoshi ‘edge’ branch to allow for public testing before it’s merged into the ‘master’.
  • I’m finally adding the remote reference to my fork on GitHub
  • We should run a fetch immediately in order to sync up the local repository with the fork
  • I’m pushing my new local branch up to my fork. Since it’ll be a new branch on the remote end, I need to fully specify the remote refspec
  • Now that the new branch is up on my fork, I want to set the branch configuration to track it
  • Setting the remote lets me simply use $ git push to push changes on my development branch up to my fork
  • I’ve got a shell script for you (a rough sketch of what such a script might look like follows this list).
  • The extra work is worth the effort, because with this configuration:
    - My changes will be easily identifiable in my named branch
    - I can easily get updates from the main Insoshi repository
    - Any updates I’ve pulled into master and edge are automatically pushed up to my fork on GitHub
    The last one is a bonus because the default refspec for remotes is refs/heads/*:refs/heads/*. This means that the simple ‘git push’ command will push up changes for all local branches that have a matching branch on the remote. And if I make it a point to pull in updates to my local master and edge but not work directly on them, my fork will match up with the official repository. (A sketch of this update flow also follows this list.)
  • So what is the benefit of all this to open source projects like Insoshi?
    - The easier it is for the contributor to pull in updates, the more likely it is that the pull request will be for code that merges easily with the latest releases (with few conflicts)
    - You can tell if someone is pulling updates by looking at their master and edge branches and seeing if they match up with the latest branches on the main repository
    - By getting contributors in the habit of working on branches, you’re going to get better organized code contributions
    Basically, the less effort that’s required to bring in code via a pull request, the sooner it can be added to the project release. And at the end of the day, that’s really what it’s all about.
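The shell script itself isn’t reproduced in these highlights. As a rough, hypothetical sketch of what a wrapper over the commands quoted above could look like (the GitHub id is the only parameter; anything not shown in the quoted setup is an assumption, not the author’s actual script):

    #!/bin/sh
    # Hypothetical helper based on the setup commands quoted above;
    # not the author's actual script. Takes the GitHub id as its only argument.
    set -e
    GITHUB_ID=$1
    [ -n "$GITHUB_ID" ] || { echo "usage: $0 <github-id>" >&2; exit 1; }

    git clone git://github.com/insoshi/insoshi.git
    cd insoshi
    git branch --track edge origin/edge
    git branch "$GITHUB_ID" edge
    git checkout "$GITHUB_ID"
    git remote add "$GITHUB_ID" "git@github.com:$GITHUB_ID/insoshi.git"
    git fetch "$GITHUB_ID"
    git push "$GITHUB_ID" "$GITHUB_ID:refs/heads/$GITHUB_ID"
    git config "branch.$GITHUB_ID.remote" "$GITHUB_ID"
    git config "branch.$GITHUB_ID.merge" "refs/heads/$GITHUB_ID"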
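To make the update flow referenced above concrete, a minimal day-to-day sequence might look like the following, assuming the remotes and branches from the quoted setup (origin is the official Insoshi repository, long is both the fork and the development branch). Merging edge into the development branch is an assumption about how one would pick up new work, not something the post prescribes:

    # Hypothetical update round-trip, assuming the configuration described above
    $ git checkout master
    $ git pull origin master      # sync local master with the official repository
    $ git checkout edge
    $ git pull origin edge        # sync local edge the same way
    $ git checkout long
    $ git merge edge              # pick up the latest public changes on the development branch
    $ git push                    # branch.long.* config plus the matching refspec push
                                  # long, master and edge up to the fork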
Daniel Jomphe

[msysGit] Re: CRLF problems with Git on Win32 - 0 views

  • I'd say "could be helped". For the msysgit development, for example, we do _not_ want to have core.autocrlf = true but prefer to preserve the Unix line ending even when working on Windows. We have only a few Windows-specific files that are committed with CRLF. We _know_ the problem and we explicitly handle it. I believe the best would be if a line ending policy could be configured for a project. Then, the decision could be made once for the project and should be enforced on all clones. But currently git has no concept for this.
  • A sound policy for "real cross-platform" is that CRLF must never enter the repository unless git detects a file as binary, or a file is explicitly listed in .gitattributes. It doesn't really matter if Windows users check out files with CRLF or LF. It only matters that they'll never commit a file with CRLF. Note, the same is true for Unix users. People could send code by email or copy source files from Windows to Unix machines. Then, CRLF would enter the repo on Unix. So the least that should be set for this type of project on any OS is core.autocrlf = input. On Windows, core.autocrlf = true is probably more natural. I like Linus' idea of "warn" or Gregory's "fail". Would "warn/fail" be the default on Unix, too? Then Unix users would also be forced to make an explicit choice. Maybe someday they'll want to check out their project on Windows, and they should be prepared now. For typical files, the warning (or error) would never trigger. But maybe one day they copy a file from a Windows machine and forget to run dos2unix. In this case, git would warn them unless they set "core.autocrlf = false".
  • I'm asking the last question because every Unix developer should think about the option, too. Neither Unix nor Windows causes the problem alone; it's the combination in a cross-platform project. Git could ensure that any repository is in principle prepared for cross-platform work, unless explicitly told not to do so. So, would you, as Linux developers, like to have (or accept) "warn/fail" as your default? This would make things easy for the msysgit project: no Windows-specific configuration, just official git.
  • But core.autocrlf = true has a slight danger of data corruption. AFAIK, git's binary detection checks the first "few" bytes (with few = 8000). This may be sufficient in most cases, but I have already encountered a file that was wrongly classified. (A file format that starts with a large ASCII header and has chunks of binary data attached later.)
  • I believe the main question is which type of project we would like to support by our default. For real cross-platform projects that will be checked out on Windows and Unix, we should choose core.autocrlf = true as our default. But if our default is native Windows projects that will never be checked out on Unix, then we should not set core.autocrlf by default.
  • If the primary target is native Windows projects that want CRLF in the work tree, you could still set core.autocrlf. Your checkouts will be with CRLF. And someday perhaps somebody may offer to port that to UNIX, and their checkout will be without CR. So wouldn't the categorization be more like this?
    - "real cross-platform" would want core.autocrlf = true
    - "native Windows" can work either way
    (A minimal sketch of these settings follows this list.)
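For reference, the settings discussed in this thread translate into commands and attributes along these lines. This is a minimal sketch; the .gitattributes entries are hypothetical examples, written with the attribute names current Git documents rather than anything from the thread itself:

    # On Unix, or as the floor for any cross-platform project:
    # convert CRLF to LF on commit, leave the work tree untouched
    $ git config core.autocrlf input

    # On Windows, where CRLF in the work tree is more natural:
    # check out with CRLF, convert back to LF on commit
    $ git config core.autocrlf true

    # Hypothetical .gitattributes entries for deliberate exceptions:
    # a Windows-specific script kept with CRLF, and a binary file that
    # must never go through line-ending conversion
    run-tests.bat  text eol=crlf
    logo.png       binary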
Daniel Jomphe

'Re: clarification on git, central repositories and commit access lists' - MARC - 0 views

  • Another option is to look at git-svnserver, which would allow a git repository backbone but could talk svn over the wire, which these tools could use...
Daniel Jomphe

'Re: clarification on git, central repositories and commit access lists' - MARC - 0 views

  • Btw, to see this in another light: as an example of a git tree that merges those same branches, but *before* they are ready, just look at the -mm tree. Now, Andrew actually ends up exposing the end result not as a git tree but as patches, but what he actually *does* is to:
    - get my git tree
    - merge in about 30-40 other git trees from other developers (not all of which necessarily have actual development on them at any particular time)
    - then merge in his own patch list
    - expose it all as the -mm patch series
    So this is an example of how you actually have a totally separate, and still fairly central (the -mm tree is certainly not unknown outside of the core developer circles) tree, and where git is a big part in making a central "experimental" tree that is separate from my own central "development" tree. Also, it's an example of why centralization is bad: different people and entities have different intents. You could *not* reasonably do something like this with a centralized SCM like SVN.
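As a rough illustration of the flow described in that annotation (not Andrew Morton's actual tooling), assembling such an "experimental" tree and exposing it as patches might look like this; the remote name, placeholder URL, and patch directory are hypothetical:

    # Hypothetical sketch of an -mm-style assembly; names and paths are made up
    $ git clone git://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux-2.6.git mm
    $ cd mm
    $ git remote add subsys-net <url-of-that-tree>     # repeat for each of the ~30-40 trees
    $ git fetch subsys-net
    $ git merge subsys-net/master                      # merge that developer's work in
    $ git am ../mm-patches/*.patch                     # then merge in his own patch list
    $ git format-patch origin/master -o ../mm-series   # expose it all as a patch series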