sublimator wrote:Having a full checkout of each repo might not be that useful in practice. How often do you really want to indiscriminately update each and every Package from the repository?
Actually, I want all the code on my machine. I don't necessarily want all the packages loaded, but that's not quite the same thing.
What I'm looking to do, really, is have all of the code on my machine, and then turn packages on and off. I don't know how yet, but that was the desire.
sublimator wrote:Would you not also get new packages on `svn update`, as well as updates to packages you have deleted?
Yeah, I'm looking to get all the updates available. I wouldn't want to delete packages -- rather, I'd turn them off, using the magic method that I haven't defined yet.
sublimator wrote:In your `Packages` you could have cuttings from a master checkout of each package you used (from n repositories) and then have one single update.py script that managed them. It would likely be gloriously slow, as there would be a lot of redundant network usage compared to just one update command per repo.
This is what I was doing before. It is slow. Oh, boy. It also means you have to go through quite a rigmarole to start watching a new package: changing your update script, changing your commit script, doing the initial checkout into the right location... That's why I wanted to move to a single operation.
I like the idea of looking for .svn directories -- I was just keeping a master list, but the .svn search makes things much neater.
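Something like this is what I have in mind for the single update: walk the Packages folder, treat anything with a .svn marker as a working copy, and run `svn update` in each. A rough sketch only -- the layout is my assumption, and `svn` needs to be on the PATH:

```python
# Rough sketch of the ".svn search" approach: any directory holding
# a .svn marker is treated as the root of a checkout, and we run
# `svn update` inside it. The Packages path/layout is an assumption.
import os
import subprocess

def find_working_copies(root):
    """Yield each directory under `root` that holds a .svn marker."""
    for dirpath, dirnames, _filenames in os.walk(root):
        if ".svn" in dirnames:
            yield dirpath
            dirnames[:] = []  # don't descend into the checkout itself

def update_all(root):
    for checkout in find_working_copies(root):
        print("updating", checkout)
        subprocess.call(["svn", "update"], cwd=checkout)

# usage (path is just an example):
# update_all(os.path.expanduser("~/Library/Application Support/Sublime Text 2/Packages"))
```

One update command per checkout, no master list to maintain -- adding a new package is just `svn checkout` into the folder.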
I actually thought last night that what I really want is multiple directories to serve as package folders, so I could set an option listing them and have Sublime monitor all of those folders. Hmmm...
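As a sketch of what such an option might look like -- purely hypothetical, `package_paths` is an invented name and Sublime doesn't actually support this today:

```json
{
    // Hypothetical setting; "package_paths" is an invented name.
    "package_paths":
    [
        "~/Packages",
        "~/checkouts/repo-a/Packages",
        "~/checkouts/repo-b/Packages"
    ]
}
```

Each entry would be scanned for packages, so a full repo checkout could live anywhere on disk.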
I wonder if you could use an __init__.py file to turn off packages? Not sure what you'd need to do -- my python-fu is weak!
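I can't speak for how Sublime's loader treats packages, but in plain Python an `__init__.py` can refuse to import, which would be one way to "turn off" a package. A sketch of the idea -- the `DISABLED_PACKAGES` environment variable is an invented convention of mine, not anything Sublime provides:

```python
# Sketch: logic an __init__.py could use to refuse loading.
# DISABLED_PACKAGES (a comma-separated env var) is an invented
# convention, not anything Sublime provides.
import os

def is_disabled(package_name):
    """Return True if `package_name` appears in $DISABLED_PACKAGES."""
    disabled = os.environ.get("DISABLED_PACKAGES", "")
    return package_name in [p.strip() for p in disabled.split(",") if p.strip()]

# A package's __init__.py might then start with:
#
#     if is_disabled(__name__.split(".")[0]):
#         raise ImportError(__name__ + " is switched off")
```

Whether Sublime would respect the failed import, I don't know -- my python-fu is weak too.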