how do i safely turn my spare trashcan PC into an always-on git receptacle

Discussion in 'Tech Heads' started by Agrul, Aug 3, 2017.

  1. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
    + most of this stuff isnt binary files u scrub

    its img + model files

    & one day i will enter the matrix and b able 2 read them like the open book that all ur pitiable goopy brains alrdy look like 2 me
     
  2. Utumno

    Utumno Administrator Staff Member

    Post Count:
    35,392
    img files basically same deal, not diffable, not text, not good for git. bad. really bad.

    i'm mostly just yanking your chain though for real. i'm not a release engineer, and tbh this very same question came up at work when one of our devs was asking me how he could handle versioning for large media files and i basically said "welp"

    if you find a good answer i'd love to know it, i feel like i should know this by now if only through osmosis but i must have tuned out any useful knowledge while interacting w/release managers/engineers through the years.

    errr, i know like... artifactory and maven are things. are they useful things for stuff like this? (basically just pulling stuff out of my ass at this point)
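    (for the record, the stock answer for big binaries in git these days is git-lfs: the repo itself only stores tiny pointer files, and the actual bytes live on a separate store. a minimal sketch, assuming git-lfs is installed and the host supports it:)

```shell
# sketch; assumes git-lfs is installed and the remote host supports LFS
git lfs install                  # one-time per machine: wires up git's clean/smudge filters
git lfs track "*.png" "*.blend"  # records the patterns in .gitattributes
git add .gitattributes
git commit -m "track media via LFS"
# files matching those patterns committed after this point are stored in
# git as small pointers; the real bytes get uploaded to the LFS endpoint on push
```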
     
  3. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
    i dont know anything about artifactory or maven so no idea but i will look into it. bear in mind i only recently learned how to use git. i have an ok understanding of it from playing w/ it + reading about 66% of the 'pro git' book but im still not super savvy

    anyway i honestly dont see any flaws w/ my plan to turn my 2ndary pc bucket into a git repo that has enough room for large-ish media files, and i think it will be a good learning experience 2 figure out how to set up my own home server. i would see problems w/ it if the files were individually so big that pulls/pushes would be time-consuming, but that is unlikely to be the case. certainly it hasn't been so far, and if individual pull/pushes ever become time-consuming i will probably take that as a sign that i've fucked up somewhere along the line, as generally my pushes/pulls shouldnt involve me pushing every file in the project at once, which is the only situation in which the downloads/uploads should get clogged due to size

    the only issue i have is that in a year or five i will use up the 1-10GB that bitbucket makes available for a repo, and a homegrown server fixes that problem. it will be somewhat less robust than bitbucket since im not a company w/ a million servers living in a big industrial server room, but with git that doesnt really matter. when my trashcan server inevitably dies i will still be golden bc git maintains full copies of the repo everywhere, so ill have full backups on my 1-2 laptops & my nice desktop which i will be able to use to repopulate a new server once i buy new equipment for a new one

    in short i do not understand the attitude that git is only good for code. i think that is a very poorly informed opinion
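    the whole plan is really just a bare repo on the spare box plus full clones everywhere else. a rough sketch ("agrul", "trashcan", "newbox" and the paths are made-up names):

```shell
# sketch; user/hostnames/paths are invented, assumes openssh on the server
# on the server box, once:
git init --bare ~/repos/project.git

# on each laptop/desktop, point a remote at it over ssh:
git remote add trashcan agrul@trashcan:repos/project.git
git push trashcan HEAD

# if the server dies, any full clone can seed its replacement
# (after a fresh `git init --bare` on the new box):
git push --mirror agrul@newbox:repos/project.git
```

the `--mirror` push is what makes the "every clone is a full backup" argument concrete: it sends every branch and tag, so the new server ends up byte-for-byte equivalent to the old one's history.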
     
  4. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
    i mean a less fancy alternative for large-media/3dmodel-file "VC" would be for me to just maintain an external HDD and constantly plug it in/take it out to keep a backup whenever i do shit

    but that's so much less convenient than just typing git push/git pull whenever i finish/start working on a project u know
     
  5. Utumno

    Utumno Administrator Staff Member

    Post Count:
    35,392
    i'm curious 2 see how this works out for you and if your trashcan pc holds up.

    if it seems too flaky and you decide you'd like something more reliable/manageable you could also consider dropping some coin on a synology NAS, they have all sorts of plugins and shit and i bet they already have a git one (or at least some method of installing git).

    i used to build my own home NAS but upon growing older + having less time, I've grown 2 love my synology NAS which i use for storage and some streaming. been running for years trouble-free.
     
  6. Solayce

    Solayce Would you like some making **** BERSERKER!!! Staff Member

    Post Count:
    21,599
    well it doesn't automatically maintain those same files everywhere. You have to remember to update your repository and sync the updates to the local machine.
     
  7. Solayce

    Solayce Would you like some making **** BERSERKER!!! Staff Member

    Post Count:
    21,599
    What does this mean? In the Linux world there is very little difference between clients and servers. In Windows it's more about the number of connections to the server you can make.

    Basically, a server is a centralized machine running a/your application, such that multiple users connect to it, so you only have to manage the services at a single point versus many.
     
  8. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
    im moving slowly on this because bitbucket will last me a while. 1 GB isnt gonna disappear overnight so my incentives 2 move rapidly are limited

    but from what little ive bothered to try to figure out so far, solayce, i believe the thing i dont know how to do/have never done b4 is to set up a daemonized process on my 'server' box that will always-on listen for incoming connection requests that can in turn be redirected to a git bash prompt. that & making sure the necessary ports r open, and also doing that stuff w/o somehow exposing myself to unwanted outside traffic. i know essentially nothing about networking so i'm sort've flying by the seat of my pants even trying to articulate what it is i dont know, but i think that's it

    the only things ive done so far r:

    A) get mad at my intended-as-server box bc i wanted to use its ubuntu 2ndary boot to run the server but ubuntu was freezing whenever i tried to start connecting to wifi, so i overwrote both my windows & ubuntu installs w/ a fresh ubuntu install. gotta fix drivers and config shit for it to wifi properly still but it at least isnt freezing like b4

    B) install openSSH server 4 windows on my nice desktop to practice a bit & i tried connecting from this pc to this pc w/ this pc's ip address. i got a connect refused msg & have yet to try to figure out why (closed port? did i confuse router ip address & pc ip address? other things???1?)

    that is my progress thus far good sires
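    (that always-on listener is what init systems call a service/daemon; on ubuntu that means a systemd unit. a hedged sketch for git's own read-only daemon; the paths and user are assumptions, and note `git daemon` serves read-only by default, so pushing is usually done over ssh instead:)

```ini
# /etc/systemd/system/git-daemon.service -- hypothetical unit; adjust paths/user
[Unit]
Description=read-only git daemon serving /srv/git
After=network.target

[Service]
ExecStart=/usr/bin/git daemon --reuseaddr --base-path=/srv/git --export-all
User=nobody
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

    then `sudo systemctl enable --now git-daemon` starts it and keeps it listening on port 9418 across reboots. for pushes, openssh-server already gives you the listener on port 22, and `git push user@host:path/to/repo.git` rides on top of it. re: the connection-refused test on windows: first confirm the sshd service is actually started, then make sure you used the pc's own LAN address (the one `ipconfig` reports, e.g. 192.168.x.x), not the router's address and not your public ip from inside the LAN.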
     
  9. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
    i have completely changed my mind on this

    i think

    i looked at amazon's s3 bucket pricing, and at no more than 3 cents a GB per month for any amount of storage i will ever use, i may as well just use amazon s3 for my always-on, high-storage-amount git server

    this idea has obviously occurred to other ppl and seems to have been received well: https://fancybeans.com/2012/08/24/how-to-use-s3-as-a-private-git-repository/
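    a lower-tech alternative to the jgit client in that link, if you just want durable offsite copies: keep a bare mirror locally and sync it to the bucket with the plain aws cli (bucket name and paths are made-up):

```shell
# sketch; "my-git-bucket" and the paths are invented, assumes the aws cli is configured
git clone --mirror ~/projects/game ~/mirrors/game.git
aws s3 sync ~/mirrors/game.git s3://my-git-bucket/game.git

# after more commits land, refresh the mirror and re-sync:
git -C ~/mirrors/game.git remote update
aws s3 sync --delete ~/mirrors/game.git s3://my-git-bucket/game.git
```

    note this is backup rather than a live remote: to restore, sync the bucket back down to a directory and clone from it.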
     
  10. Utumno

    Utumno Administrator Staff Member

    Post Count:
    35,392
    ya that definitely seems like a lot less work

    the only down side would be transfer times if you're regularly having to pull/push gigs of data down. that could be p slow depending on ur intertubes bandwidth
     
  11. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
    yeah but i dont think it should generally be gigs pushed at a time

    a single commit even on the art side of the project shouldnt generally consist of altering more than a few .blend files and a handful of png's etc for texturing

    my push/pull sizes will probably be in the MBs rather than the KBs u normally see with code files but i think that's still plenty small that even weaksauce american intertubenets will be ok w/ it
     
  12. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
    wow that was really painless to set up, couldnt have taken more than 25 mins to puzzle my way from no s3 bucket to s3 bucket w/ git repo in it and first successful push / pull / clone tests
     
  13. Solayce

    Solayce Would you like some making **** BERSERKER!!! Staff Member

    Post Count:
    21,599
    Looks like you've gone a different direction, but in case you get the itch again, here's what you need to do. Without research, I can only explain how I would approach your goal in generic terms, in linux.

    1) As far as daemonizing processes, that's not something you typically do. They come like that. It's called services. They usually have a config file of some kind, to play with non-standard/basic/default settings, then you either start or restart the service. Installing the package (I think) does the "daemonizing" for you.

    2) As far as what ports are open, firewalls control that. There's a command in Windows to show stuff currently running and which ports they're listening on: netstat. See if there's an equivalent; it might be the same thing as this might be a low level TCP/IP command. The firewall I am aware of in Linux, is called iptables. To confirm open ports, you can also use the telnet command from another machine, to see if the port responds: telnet <ip> <port>

    3) Coming from Windows, I've rarely setup an SSH server, but I think SSH runs on port 22 by default.

    4) If you are used to seeing a GUI git repo, then you are using your browser. Your browser opens port 80 by default for http connections, and 443 for https connections. That said, git itself runs on other ports (22 when tunneled over ssh, 9418 for the bare git protocol), so you will need to make sure iptables allows 80, 443, and whichever of those you use. Also, since you probably haven't set up DNS on your internal network, you will need to statically assign, and connect with, IP addresses.
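    roughly, those checks on a current ubuntu look like this (the address is an example; `ss` is the modern replacement for `netstat`, and `nc` does the telnet-style port probe):

```shell
# sketch of the checks above; 192.168.1.50 is an example LAN address
ss -tlnp                 # list listening TCP sockets and the processes that own them
sudo iptables -L -n -v   # dump the current firewall rules
nc -zv 192.168.1.50 22   # from another machine: does port 22 answer at all?
```

    on ubuntu the friendlier front-end to iptables is ufw, e.g. `sudo ufw allow 22` to open ssh.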
     
  14. Utumno

    Utumno Administrator Staff Member

    Post Count:
    35,392
    btw, if your usage is under 50 gigs, I think you could have just used AWS Codecommit since you already went the S3 route, and you wouldn't even need that java client either. I think Codecommit is free for the first 5 users up to 50 gigs.
     
  15. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
    utumno:

    ive never heard of AWS Codecommit, that's neat. my usage will likely be under 50 gigs for the foreseeable future, but that mostly depends on how zany my 3d modeling gets & how much relative time i spend on the blender repo vs the C# code repo. im not too worried about it, anyway; at 2.5 cents per GB s3 is fine. plus i need to be comfortable with s3 buckets for work, so this is a 2 birds one stone sorta thing. i cant practice emr like this, but i imagine at least some of s3 w/ ec2 instances should translate (i will eventually toy around w/ some of the smaller EC2 GPU instances for fun)

    ssssssssolayce:

    thanks pal. i do still want to set up my own git server eventually, if only for the sake of learning how to do it. i shall bookmark ur words of wisdom

    i have one small CRITICISM tho. i think services is primarily a windows term for daemonized processes, the reading material i was reading said that i think. so suck a duck yo im the networking best

    i bought a cheapo used textbook on networking fundamentals so i can be more comfortable w/ the basic ideas in a year's time or so and know what TCP/IP mean beyond just expanding their acronyms into words
     
    Last edited: Aug 22, 2017 at 2:56 AM
  16. Solayce

    Solayce Would you like some making **** BERSERKER!!! Staff Member

    Post Count:
    21,599
    :giggle:
     
  17. Solayce

    Solayce Would you like some making **** BERSERKER!!! Staff Member

    Post Count:
    21,599
  18. Agrul

    Agrul TZT Neckbeard Lord

    Post Count:
    42,859
  19. Solayce

    Solayce Would you like some making **** BERSERKER!!! Staff Member

    Post Count:
    21,599
    true, but your almighty Ubuntu command is called ...
    ...
    ...
    ...
    wait
    for
    it
    ...
    ...
    ...
    what was it again...?
    oh
    yeah
    ...
    ...
    ...
    SERVICES ::lookatme::
     
  20. Solayce

    Solayce Would you like some making **** BERSERKER!!! Staff Member

    Post Count:
    21,599