<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Tue, Jul 25, 2017 at 3:57 PM, Ben Boeckel <span dir="ltr"><<a href="mailto:ben.boeckel@kitware.com" target="_blank">ben.boeckel@kitware.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><span class="">On Tue, Jul 25, 2017 at 15:38:40 -0400, Marcus D. Hanwell wrote:<br>
> Sounds good, we will probably use this approach for Tomviz, but likely on<br>
> GitHub with their 1GB limit on LFS data size. We haven't gotten to it yet,<br>
> so certainly interested in new findings for an already quite large data<br>
> repository.<br>
<br>
</span>Note that it sounds like there are also monthly bandwidth limits on<br>
Github LFS repos. Dan LaManna says there's a repo on Github he's working<br>
on that can't be cloned because its LFS quota has been reached. I don't<br>
know what that limit is though.<br></blockquote><div><br></div><div>The bandwidth limit is also 1GB per month, which is pretty low; at $5 I think you get a data pack that adds 50GB of storage and bandwidth. </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>
We also need to figure out LFS data mirroring between our repos and<br>
Github (both directions) at some point for the repos that need it.<br>
<span class="HOEnZb"><font color="#888888"><br></font></span></blockquote><div>Certainly, for those that want to do it, it seemed pretty easy in my local tests: pushing to our GitLab looked identical to pushing to GitHub.</div><div><br></div><div>Marcus </div></div></div></div>
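<div>For reference, the mirroring step discussed above can be sketched as a small shell script. This is a sketch only: the two local bare repositories stand in for the GitLab and GitHub remotes, the paths and names are made up, and the LFS pass is guarded because <code>git push --mirror</code> moves refs but not LFS objects (<code>git lfs fetch --all</code> / <code>git lfs push --all</code> handle those).</div>

```shell
#!/bin/sh
# Sketch of one-way repo mirroring. Two local bare repos stand in for
# the GitLab and GitHub remotes; all names and paths are hypothetical.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# "gitlab.git" plays the primary remote, "github.git" the mirror.
git init --bare gitlab.git
git init --bare github.git

git clone "$tmp/gitlab.git" work
cd work
git config user.email "dev@example.com"
git config user.name "Dev"
echo "payload" > data.bin
git add data.bin
git commit -m "add data"
git push origin HEAD

# Mirror every ref (branches, tags) to the second remote.
git remote add github "$tmp/github.git"
git push --mirror github

# Plain `git push --mirror` does not transfer LFS objects; if git-lfs
# is installed, mirror them in a separate pass.
if command -v git-lfs >/dev/null 2>&1; then
    git lfs fetch origin --all
    git lfs push github --all
fi

# The mirror now lists the same refs as the primary.
git ls-remote "$tmp/github.git"
```

<div>Running the same <code>push --mirror</code> plus <code>git lfs push --all</code> in the other direction would give the two-way mirroring mentioned above.</div>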