• 6 Posts
  • 74 Comments
Joined 1 year ago
Cake day: June 20th, 2023




  • Abandoning it would be the best approach. A ban would just make people want to use it more.

    When Twitter (now known as “X”) was first a thing, the only reason I joined was that private businesses, city services, and news agencies became a little easier to follow in one unified location. It also made it easier to reach them with quick tweets.

    Maybe the solution is to restrict businesses, news agencies, and government services from using it?






  • Performance is great IMO. I store all my Plex media on this setup as a network share and never have any issues or slowdowns. I only use the setup as a strict NAS, nothing else.

    I started with 9 drives at 12 TB each; about 3 years later (mid this year) I added the second vdev to my main pool: 9 drives at 20 TB each.

    Vdevs don’t have to be the same size as one another, but they do need to have the same number of drives in each.

    I have Unraid and Proxmox setups on other machines running independently. Plex and other software, for example, all access my TrueNAS over the network.

    For the TrueNAS system, IMO you don’t need much “horsepower”. I run it on a 12-year-old motherboard with 12 GB of RAM and a 60 GB SSD to boot from. Nothing special at all. Unraid and Proxmox, on the other hand, are where I spend the money on RAM and processing power.

    My network is gigabit and I get full speed on network transfers. I’m looking to go 10 Gb in the future, but that would require 10 Gb NICs in all my PCs and new network switches. I don’t see it affecting my TrueNAS setup. Besides, your network transfer is only as fast as the read/write speed of the drives (rough numbers sketched below).
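
    A back-of-the-envelope sketch of that last point, using assumed round numbers rather than measurements from my setup: whichever is slower, the link or the drives, sets the ceiling.

    ```python
    # Rough bottleneck math for "network transfer is only as fast as the drives".
    # The speeds below are assumed ballpark figures, not benchmarks.
    def bottleneck(link_gbit_s: float, drive_mbyte_s: float) -> float:
        """Return the effective transfer ceiling in MB/s."""
        link_mbyte_s = link_gbit_s * 1000 / 8  # Gb/s -> MB/s, ignoring protocol overhead
        return min(link_mbyte_s, drive_mbyte_s)

    # Gigabit link (~125 MB/s) vs. a drive reading ~200 MB/s: the network is the cap.
    print(bottleneck(1, 200))   # 125.0
    # 10 Gb link (~1250 MB/s) vs. the same drive: now the drives are the limit.
    print(bottleneck(10, 200))  # 200
    ```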









  • I agree: a website, a social media site, or a search engine having to pay to display a link and direct users to another site is extremely ill thought out on the Canadian government’s part.

    Tbh though, Google paying this “fee” may work out in its best interest.

    Now that Google pays for the content, it can just “summarize” or even scrape the whole article and display it on its own site directly, and inject its own ads.

    That way Google can pull all the ad revenue itself, and users won’t be directed to the original news site, where the media companies have their own ads or paywalls trying to make money off their content.


  • On my first attempt at rebuilding this, I took all the files appended with .1 and placed them into a BDMV folder with the correct folder structure and index files. I then did the same for all the .2 files into another BDMV folder, and so on. I then removed the appended numbers from these files.

    This left me with 12 disc folders, though I could not get MakeMKV to open any of them.

    What I think I may have been missing (which I will give a shot tomorrow) was copying the content that did not originally have an appended number into each of these folders, skipping any files with the same name; see the sketch below.
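
    Here is a rough sketch of that merge idea, assuming the disc number is the last extension (e.g. 00001.m2ts.2 belongs to disc 2) and that the upload sits in one flat folder; the paths and the disc count are placeholders, not the actual names from the upload.

    ```python
    # Sketch only: rebuild per-disc folders from a mixed upload where shared files
    # have no suffix and disc-specific files end in ".1", ".2", ... (assumed layout).
    import re
    import shutil
    from pathlib import Path

    SRC = Path("upload")     # placeholder: folder as downloaded
    DEST = Path("rebuilt")   # placeholder: per-disc output
    NUM_DISCS = 6            # per the .XML listing

    suffixed = re.compile(r"^(?P<name>.+)\.(?P<disc>\d+)$")

    # 1) Drop each ".N" file into disc N with the suffix stripped.
    for f in SRC.rglob("*"):
        if f.is_file() and (m := suffixed.match(f.name)):
            out = DEST / f"disc{m.group('disc')}" / f.relative_to(SRC).parent / m.group("name")
            out.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, out)

    # 2) Copy the shared (un-suffixed) files into every disc folder,
    #    skipping any name a disc-specific file already claimed.
    for f in SRC.rglob("*"):
        if not f.is_file() or suffixed.match(f.name):
            continue
        for disc in range(1, NUM_DISCS + 1):
            out = DEST / f"disc{disc}" / f.relative_to(SRC)
            if not out.exists():
                out.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, out)
    ```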


  • Feeding it into BDInfo, I can see the appended playlist files (appended with .1, .2, .3 and so on) call the same numbered stream files with no appended .1, .2, .3.

    If I had to guess, the uploader may have uploaded the content of all six discs and appended numbers to the content that differed between the 6, to save uploading the same file more than once?

    Where I’m getting stuck with this logic is why there are 12 index files in the upload when there should only be 6 discs, as listed in the .XML files.

    From what I can tell, MakeMKV can only handle reading one index file at a time.
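
    A quick way to sanity-check that count, assuming the same “.N” suffix convention and a placeholder path, is to group the index files by their suffix and see how many variants there really are:

    ```python
    # Count how many index.bdmv variants the upload contains (suffix convention assumed).
    import re
    from collections import defaultdict
    from pathlib import Path

    SRC = Path("upload")                  # placeholder path to the raw upload
    suffix_re = re.compile(r"\.(\d+)$")

    variants = defaultdict(list)
    for f in SRC.rglob("index.bdmv*"):    # index.bdmv, index.bdmv.1, index.bdmv.2, ...
        m = suffix_re.search(f.name)
        variants[int(m.group(1)) if m else 0].append(f)

    for n, files in sorted(variants.items()):
        label = "no suffix" if n == 0 else f".{n}"
        print(f"index.bdmv ({label}): {len(files)} file(s)")
    ```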

    Here’s a screenshot of the multiple index files.
