Script that organizes the Google Takeout archive into one big chronological folder

Overview


Google Photos Takeout Helper

What is this for?

If you ever want to move from Google Photos to another platform or solution, your fastest way to export all of your photos is Google Takeout

But when you download it, you will find yourself with hundreds of little folders, each holding a few photos and weird .json files. What if you just want one folder with all your photos, in chronological order? Good luck copying all of that 😕

This script does just that - it organizes and cleans up your Takeout for you 👍

It will take all of your photos out of those tiny folders, set their EXIF data, last-modified times, and other properties correctly, and put them in one big folder (or in folders divided by month)
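The date-fixing part can be sketched in a few lines. This is only an illustration of the idea, not the script's actual code; it assumes the capture date has already been extracted (from EXIF or the sidecar .json):

```python
import os
import datetime

def set_file_date(path: str, taken: datetime.datetime) -> None:
    """Set a file's access/modification times to the photo's capture time."""
    ts = taken.timestamp()
    os.utime(path, (ts, ts))

# EXIF stores dates like this (DateTimeOriginal format):
taken = datetime.datetime.strptime("2012:06:23 14:03:05", "%Y:%m:%d %H:%M:%S")
```

Writing the EXIF fields themselves needs a third-party library (the script uses piexif); the modification-time part above is pure stdlib.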

How to use:

  1. Get all your photos in Google Takeout (select only Google Photos)
  2. pip install -U google-photos-takeout-helper
  3. Extract all contents from your Google Takeout to one folder
  4. Run google-photos-takeout-helper -i [INPUT TAKEOUT FOLDER] -o [OUTPUT FOLDER]

If you want your photos divided by year and month, run it with the --divide-to-dates flag.
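With that flag, each photo ends up in a per-month subfolder. Here is a sketch of how such paths can be built - the exact folder naming is my assumption, not necessarily what the script produces:

```python
import datetime
import pathlib

def dated_subfolder(output: pathlib.Path, taken: datetime.datetime) -> pathlib.Path:
    # One subfolder per year and month, e.g. <output>/2019/07
    return output / f"{taken.year:04d}" / f"{taken.month:02d}"

folder = dated_subfolder(pathlib.Path("out"), datetime.datetime(2019, 7, 24))
```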

How to use for dummies (non-programming people):

This script is written in Python - but if you have Windows, and don't want to bother installing it, you can download a standalone .exe 🎉

  1. Go to releases->latest release->assets and download takeout-helper.exe

  2. Prepare your Takeout:

If your Takeout was split into multiple .zip files, you will need to extract them all and move their contents into one folder

  3. Open cmd, and type:
cd C:\Folder\Where\You\Downloaded\takeout-helper
takeout-helper.exe -i [C:\INPUT\TAKEOUT\FOLDER] -o [C:\OUTPUT\FOLDER]

Note: don't type the "[ ]" brackets in the command above.

Contact/errors

If you have issues or questions, you can hit me up on Reddit or Twitter, by email ([email protected]), or, if you think your issue is common, in the Issues tab

If I helped you, you can consider donating: https://www.paypal.me/TheLastGimbus

I spent a lot of time fixing bugs and building the standalone .exe file for Windows users 💖 - I would be super thankful for any donations

You can also send me some Bitcoin: 3GezcSsZ6TWw1ug9Q8rK44y9goWa3vTmbk, DOGE: DTKFGSzPCDxZPQQtCTyUHbpRYy6n8fSpco, or Monero: 43HorPVy1PTGVph3Qh3b6vVSfW2p3fH4ChjtiLVdLZw4Kw1vZUgCCcZSmfGCeEFq6bdeuF7zMutAcAcuuYFf8vEr6htBWTk

But beware if you move your photos on your Android phone...

Beware that (99% of the time), if you move files around on Android, their creation and modification times are reset to the current time.
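You can see the same effect on a desktop: whether timestamps survive depends entirely on how the copy is performed. A small stdlib illustration (not part of this script):

```python
import os
import shutil

def move_preserving_times(src: str, dst: str) -> None:
    # shutil.copy2 copies file metadata (mtime/atime) along with the content;
    # a plain byte-for-byte copy, like many Android file managers do, does not.
    shutil.copy2(src, dst)
    os.remove(src)
```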

The "Simple Gallery" app usually keeps the original file creation time when moving and copying (though I can't guarantee it). It's also pretty cool, and you can check it out: https://github.com/SimpleMobileTools/Simple-Gallery

What to do when you got rid of Google Photos? What are the alternatives?

  • I really recommend Syncthing for syncing your photos and files across devices. It syncs over your local Wi-Fi, so you're not dependent on any service or internet connection. It also keeps original file creation dates and metadata, which resolves the Android issue I mentioned above.

  • If you want something more centralized but still self-hosted, Nextcloud is a nice choice, but its approach to photos is still not perfect. (And you need to set up your own server.)

  • The folks at Photoprism are working on a full Google Photos alternative, with search, AI tagging, etc., but it's still a work in progress. (I will edit this when they are done, but I can't promise :P )

Google has changed folder structure

Around December 2020, Google stopped putting photos into thousands of "yyyy-mm-dd" folders, and started putting them into tens of "Photos from yyyy" folders instead 🙄

  • If you have the new "year folders" (that is, a few folders named like "Photos from 2012") (+albums) - use the newest version
    • pip install -U google-photos-takeout-helper
  • If you have the old "date folders" (that is, a ton of folders named like "2012-06-23") - use version 1.2.0
    • pip install -U google-photos-takeout-helper==1.2.0
      The old version is... well, old, and I recommend you just request the takeout again and run the newest version of the script on it 👍
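If you're unsure which layout your takeout uses, the folder names give it away. A quick check you could run yourself - the function is illustrative, not part of the script:

```python
import re

YEAR_FOLDER = re.compile(r"^Photos from \d{4}$")  # new layout (late 2020+)
DATE_FOLDER = re.compile(r"^\d{4}-\d{2}-\d{2}")   # old layout, incl. "... #2" variants

def takeout_layout(folder_names):
    """Guess the takeout layout from a list of folder names."""
    if any(YEAR_FOLDER.match(n) for n in folder_names):
        return "year-folders"   # use the newest version
    if any(DATE_FOLDER.match(n) for n in folder_names):
        return "date-folders"   # use version 1.2.0
    return "unknown"
```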

Other Takeout projects

I used this tool to export my notes to Markdown - you can then edit them with any Markdown editor you like :)

https://github.com/vHanda/google-keep-exporter

This one saves them in a format ready for Evernote/CintaNotes:

https://github.com/HardFork/KeepToText

TODO (Pull Requests welcome):

  • Videos' Exif data - probably impossible to do 😕
  • Gps data: from JSON to Exif - Thank you @DalenW 💖
  • Some way to handle albums - THANK YOU @bitsondatadev 😘 🎉 💃
  • Windoza standalone .exe file - Thank you, me 😘
Comments
  • Dies on certain images

    After running for 20+ hours, the script dies on a specific image, even though the image parses and displays fine. I have reproduced it with a directory containing just that image. This is running release 2.0 on Ubuntu Linux 20.10.

    The log looks like this -

    ~/.local/bin/google-photos-takeout-helper -i brokenimages -o testout
    Heeeere we go!
    =====================
    Fixing files metadata and creation dates...
    =====================
    brokenimages/IMG_4661(3).jpg
    Traceback (most recent call last):
      File "/home/jasontitus/.local/bin/google-photos-takeout-helper", line 8, in <module>
        sys.exit(main())
      File "/home/jasontitus/.local/lib/python3.8/site-packages/google_photos_takeout_helper/__main__.py", line 570, in main
        for_all_files_recursive(
      File "/home/jasontitus/.local/lib/python3.8/site-packages/google_photos_takeout_helper/__main__.py", line 114, in for_all_files_recursive
        file_function(file)
      File "/home/jasontitus/.local/lib/python3.8/site-packages/google_photos_takeout_helper/__main__.py", line 494, in fix_metadata
        set_creation_date_from_exif(file)
      File "/home/jasontitus/.local/lib/python3.8/site-packages/google_photos_takeout_helper/__main__.py", line 343, in set_creation_date_from_exif
        exif_dict = _piexif.load(str(file))
      File "/home/jasontitus/.local/lib/python3.8/site-packages/piexif/_load.py", line 43, in load
        exif_dict["Exif"] = exifReader.get_ifd_dict(pointer, "Exif")
      File "/home/jasontitus/.local/lib/python3.8/site-packages/piexif/_load.py", line 118, in get_ifd_dict
        tag = struct.unpack(self.endian_mark + "H",
    struct.error: unpack requires a buffer of 2 bytes
    
    The jpeginfo output for the file is -
    jpeginfo -c brokenimages/IMG_4661\(3\).jpg 
    brokenimages/IMG_4661(3).jpg 2592 x 1936 24bit Exif  N 2167440  [OK]
    

    Here is a link to the file
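    The traceback bottoms out in Python's struct module: piexif asks for 2 more bytes than the (apparently truncated) EXIF blob contains. A defensive wrapper along these lines - hypothetical, not what the script shipped - would let such files fall back to JSON/filename dates instead of crashing:

    ```python
    import struct

    def safe_exif_load(load_fn, path):
        """Run an EXIF loader, returning None instead of dying on corrupt data."""
        try:
            return load_fn(path)  # e.g. piexif.load
        except struct.error:
            return None  # truncated/corrupt EXIF: caller uses another date source
    ```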

    opened by jasontitus 28
  • No folders named like the README instructions?

    Hi, thought I'd give this a shot; however, I noticed that after using Takeout, I don't have any folders in the YYYY-MM-DD format or anything like that. Each takeout folder is mostly comprised of albums, with the closest being "Photos from YYYY". I'm not sure if this is because it originally synced with Google Drive, or what.

    And also, this might be unrelated, but on macOS, can I just copy-merge the folders so I have ./Takeout/Google\ Photos/<everything copy-merged here>, or should each Takeout folder be put into its own folder (no merge), like ./folder/<Takeout [1-N]>?

    opened by hnlkaitan 27
  • Refactor duplication and Add Albums

    Fixes: #10 #22 #30

    This is still a draft for now, just to get my current thoughts/progress out there so we can start discussions. Testing hasn't been done on a big takeout folder, only a smaller subset.

    Changes/Method:

    • I've updated deduplication to happen after the exif-fix phase and the file-moving phase.
    • Deduplication now scans globally instead of per date/album folder.
    • You no longer have to remove album folders.
    • Album folders are scanned as well, in case they contain photos that don't exist in the date folders.
    • Once all files are in the output folder, we scan that location for duplicates.
    • Once the duplicates are removed from the output folder, album folders are scanned once again, and each album photo is matched with the file that already exists in the output folder if it is a duplicate. (This part still doesn't check for duplicates yet; I'm still thinking it through.)
    • Albums are currently exported via a JSON file. This will be easy to change to other formats once this code is reviewed, tested, and in place.
    opened by bitsondatadev 22
  • Use hashing to determine albums.

    With the hashing solution, we could run duplicate detection between the files we currently generate and the files in albums.

    >>> image_file = open('2017-11-11/20171111_170331.jpg', 'rb').read()
    >>> image_file2 = open('Saturday in Rockford/20171111_170331.jpg', 'rb').read()
    >>> hashlib.md5(image_file).hexdigest() == hashlib.md5(image_file2).hexdigest()
    True
    

    If we find a match, we can create a file called albums.json (or something to that effect) and use the current directory name as the album name. The JSON could look something like this:

    [
      {
        "Saturday in Rockford": [
          "20171111_170331.jpg",
          "mySecondIMG2.png",
           ....
        ]
      }, 
      ...
    ]
    

    As much as I hate to introduce yet another JSON file, after we worked so hard to remove them from the takeout folder, this is required, since we can't assume an image belongs to only one album and just stick it in there. I'm open to hearing other solutions, though.

    An alternative would be to just create album folders and allow duplicates in those folders.
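    The whole idea can be sketched end to end. A toy version - the names and the chunked read are my choices, not this PR's code:

    ```python
    import hashlib
    import pathlib

    def file_hash(path: pathlib.Path, chunk: int = 1 << 20) -> str:
        """MD5 of a file, read in chunks so big photos don't load into RAM at once."""
        h = hashlib.md5()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def build_albums(output_dir: pathlib.Path, album_dirs) -> dict:
        """Map album name -> output filenames whose content matches an album file."""
        by_hash = {file_hash(p): p.name for p in output_dir.iterdir() if p.is_file()}
        albums = {}
        for album in album_dirs:
            matches = [by_hash[h] for p in sorted(album.iterdir())
                       if p.is_file() and (h := file_hash(p)) in by_hash]
            if matches:
                albums[album.name] = matches
        return albums
    ```

    The resulting dict can then be dumped with json.dump to produce the albums.json shape shown above.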

    enhancement 
    opened by bitsondatadev 22
  • Issue with date?

    Takeout/Google Fotos//2003-12-31/CIMG4562.JPG
    Takeout/Google Fotos//2014-11-28/IMG_20141128_170758.jpg
    Takeout/Google Fotos//2018-05-04/IMG_20180504_144625.jpg
    Takeout/Google Fotos//2011-09-09/IMAG0180.jpg
    
    time data '2011/09/10 02:38:43' does not match format '%Y:%m:%d %H:%M:%S'
    
    ==========!!!==========
    You probably forgot to remove 'album folders' from your takeout folder
    Please do that - see README.md or --help for why
    
    Once you do this, just run it again :)
    ==========!!!==========
    

    I also removed all album folders...?!

    root@linux:/mnt/data2# ls Takeout/Google\ Fotos/
    '1970-01-01 #2'   2006-07-16-17    2009-11-06       2011-10-06       2014-09-19       2015-10-02       2016-07-11       2017-09-01       2018-11-17       2019-07-24       2020-04-25
     2003-02-28       2006-08-21       2009-11-24-25    2011-10-07       2014-09-20       2015-10-03       2016-07-12       2017-09-03       2018-11-21       2019-07-25       2020-04-29
     2003-03-01       2006-10-31       2009-11-25-27    2011-10-11       2014-09-22       2015-10-06       2016-07-14       2017-09-09       2018-11-24       2019-07-26       2020-05-02
     2003-03-01-02    2006-12-24       2009-11-28       2011-10-13       2014-09-23       2015-10-07       2016-07-22       2017-09-11       2018-11-26       2019-07-28       2020-05-04
     2003-03-02-03    2006-12-26       2009-11-29       2011-10-25       2014-09-24       2015-10-08       2016-07-26       2017-09-15       2018-11-27       2019-08-02      '2020-05-04 #2'
     2003-03-03       2006-12-29       2009-11-30       2011-11-07       2014-09-26       2015-10-11       2016-07-29       2017-09-17       2018-12-01       2019-08-07       2020-05-06
     2003-12-31       2006-12-31       2009-12-04       2011-11-23       2014-09-28       2015-10-13       2016-08-01       2017-09-21       2018-12-02       2019-08-09       2020-05-08
     2004-01-01       2007-01-04       2009-12-08       2011-11-24       2014-09-29       2015-10-14       2016-08-13       2017-09-27       2018-12-03       2019-08-17       2020-05-09
     2004-01-07       2007-01-05       2009-12-24       2011-11-26       2014-10-02       2015-10-17       2016-08-15      '2017-09-27 #2'   2018-12-06       2019-08-18       2020-05-11
     2004-04-04       2007-04-08       2009-12-25       2011-12-01       2014-10-03       2015-10-18       2016-08-16       2017-10-05       2018-12-06-07    2019-08-20       2020-05-12
     2004-05-07       2007-04-09       2009-12-26       2011-12-03       2014-10-13       2015-10-19       2016-08-17       2017-10-10       2018-12-08       2019-08-26       2020-05-15
     2004-05-08       2007-04-21       2010-01-14       2011-12-12       2014-10-16       2015-10-23      '2016-08-17 #2'   2017-10-15       2018-12-09       2019-08-31       2020-05-16
     2004-05-09       2007-04-22       2010-02-06       2011-12-31       2014-10-23       2015-10-26       2016-08-18       2017-10-17       2018-12-10       2019-09-03       2020-05-18
     2004-05-29      '2007-04-22 #2'   2010-02-11       2012-01-06       2014-10-26       2015-10-29       2016-08-22       2017-10-24       2018-12-12       2019-09-07       2020-05-20
     2004-05-30       2007-04-23      '2010-02-11 #2'  '2012-01-06 #2'   2014-10-27       2015-11-04       2016-08-24       2017-10-29       2018-12-16       2019-09-10       2020-05-22
     2004-07-17       2007-04-25       2010-02-13       2012-01-23       2014-11-01       2015-11-06       2016-08-25       2017-11-07       2018-12-17       2019-09-12       2020-05-23
     2004-07-20      '2007-04-25 #2'   2010-05-12       2012-01-26       2014-11-05       2015-11-09       2016-08-26       2017-11-08       2018-12-20       2019-09-13       2020-05-25
     2004-07-24       2007-06-27       2010-06-05       2012-02-05       2014-11-06       2015-11-10       2016-08-27      '2017-11-08 #2'   2018-12-21       2019-09-14       2020-05-28
     2004-07-29       2007-06-28       2010-06-05-06    2012-02-10       2014-11-12       2015-11-14       2016-08-28       2017-11-09       2018-12-22      '2019-09-14 #2'   2020-05-29
     2004-07-30       2007-06-30       2010-06-07       2012-02-12       2014-11-21      '2015-11-14 #2'   2016-08-29       2017-11-10       2018-12-23       2019-09-17       2020-05-30
     2004-07-30-31   '2007-06-30 #2'   2010-06-08       2012-02-15       2014-11-22       2015-11-21       2016-08-30       2017-11-14       2018-12-27       2019-09-19       2020-05-31
     2004-08-01       2007-07-01       2010-06-09       2012-02-17       2014-11-24       2015-11-22       2016-08-31       2017-11-23       2018-12-28       2019-09-21       2020-06-01
     2004-08-02       2007-07-02       2010-06-10       2012-02-18       2014-11-26       2015-11-23       2016-09-01       2017-11-30       2018-12-29       2019-09-22       2020-06-02
     2004-08-03       2007-07-03       2010-06-11       2012-02-19       2014-11-28       2015-11-24       2016-09-03       2017-12-01      '2018-12-29 #2'   2019-09-26       2020-06-05
     2004-08-04       2007-07-04       2010-06-12       2012-02-20       2014-12-02       2015-11-25       2016-09-06      '2017-12-01 #2'   2018-12-30       2019-09-27      '2020-06-05 #2'
     2004-08-05       2007-07-05       2010-06-13       2012-02-22       2014-12-03       2015-11-26       2016-09-07       2017-12-04       2019-01-01      '2019-09-27 #2'   2020-06-06
     2004-08-06      '2007-07-05 #2'   2010-06-18       2012-03-11       2014-12-04       2015-11-29       2016-09-08       2017-12-06       2019-01-02       2019-09-29       2020-06-10
     2004-08-07      '2007-07-06 #2'   2010-06-24       2012-03-12       2014-12-05       2015-11-30       2016-09-11       2017-12-08       2019-01-03       2019-09-30       2020-06-12
     2004-08-08      '2007-07-06 #3'   2010-07-03       2012-03-17       2014-12-06       2015-12-02       2016-09-12       2017-12-13       2019-01-06       2019-10-06       2020-06-13
     2004-08-09       2007-07-07       2010-07-05       2012-03-19       2014-12-07       2015-12-03       2016-09-13       2018-01-03       2019-01-07       2019-10-08       2020-06-14
    '2004-08-09 #2'  '2007-07-08 #2'   2010-07-16       2012-03-20       2014-12-11       2015-12-05       2016-09-17       2018-01-04       2019-01-08       2019-10-09       2020-06-17
     2004-08-10      '2007-07-09 #2'   2010-07-16-17    2012-03-24       2014-12-15       2015-12-06       2016-09-21       2018-01-06      '2019-01-08 #2'   2019-10-14       2020-06-19
     2004-08-11       2007-07-09-10    2010-07-22       2012-03-26       2014-12-16       2015-12-07       2016-09-24       2018-01-15       2019-01-08-09    2019-10-15       2020-06-20
     2004-08-12       2007-07-11       2010-07-23       2012-04-07       2014-12-22       2015-12-12       2016-10-01       2018-01-18       2019-01-10       2019-10-19       2020-06-24
     2004-08-13       2007-07-12       2010-07-24       2012-04-08       2014-12-24       2015-12-13       2016-10-03       2018-01-21       2019-01-11       2019-10-20       2020-06-26
     2004-08-14       2007-07-13       2010-07-25       2012-05-09       2014-12-26       2015-12-14       2016-10-04       2018-01-23       2019-01-12       2019-10-25       2020-06-29
     2004-08-15       2007-07-13-14    2010-07-26       2012-05-13       2014-12-31       2015-12-15       2016-10-05       2018-01-24       2019-01-16       2019-11-02       2020-06-30
     2004-08-16       2007-07-14-15    2010-07-28       2012-05-18       2015-01-06       2015-12-18       2016-10-14       2018-01-30       2019-01-16-17    2019-11-04       2020-07-01
     2004-08-17       2007-07-15       2010-07-29      '2012-05-26 #3'   2015-01-09       2015-12-23       2016-10-19       2018-02-01       2019-01-25       2019-11-05       2020-07-02
     2004-08-17-18    2007-07-16       2010-07-30      '2012-05-27 #2'   2015-01-11       2015-12-31       2016-10-28      '2018-02-01 #2'   2019-01-26       2019-11-07       2020-07-03
     2004-08-18       2007-07-17       2010-07-31      '2012-05-28 #2'   2015-01-12       2016-01-04       2016-11-15       2018-02-02       2019-01-27       2019-11-08       2020-07-06
     2004-08-19       2007-07-18       2010-08-01      '2012-05-29 #3'   2015-01-17       2016-01-05       2016-11-19       2018-02-04       2019-01-29       2019-11-11       2020-07-07
     2004-08-20      '2007-07-18 #2'   2010-08-02      '2012-05-29 #4'  '2015-01-17 #2'   2016-01-11       2016-11-29       2018-02-05       2019-01-31       2019-11-12       2020-07-09
    '2004-08-20 #2'   2007-07-19       2010-08-04      '2012-05-30 #2'   2015-01-18       2016-01-15       2016-11-30       2018-02-06       2019-02-01       2019-11-14      '2020-07-09 #2'
     2004-08-21      '2007-07-19 #2'   2010-08-05      '2012-05-31 #2'   2015-01-19       2016-01-25       2016-12-01       2018-02-07       2019-02-02       2019-11-15       2020-07-10
     2004-08-26       2007-07-21       2010-08-06      '2012-06-02 #2'   2015-01-20       2016-01-26       2016-12-04       2018-02-12       2019-02-06       2019-11-16       2020-07-11
     2004-08-27      '2007-07-21 #2'   2010-08-09      '2012-06-03 #2'   2015-01-30       2016-02-04       2016-12-08       2018-02-23       2019-02-09       2019-11-22       2020-07-12
     2004-08-31       2007-07-22       2010-08-14      '2012-06-04 #2'   2015-02-01       2016-02-08       2016-12-13       2018-03-12       2019-02-11       2019-11-23       2020-07-16
     2004-09-01      '2007-07-22 #2'   2010-08-26-27   '2012-06-05 #2'   2015-02-09       2016-02-25       2016-12-28       2018-03-13      '2019-02-11 #2'   2019-11-25       2020-07-18
     2004-09-08       2007-07-29      '2010-08-27 #2'  '2012-06-06 #2'   2015-02-10       2016-02-26       2016-12-30       2018-03-15       2019-02-12       2019-11-29       2020-07-21
     2004-09-11       2007-07-30       2010-09-10       2012-06-07       2015-02-15       2016-02-27       2016-12-31       2018-03-19       2019-02-13       2019-11-30       2020-07-26
     2004-09-16       2007-08-12-13    2010-09-10-11   '2012-06-09 #3'   2015-02-17       2016-02-29       2017-01-07       2018-03-20       2019-02-17       2019-12-01       2020-07-29
     2004-09-20       2007-08-24       2010-10-01      '2012-06-09 #4'   2015-02-19       2016-03-04       2017-01-09       2018-03-24       2019-02-20       2019-12-02       2020-07-31
     2004-09-21       2007-12-07       2010-10-08       2012-06-19       2015-02-20       2016-03-13      '2017-01-09 #2'   2018-03-31       2019-02-24       2019-12-05       2020-08-06
     2004-09-29       2007-12-14       2010-10-09       2012-06-23       2015-03-01       2016-03-16       2017-01-12       2018-04-04       2019-02-25       2019-12-08       2020-08-07
     2004-09-30       2007-12-31       2010-10-10       2012-06-29       2015-03-06       2016-03-17       2017-01-15       2018-04-08       2019-02-26       2019-12-11       2020-08-10
     2004-10-04      '2007-12-31 #2'   2010-10-11       2012-07-29       2015-03-07       2016-03-18       2017-01-17       2018-04-09       2019-02-27       2019-12-12       2020-08-11
     2004-10-05      '2008-02-08 #3'   2010-10-12       2012-10-03       2015-03-10       2016-03-18-19    2017-01-18      '2018-04-09 #2'   2019-02-28       2019-12-13      '2020-08-11 #2'
     2004-10-07       2008-04-07       2010-10-13       2012-10-22       2015-03-16       2016-03-20       2017-01-22       2018-04-14       2019-03-03       2019-12-14       2020-08-12
     2004-10-17       2008-04-22       2010-10-14       2012-10-26       2015-03-18       2016-03-21       2017-01-23       2018-04-15       2019-03-05       2019-12-16       2020-08-13
     2004-11-01       2008-04-23       2010-10-19       2012-11-10       2015-03-19       2016-03-22       2017-01-26       2018-04-17       2019-03-08       2019-12-17       2020-08-14
     2004-11-02       2008-04-26       2010-11-12       2012-11-14       2015-03-20       2016-03-23      '2017-01-26 #2'   2018-04-19       2019-03-09       2019-12-18       2020-08-15
     2004-11-10       2008-05-01      '2010-11-12 #2'   2012-12-06       2015-03-24       2016-03-24       2017-02-05       2018-04-21       2019-03-11       2019-12-22       2020-08-22
     2004-11-13       2008-05-02       2010-11-16       2013-02-24       2015-03-30       2016-03-25       2017-02-11       2018-04-23       2019-03-14       2019-12-23       2020-08-23
     2004-11-16       2008-05-03       2010-11-20       2013-03-01       2015-04-01       2016-03-26       2017-02-13       2018-04-24      '2019-03-14 #2'   2019-12-24       2020-08-24
     2004-12-21       2008-05-04       2010-11-27       2013-03-02      '2015-04-01 #2'   2016-03-27       2017-02-15       2018-04-27       2019-03-15       2019-12-26       2020-08-25
     2005-01-01       2008-05-14      '2010-11-27 #2'  '2013-03-02 #2'   2015-04-02       2016-03-28       2017-02-22       2018-04-30       2019-03-17       2019-12-31       2020-08-26
     2005-01-04       2008-05-17-18    2010-12-18       2013-03-03      '2015-04-02 #2'   2016-03-29       2017-03-02       2018-05-04      '2019-03-17 #2'   2020-01-01       2020-08-27
     2005-01-08       2008-05-19       2010-12-24       2013-03-04       2015-04-03       2016-03-30       2017-03-11       2018-05-05       2019-03-18       2020-01-02       2020-08-28
     2005-01-27       2008-05-20       2010-12-25       2013-03-10       2015-04-11       2016-04-01       2017-03-13       2018-05-26       2019-03-22       2020-01-03       2020-08-31
     2005-01-31       2008-05-21       2011-01-02      '2013-03-10 #2'   2015-04-15       2016-04-02       2017-03-15       2018-05-30       2019-03-25       2020-01-06       2020-09-01
     2005-02-03       2008-05-22       2011-01-08       2013-03-11       2015-04-16       2016-04-03       2017-03-18       2018-06-09       2019-03-26       2020-01-08       2020-09-03
     2005-02-04       2008-05-23       2011-01-24       2013-03-30       2015-04-20       2016-04-06       2017-03-19       2018-06-15       2019-03-27       2020-01-09       2020-09-04
     2005-02-05       2008-05-24       2011-02-04       2013-03-31       2015-04-27       2016-04-07      '2017-03-19 #2'   2018-06-20       2019-03-28       2020-01-10       2020-09-06
     2005-02-06       2008-05-26       2011-02-05       2013-04-15       2015-05-11       2016-04-08       2017-03-20       2018-06-22       2019-03-29       2020-01-13       2020-09-07
     2005-02-07       2008-05-27       2011-02-07       2013-04-22       2015-05-21       2016-04-11       2017-03-23       2018-06-23       2019-03-31       2020-01-16       2020-09-08
     2005-02-08       2008-05-28       2011-02-08       2013-04-28       2015-05-22       2016-04-13       2017-03-25       2018-06-25       2019-04-01       2020-01-21       2020-09-10
     2005-02-15       2008-05-31       2011-02-11       2013-04-29       2015-05-23       2016-04-15       2017-03-30       2018-07-01       2019-04-14       2020-01-22       2020-09-21
     2005-02-21       2008-06-13       2011-03-03       2013-05-19      '2015-05-23 #2'   2016-04-16       2017-03-31       2018-07-06       2019-04-16       2020-01-23       2020-09-22
     2005-02-22       2008-06-14       2011-03-09       2013-06-20       2015-05-24      '2016-04-16 #2'   2017-04-01       2018-07-07       2019-04-17       2020-01-31       2020-09-23
     2005-03-02       2008-07-05       2011-03-17       2013-07-20       2015-05-25       2016-04-21       2017-04-02       2018-07-11       2019-04-18       2020-02-03       2020-09-29
     2005-03-27       2008-07-06       2011-03-23       2013-07-21       2015-05-29       2016-04-24-25    2017-04-07       2018-07-13       2019-04-21       2020-02-04       2020-09-30
     2005-03-28       2008-07-07       2011-04-01       2013-07-25       2015-05-30       2016-04-26       2017-04-08       2018-07-14       2019-04-24       2020-02-08       2020-10-01
     2005-04-23       2008-07-08       2011-04-04       2013-07-26      '2015-05-30 #2'  '2016-04-26 #2'   2017-04-13       2018-07-20       2019-04-28       2020-02-11       2020-10-03
     2005-05-02       2008-07-10       2011-04-05       2013-08-04       2015-06-03       2016-04-27       2017-04-14       2018-07-22       2019-04-30       2020-02-12      '2020-10-03 #2'
     2005-05-09       2008-07-11       2011-04-16       2013-08-07      '2015-06-03 #2'   2016-04-29       2017-04-16       2018-07-24       2019-05-02       2020-02-13       2020-10-05
     2005-06-01       2008-07-12       2011-04-18       2013-10-04       2015-06-06       2016-04-30       2017-04-18       2018-07-26       2019-05-03       2020-02-15       2020-10-06
     2005-06-16       2008-07-30       2011-04-19       2013-11-01       2015-06-09       2016-05-01       2017-04-26       2018-07-27       2019-05-06       2020-02-18       2020-10-07
     2005-06-21       2008-07-31       2011-04-21       2013-11-12       2015-06-11       2016-05-03       2017-04-29       2018-07-28       2019-05-07       2020-02-19       2020-10-08
     2005-06-24       2008-08-11       2011-05-13       2013-11-18       2015-06-12      '2016-05-03 #2'   2017-05-02       2018-07-29       2019-05-09       2020-02-21       2020-10-09
     2005-06-27       2008-08-12       2011-05-14-15    2013-12-05       2015-06-16       2016-05-04       2017-05-09       2018-07-30       2019-05-11       2020-02-22       2020-10-11
     2005-06-28       2008-08-13       2011-05-16       2013-12-31       2015-06-18       2016-05-09       2017-05-11       2018-08-01       2019-05-13       2020-02-29       2020-10-12
     2005-06-30       2008-08-14       2011-05-17       2014-01-23       2015-06-22       2016-05-10       2017-05-12       2018-08-02       2019-05-16       2020-03-02      '2020-10-12 #2'
     2005-08-15       2008-08-15       2011-05-18       2014-02-18       2015-06-25       2016-05-11       2017-05-16       2018-08-05       2019-05-18       2020-03-03       2020-10-13
     2005-08-16-17    2008-09-20       2011-05-19       2014-02-22       2015-07-01       2016-05-12      '2017-05-16 #2'   2018-08-06       2019-05-20       2020-03-05       2020-10-15
     2005-08-23       2008-12-24       2011-05-20       2014-03-01       2015-07-07       2016-05-13       2017-05-20       2018-08-08      '2019-05-20 #2'   2020-03-09       2020-10-17
     2005-09-08       2008-12-25       2011-05-24       2014-03-06       2015-07-08       2016-05-17       2017-05-22       2018-08-16      '2019-05-20 #3'   2020-03-10       2020-10-18
     2006-01-13-14    2008-12-26       2011-05-26       2014-03-17       2015-07-09       2016-05-18       2017-05-23       2018-08-20       2019-05-21       2020-03-11       2020-10-19
     2006-01-14       2009-01-09       2011-06-08       2014-05-06       2015-07-13      '2016-05-18 #2'   2017-05-24       2018-08-21       2019-05-25       2020-03-15       2020-10-20
     2006-01-15      '2009-01-09 #2'   2011-06-15       2014-05-07       2015-07-14       2016-05-20       2017-05-28       2018-08-23       2019-05-28       2020-03-16       2020-10-21
     2006-01-16       2009-01-10       2011-06-22       2014-05-22       2015-07-20       2016-05-22       2017-05-29       2018-08-24       2019-05-29       2020-03-17       2020-10-22
    '2006-01-16 #2'   2009-01-11       2011-06-23       2014-05-24       2015-07-23       2016-05-26       2017-06-04       2018-08-30       2019-05-31       2020-03-18       2020-10-25
     2006-01-17       2009-05-01       2011-06-28       2014-06-09       2015-07-25       2016-05-27       2017-06-06       2018-09-03       2019-06-02       2020-03-19       2020-10-26
     2006-01-18       2009-05-02       2011-06-29       2014-06-11       2015-07-31       2016-05-28       2017-06-07       2018-09-07       2019-06-03       2020-03-20      '2020-10-26 #2'
     2006-01-19       2009-05-03       2011-06-30       2014-06-15       2015-08-02      '2016-05-28 #2'   2017-06-09       2018-09-10       2019-06-08       2020-03-21       2020-10-27
     2006-01-20       2009-06-30       2011-07-13       2014-06-22       2015-08-05       2016-05-30       2017-06-14       2018-09-11       2019-06-16       2020-03-22       2020-10-28
     2006-02-23       2009-07-19       2011-07-19       2014-06-23       2015-08-06       2016-06-01       2017-06-17       2018-09-12       2019-06-20       2020-03-23       2020-10-31
     2006-02-24       2009-08-22       2011-07-25       2014-07-04       2015-08-08       2016-06-02       2017-07-03       2018-09-16       2019-06-24       2020-03-24       2020-11-01
     2006-02-25       2009-09-12       2011-08-08       2014-07-10       2015-08-11       2016-06-04       2017-07-06       2018-09-21       2019-06-26       2020-03-24-25    2020-11-02
     2006-02-26       2009-09-13       2011-08-09       2014-07-11       2015-08-13       2016-06-06       2017-07-07       2018-09-26       2019-06-28       2020-03-26       2020-11-06
     2006-02-27      '2009-09-13 #2'   2011-08-10       2014-07-13       2015-08-14       2016-06-07       2017-07-08       2018-09-28      '2019-06-28 #2'   2020-03-27       2020-11-07
     2006-02-28       2009-09-14       2011-08-13       2014-07-18       2015-08-17       2016-06-08      '2017-07-08 #2'   2018-09-30       2019-06-29       2020-03-28       2020-11-08
     2006-03-04       2009-09-15-16    2011-08-14       2014-07-19       2015-08-19       2016-06-09       2017-07-13       2018-10-01       2019-07-01       2020-03-31       2020-11-12
     2006-04-09       2009-09-16-18    2011-08-15       2014-07-20       2015-08-23       2016-06-10       2017-07-26       2018-10-05       2019-07-06       2020-04-02       2020-11-13
     2006-04-10       2009-09-19       2011-08-16       2014-07-24       2015-08-27       2016-06-12       2017-08-05       2018-10-08       2019-07-08       2020-04-04       2020-11-14
     2006-04-16       2009-09-19-20    2011-08-17       2014-08-01       2015-08-29       2016-06-14       2017-08-06       2018-10-13       2019-07-09       2020-04-05       2020-11-15
     2006-05-16       2009-09-20       2011-08-18       2014-08-02       2015-09-01       2016-06-15       2017-08-07       2018-10-26       2019-07-13       2020-04-06       2020-11-17
     2006-05-22       2009-09-20-21    2011-08-19       2014-08-04      '2015-09-01 #2'   2016-06-16       2017-08-08       2018-10-30       2019-07-14       2020-04-07       2020-11-20
     2006-06-08       2009-09-21-22    2011-08-20       2014-08-06       2015-09-07       2016-06-17       2017-08-11       2018-10-31      '2019-07-14 #2'   2020-04-08       2020-11-22
     2006-07-10       2009-09-22-23    2011-08-21       2014-08-08       2015-09-14       2016-06-19       2017-08-12       2018-11-03       2019-07-15       2020-04-09       2020-11-23
     2006-07-11       2009-09-24       2011-08-22       2014-08-17       2015-09-15       2016-06-21       2017-08-15       2018-11-04       2019-07-16       2020-04-14       2020-11-24
    '2006-07-11 #2'   2009-09-25       2011-08-23       2014-08-19       2015-09-16       2016-06-25       2017-08-18       2018-11-06       2019-07-17       2020-04-16
     2006-07-12       2009-09-26       2011-09-09       2014-08-21       2015-09-17       2016-06-27       2017-08-20       2018-11-08       2019-07-18       2020-04-18
    '2006-07-12 #2'   2009-09-27       2011-09-16       2014-08-24       2015-09-18       2016-06-29       2017-08-21       2018-11-12       2019-07-19      '2020-04-18 #2'
     2006-07-13       2009-09-28       2011-09-22       2014-08-31       2015-09-27-28    2016-07-01       2017-08-22       2018-11-13      '2019-07-19 #2'   2020-04-20
     2006-07-14      '2009-09-28 #2'   2011-10-01       2014-09-09       2015-09-29       2016-07-05       2017-08-28      '2018-11-13 #2'   2019-07-22       2020-04-21
     2006-07-15       2009-09-29-30    2011-10-04       2014-09-10       2015-09-30       2016-07-07       2017-08-29       2018-11-15       2019-07-23       2020-04-22
    
    opened by comfreak89 20
  • Question - Help Needed

    Hi - I joined GitHub just so I can send you a message, so I hope you see this. I'm attempting to use Google's Takeout for backing up my photos. After much frustration and googling, I found your code, which could help me combine everything into one folder and get the correct dates on the files, if only I understood how to use it. Beyond being able to make my way through VBA and HTML code, I'm clueless with actual coding. I see your instructions, but I don't understand anything past downloading your file and making sure there are only numbered folders. For example, what does step 2 mean? Do I type step 4 in Terminal? If so, how exactly do I type it? Is there any chance you could send me step-by-step instructions for the coding illiterate? If not, I understand, but I figured it didn't hurt to ask, as it would be AMAZING if I got it to work. I'm on a Mac running 10.10.3 and confirmed I have Python 2.7.6 installed. Thank you so much for taking the time to read this! Lindsay

    opened by Lindsay0385 20
  • HEIC Support for EXIF data

    Hello, is it possible to add HEIC support? I used an iPhone for some years and those photos are in HEIC format, but the script only supports JPG and TIF.

    enhancement 
    opened by alexhellmann 18
  • Not all photos present in yyyy-mm-dd named folders

    The instructions indicate

    Before running this script, you need to cut out all folders that aren't dates That is, all album folders, and everything that isn't named 2016-06-16 (or with "#", they are good) See README.md or --help on why (Don't worry, your photos from albums are already in some date folder)

    This is however not true in my experience. I have exported photos from two different Google Accounts and each contains hundreds of photos that exist in custom named album folders which do not exist in any of the folders named yyyy-mm-dd.

    help wanted important 
    opened by rtadams89 17
  • Unit tests

    The goal of this PR is to establish an integration test baseline to build upon in the future. The foundation to confidently move forward with future development is to avoid regressions. Since the whole tool can be thought of as a function with input parameters, deterministic behaviour and a result, it makes sense to assert the correctness on this level first before caring more about the internals. As long as this works, the rest doesn't matter too much.

    The rough plan is:

    1. Have an artificial, curated, fake google takeout in a folder (input), containing examples for all cases (including weird edge cases).
    2. Let GPTH process the folder (output).
    3. Ensure that the generated output equals a static reference (reference).
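
    Step 3 of this plan amounts to comparing two directory trees byte for byte. A minimal Python sketch of that comparison (`trees_equal` is a hypothetical helper for illustration, not GPTH's actual test code):

```python
from pathlib import Path

def trees_equal(output, reference):
    """True if both directory trees contain the same relative file
    paths with byte-for-byte identical contents."""
    out, ref = Path(output), Path(reference)
    out_files = sorted(p.relative_to(out) for p in out.rglob('*') if p.is_file())
    ref_files = sorted(p.relative_to(ref) for p in ref.rglob('*') if p.is_file())
    if out_files != ref_files:
        return False  # a file is missing or extra
    return all((out / p).read_bytes() == (ref / p).read_bytes()
               for p in out_files)
```

    In practice the reference tree would be committed next to the fake Takeout input, so any behaviour change shows up as a concrete file diff.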

    Opening this PR as an early draft to discuss the approach.

    The next steps required to start reaping the benefits of this are

    • [x] Resolve TODOs (actually implement comparison with the reference)
    • [x] Add a variety of input data (the broader the spectrum, the better).

    Future steps would be to integrate this into some kind of CI, but even without it, this is already very helpful, since a quick test run on a local checkout is easy.

    @TheLastGimbus Regarding the second point, you mentioned that you already collected test data from other users. Since sending them over to you and having them permanently stored in a public repo is still something different, I also added a script to remove the actual image data from the files while preserving the EXIF metadata. It would be great if you could clean and provide what you have collected; that would probably already be a very solid test data set. The files I added so far are only to demonstrate the approach.

    tests 
    opened by sebastianludwig 14
  • JSON naming too long?

    I have several JSON files whose names are probably too long. For example:

    root@linux:/mnt/data2/Takeout/Google Fotos/2005-02-05# ls -l
    total 15111
    -rwxrwxrwx 1 root root    440 Nov 26 03:03  Metadaten.json
    -rwxrwxrwx 1 root root    752 Jan 12  2020 'Urlaub in Knaufspesch in der Schneifel (38).JP.json'
    -rwxrwxrwx 1 root root 341685 Feb  5  2005 'Urlaub in Knaufspesch in der Schneifel (38).JPG'
    -rwxrwxrwx 1 root root    752 Jan 12  2020 'Urlaub in Knaufspesch in der Schneifel (39).JP.json'
    -rwxrwxrwx 1 root root 330766 Feb  5  2005 'Urlaub in Knaufspesch in der Schneifel (39).JPG'
    -rwxrwxrwx 1 root root    752 Jan 12  2020 'Urlaub in Knaufspesch in der Schneifel (40).JP.json'
    -rwxrwxrwx 1 root root 315658 Feb  5  2005 'Urlaub in Knaufspesch in der Schneifel (40).JPG'
    -rwxrwxrwx 1 root root    752 Jul  3 07:34 'Urlaub in Knaufspesch in der Schneifel (41).JP.json'
    -rwxrwxrwx 1 root root 423738 Feb  5  2005 'Urlaub in Knaufspesch in der Schneifel (41).JPG'
    

    Could this be why your script does not find the JSON files?
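
    A sidecar lookup that tolerates these truncated names could work by prefix matching. A sketch (assuming only that the `.json` stem got cut short; `find_json_for` is a hypothetical helper, not the script's actual logic):

```python
def find_json_for(media_name, json_names):
    """Return the sidecar JSON whose (possibly truncated) stem is a
    prefix of the media file's name, e.g. '... (38).JP.json' for
    '... (38).JPG'; None if nothing matches."""
    for j in json_names:
        if not j.lower().endswith('.json'):
            continue
        stem = j[:-len('.json')]
        if media_name.startswith(stem):
            return j
    return None
```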

    bug 
    opened by comfreak89 14
  • Duplicate hashing

    I would like to suggest using a hash to determine duplicates of a file rather than the file size, as size alone can produce false positives, especially given the number of photos many people typically store in these services.

    MD5 is faster than SHA in most cases, so I would recommend we use it.

    >>> import hashlib
    >>> image_file = open('2017-11-11/20171111_170331.jpg', 'rb').read()
    >>> image_file2 = open('Saturday in Rockford/20171111_170331.jpg', 'rb').read()
    >>> hashlib.md5(image_file).hexdigest() == hashlib.md5(image_file2).hexdigest()
    True
    

    Thoughts?
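
    For large libraries, a chunked digest avoids loading whole files into memory. A sketch (`file_digest` is a hypothetical helper; note the file must be opened in binary mode):

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    """MD5 hex digest of a file, read in 1 MiB binary chunks so large
    photos and videos don't need to fit in memory at once."""
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()
```

    Comparing sizes first and hashing only the size collisions would keep the extra cost low.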

    opened by bitsondatadev 13
  • FileSystemException: Cannot set modification time

    It looked like it was going to work, but it didn't. My Takeouts are already extracted, and it spent about 10 minutes examining the 20K photos I have. Then it failed with the error shown in this screenshot.

    [screenshot: gpth-error]

    Also interesting that it can't guess the date from hundreds of files that have the date in the file name:

    Guessing dates from files : ███████................................. 4116/20773
    Can't get date on c:\temp\temp\takeout\Takeout\Google Photos\Photos from 2005\12-29-2010_019(1).JPG
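
    Filenames like `12-29-2010_019(1).JPG` do carry a date. A best-effort extraction could look like this (a sketch assuming an MM-DD-YYYY pattern; the tool's real guessing logic handles other formats):

```python
import re
from datetime import datetime

# Hypothetical: only handles MM-DD-YYYY embedded in the name.
_MDY = re.compile(r'(\d{2})-(\d{2})-(\d{4})')

def guess_date(filename):
    """Return a datetime guessed from the filename, or None."""
    m = _MDY.search(filename)
    if not m:
        return None
    month, day, year = (int(g) for g in m.groups())
    try:
        return datetime(year, month, day)
    except ValueError:  # e.g. 13-45-2010 is not a real date
        return None
```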

    opened by wjhladik 1
  • Include trash/archive, allow "Albums only"

    There's a need to include the "archive/trash" albums because they're not included in the year folders

    Looks like some people want to have just the albums in their export... will need to change current "photo discovery" algos to include that case...

    opened by TheLastGimbus 0
  • Change instructions and code to process Archive album

    The Archive album should be included in Takeout in addition to all the "Photos from XXXX" albums, since those photos are not in any of the "Photos from XXXX" folders. I personally keep important photos in Archive, just ones I don't want cluttering the main space. Not sure if your code will process this album if it's present.

    opened by wjhladik 0
  • Display a warning if photos don't have exifs

    as in #133 discussion:

    • [ ] detect what % of pictures have valid EXIF with a valid date
    • [ ] display a warning to the user if >15% don't
    • [ ] ...and link them to a guide on how to use exiftool to fix this
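
    The threshold check itself is simple. A sketch (assumes some `has_exif_date` callable, e.g. wrapping Pillow or exiftool, which is not shown here):

```python
def exif_warning(files, has_exif_date, threshold=0.15):
    """Return a warning string if more than `threshold` of `files`
    lack a valid EXIF date, else None."""
    if not files:
        return None
    missing = sum(1 for f in files if not has_exif_date(f))
    if missing / len(files) > threshold:
        return (f"{missing}/{len(files)} files have no EXIF date - "
                f"consider fixing them with exiftool first")
    return None
```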
    opened by TheLastGimbus 0
  • Nice message instead of exception when no photos

    I've got the contents of my Google Photos Takeout zip files extracted to c:\user\username\desktop\photos-decompress\Takeout\GooglePhotos\ (so it's a big folder full of dated subfolders which, in turn, contain the images and JSON files).

    Running v3.0, it appears that it can't find any contents in the specified folder, and I get the following error:

    C:\Users\Username\Downloads> gpth-v3.0.0-Windows.exe -i C:\users\Username\Desktop\photos-decompress\Takeout\GooglePhotos\ -o C:\users\Username\Desktop\photos-decompress\cu
    WARNING: Script will move files from input to output - not copy

    • this is faster, and doesn't use extra space, but will break your input folder (it will be, well, empty)
      If you want copy instead of move, exit script (ctrl-c) and use --copy flag
      Otherwise, press enter to agree with that

    Okay, running...
    searching for everything in input folder...
    Found 0 photos/videos in input folder
    Unhandled exception:
    Unsupported operation: Infinity or NaN toInt
    #0  FillingBar._render (package:console_bars/src/filling_bar.dart)
    #1  new FillingBar (package:console_bars/src/filling_bar.dart:73)
    #2  main (file:///d:/a/googlephotostakeouthelper/googlephotostakeouthelper/bin/gpth.dart:220)
    C:\Users\username\Downloads>

    opened by preppietechie 9
Releases: v3.2.0