Monday, January 18, 2021

Managing photos

Like many people, I have a great many photos that I wish to keep and to have access to on my computer and other devices. This is a short account of my approach to this complex area.

I have been interested in still photography since I was a teenager. I bought a Leica 1 camera then, and I have had a camera with me ever since. I moved to a digital camera as soon as these became available to consumers, and I have upgraded at intervals.

I became interested in videography in 1995, again upgrading at intervals. I now use professional video editing software (EDIUS) to edit the clips I take and then make DVDs and Blu-ray discs from them. I plan a separate blog about this. Here, I focus on digital still photos.

My photos came from the following sources:

    Analogue photos held in photo albums
    Analogue 35 mm slides
    Loose analogue photographs
    Digital photos taken with my digital camera
    Digital photos taken with my smart phone
    Digital photos from WhatsApp postings
    Digital photos sent to me in emails and by other means

I have used computers since 1986 and digital photos since 1988. There were many issues to consider then: how to choose and use a digital format and a database that were easily accessible to me and that others might also be able to use. I wanted the photos in a form that could be easily viewed and shared, with the location and the date taken as part of each record. I also wanted to name the people shown in the photos and to provide, in captions, some information about each picture, including the events and activities during which they were taken.

Digitizing the analogue photos was a major decision, and carrying it out took me a very long time: I used a semi-professional scanner for the black-and-white and colour prints and for many of the 35 mm colour slides.

Storing the digital images on a computer. From the start, I decided to store each photo in a directory named for the day on which it was taken. These directories are grouped into directories by month, and above that by year and by decade. Because the storage was initially built within a Microsoft website on my computer, and because I kept the same layout even after I stopped using a web for new photos, the structure has retained its original form. In this web, I also kept a variety of other graphics in their own directory structure.
I made these decisions because I wanted the structure to be expandable and easy to maintain.
Digitized older analogue photos were added to the directory structure according to my best guess of the date on which they were taken.

For example, current photos are in the directory structure:

"C:\Inetpub\wwwroot\Pictures\Photos\2021-2030\2021\2021_01\2021_01_17"

Deciding on a digital storage format. I chose the JPG file format for each photo, and nearly all my photos are JPGs. My views on storing the photos have changed as software has evolved. Initially, I set up an HTML web-based digital photo archive (as shown in the directory structure above). I learnt to write HTML and used commercial software to edit and improve the digital photos. The 'internal' web has never been publicly posted. I added photos to it and edited it for several years, but stopped doing this when Adobe introduced Elements. I added all the photos to this database and I still use it, running on my computer, as my only database for photos. I now have over 60,000 photos in over 4,000 folders, using 150 GB of storage.
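
Figures like these are easy to recompute. The short sketch below (Python, for illustration only; the root path is the example used earlier) tallies the number of JPGs, the folders that contain them and the total storage they use.

    # Sketch only: tally photo count, folder count and total size of the
    # archive. The root path is the example path used earlier in this post.
    from pathlib import Path

    ROOT = Path(r"C:\Inetpub\wwwroot\Pictures\Photos")

    jpgs = [p for p in ROOT.rglob("*") if p.suffix.lower() in (".jpg", ".jpeg")]
    folders = {p.parent for p in jpgs}
    total_gb = sum(p.stat().st_size for p in jpgs) / 1024**3

    print(f"{len(jpgs)} photos in {len(folders)} folders, {total_gb:.1f} GB")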

Storing photos in the cloud. I knew that cloud storage was 'the way forward', and I used online commercial sites to keep copies of my photos. Unfortunately, many of the early services closed down, and the photos were unceremoniously deleted from the cloud. This showed the value of keeping a master database under my own control at home, and I still do this.

Google Photos appeared to be the best option for me after the older cloud storage services closed or became difficult to use. I have bought 200 GB of space to hold all my photos and other graphic files, but not videos, DVDs or Blu-ray videos, which are stored only locally.

Location data. Some modern digital cameras and most smartphones can record GPS coordinates for each photo. Software can also take GPS coordinates from a smartphone and add them automatically to photos taken with digital cameras that have no GPS sensors. However, I have found that smartphone GPS data are not very accurate, and they are sometimes grossly inaccurate.
For this reason, I now carry an accurate GPS device, a multi-band Garmin GPSMAP 66sr, alongside my digital camera. It records GPX files from which accurate GPS coordinates can be applied to each photo; I use RoboGEO to do this. This approach adds an extra layer of complexity and time to transferring photos to my computer, and hence to Google Photos, but I try to do it each time I bring the camera and the GPS device back to the computer.
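
The essential step this kind of geotagging performs is matching each photo's timestamp to the nearest point in the GPX track log. The sketch below (Python, for illustration only; the file name and times are invented, and writing the coordinates back into the JPEG's EXIF would be a further step) shows that matching.

    # Sketch only: find the GPX track point nearest in time to a photo.
    # This illustrates the matching step; it is not RoboGEO and does not
    # write EXIF data. The file name and timestamp are invented examples.
    import xml.etree.ElementTree as ET
    from datetime import datetime, timezone

    NS = {"gpx": "http://www.topografix.com/GPX/1/1"}   # GPX 1.1 namespace

    def load_track(gpx_path):
        """Return (utc_time, lat, lon) tuples from a GPX track log."""
        points = []
        for pt in ET.parse(gpx_path).iterfind(".//gpx:trkpt", NS):
            stamp = pt.findtext("gpx:time", namespaces=NS).replace("Z", "+00:00")
            points.append((datetime.fromisoformat(stamp),
                           float(pt.get("lat")), float(pt.get("lon"))))
        return points

    track = load_track("2021_01_17.gpx")
    taken = datetime(2021, 1, 17, 14, 30, tzinfo=timezone.utc)   # photo time, UTC
    nearest = min(track, key=lambda p: abs(p[0] - taken))
    print("Use coordinates:", nearest[1], nearest[2])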

Adding captions to photos. This is a manual process, helped by the facial recognition that is now available both in Elements and in Google Photos. It does require careful checking and correction to keep the facial recognition accurate.

Synchronizing photos between Elements on my computer and Google Photos. This again is a manual process, involving uploading, downloading and transferring captions by cut and paste.

Sharing photos with family members and other people. I use Google Photos to do this and it works well. I do not attach photos to emails or send them directly to recipients; instead, recipients are sent a link and can view the photos online. Not attaching photos for each recipient was particularly important when upload speeds were slow.

Deleting photos. I make a point of deleting photos that are duplicates or unsatisfactory, but I do not classify photographs or assess their relative importance: I store them all.
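
I do this by eye, but finding exact duplicates is one part that lends itself to automation. The sketch below (Python, illustration only; the root path is the example used earlier) lists files whose contents are byte-for-byte identical so that they can be reviewed.

    # Sketch only: list groups of byte-identical files for manual review.
    # This illustrates automated duplicate detection; it is not the manual
    # process described above.
    import hashlib
    from collections import defaultdict
    from pathlib import Path

    ROOT = Path(r"C:\Inetpub\wwwroot\Pictures\Photos")

    groups = defaultdict(list)
    for path in ROOT.rglob("*"):
        if path.suffix.lower() in (".jpg", ".jpeg"):
            groups[hashlib.sha256(path.read_bytes()).hexdigest()].append(path)

    for paths in groups.values():
        if len(paths) > 1:                 # same bytes stored in more than one place
            print("Duplicate set:", *paths, sep="\n  ")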

Backing up photos. I ensure that all new files added to my computer are backed up at least once a day, to four separate hard drives used in turn on my computer and to a local NAS. Once a month, my main system drive is cloned to a backup drive, which is stored elsewhere.
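
As an illustration of the 'four drives in turn' idea, the sketch below (Python calling Windows' robocopy; the drive letters and paths are invented, and this is not my actual backup script) rotates the destination by the day of the year and mirrors the photo tree to it.

    # Sketch only: rotate between four backup drives and mirror the photo
    # tree with robocopy. Drive letters and paths are invented examples.
    import subprocess
    from datetime import date
    from pathlib import Path

    SOURCE = Path(r"C:\Inetpub\wwwroot\Pictures")
    BACKUPS = [r"E:\Backup", r"F:\Backup", r"G:\Backup", r"H:\Backup"]

    dest = Path(BACKUPS[date.today().toordinal() % len(BACKUPS)]) / "Pictures"
    # /MIR mirrors the tree; /FFT tolerates coarser timestamps on NAS/FAT drives.
    # robocopy returns non-zero exit codes even on success, so check=True is not used.
    subprocess.run(["robocopy", str(SOURCE), str(dest), "/MIR", "/FFT"])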

Conclusions. Everyone who takes photos has to decide how to deal with the many photos we now take. I suspect that some people are swamped, some simply delete them, and others let them accumulate on SD cards. If the photos are to be of any use to others, a procedure such as the one I have followed is clearly necessary.

I hope that in future photography will become less technical and easier, with automatic transfer of photos to the owner's computer and to cloud resources. One day, software will analyze the content of photos so that components other than people are identified and stored with each photo. When plants and wild animals are photographed, the records might automatically become part of the global resources that monitor them. I do this manually using iSpot, but again, it is another technical step that takes time and skill.

I should be interested to learn how other people handle this changing and complex area of the modern digital world we live in. It is not a trivial task!

We are only at the beginning of the processes needed to record and share digital graphical representations of the world around us, for personal use and to contribute to wider resources.

Written on Monday 18/01/2021 in Heol Senni


Friday, June 25, 2010

Computing in the cloud and securing my private data

Since my computers are all connected to the Internet, and there are many benefits in having my 'public' computing activities hosted or stored on public (free) computers, I have been assessing whether I can keep some of my personal or 'sensitive' files in the 'cloud'. I have concluded that none of my sensitive data should ever be put in the 'cloud'. The main reason is that, once these bits and bytes have been sent through my ISP's servers, they are in effect 'public', or at any rate potentially so; essentially, I have lost control of where or to whom they are sent. I have always considered my email, blogs, webs and so on as essentially open to all, or at the very least to external organizations that demand these files from ISPs and 'cloud' resource managers. That way, I hope that I have not put myself in a situation where these files could cause me problems. I am very polite in these public venues.

The items that should not be 'visible' to the outside world are my password lists, private diary entries and thoughts, my photos and personal documents, including scanned copies of my passport and my bank and credit card details. My personal address database is also private and needs to be protected. I store these files as 'invisible' items, for local use only, inside encrypted volumes on the local hard disks of my personal computers. These files are synchronized with each other using batch files that send them over the local network and with a portable hard drive, which is also encrypted.

There are other layers of encryption: my private documents and databases can only be opened in their applications with a password. I use TrueCrypt to make encrypted virtual drives within the hard drives of my computers, and I 'hide' my private files within them. I try to ensure that these files are never sent across my ISP's networks or into the 'cloud'. I do not mount these virtual drives except when I need them, and I do not leave them 'open' when I am surfing or otherwise using the Internet. If I had really sensitive (military-equivalent) data, I would keep it on a notebook computer that never accessed any network and had no USB ports.

Backing up hidden files to an online resource is clearly not a good idea, so I have to be responsible for making safe copies myself. I do not allow any access to my computers from computers outside my local network, although I recognize that this does not prevent others from maliciously entering my network and computers through crafted web pages, Trojans or key loggers. I just have to be alert and compute as safely as I can.

But the 'cloud' has many virtues. One is the excellent way in which many of Google's online resources are linked to each other: I can write a blog post in my Gmail account or in a Google Doc and publish it on my blog directly from either of these applications, as well as editing the blog directly in Blogger. I hope that other cloud resources will work as well as these, so that I need only one portal for many of the online files that I generate and use. Perhaps this will be a feature of Google's OS?

I conclude that I have to compute in a schizophrenic manner, with two sets of rules, one for public and one for sensitive data. I have to decide which type of data is in each file before I create it.

I believe that a local encrypted virtual drive should be available to every user, to contain sensitive data in hidden files. Add that strong recommendation to my earlier one: that every computer should have two hard drives, one of which is removable and is used to clone the system drive and to replace it when it dies, a fate (like taxes) to which all are subject. Plan ahead!

Christopher Spry
Wimbledon, London
Updated 15:37, 29 June 2010

Tuesday, August 15, 2006

My website is cramped

It is surprising, in this age when hard disk space is cheap and widely available, that Internet Service Providers (ISPs) limit the space available to their customers. For example, I have a community website, for which I pay nearly £100/year, and which limits me to 80 MB of total space. I should like to add still pictures, videos, sound files and so on, but as my web space is already full (in fact they kindly allowed me 10 MB of extra space), there is currently no possibility of doing this.

On another of my websites, I have arranged for searches and access to be directed through my broadband connection into my house, where they connect to a website that I run under Windows XP. There is no limit on the size of the website I can provide this way. Fortunately, this ISP does not charge for this useful service, so I do not need to purchase additional web space beyond the small amount they allow me to use as part of my broadband account.

I see that some ISPs are offering much more space than usual, but this does not seem to have stimulated the principal ISPs to increase theirs.

The major disadvantages of serving my web from my home computer are that the uplink speed is slow, only one web can be made available this way for each broadband connection, and the computer has to be on all the time. Many computers are not designed to run continually, and there may be power supply or hard disk problems ahead. Perhaps I should consider buying a dedicated server to serve my web from home. My first searches on the Internet for a suitable computer have not been very successful: for about £350 I can buy a server that has no monitor, keyboard or mouse. I suppose I could load Linux onto it and administer it from another computer. Suggestions about the best way to proceed would be gratefully received.

I'm sure there are many people like me who have websites that are cramped and expensive to run. For us, a computer that would run continually and silently from home would be a great step forward.

Christopher Spry
London, 15 August 2006

My new blogs

My blogs about the Natural World are at http://christopher-spry.blogspot.com/. Here, I plan to comment on computing issues that affect me as I work with and update several Intel/Windows PCs in Wimbledon, where the Internet connection is up to 3.5 Mb/sec, and in Senni, Wales, where the Internet is only available at up to 0.5 Mb/sec.


Christopher Spry
Wimbledon, London
15:19, 25 June 2010

Monday, December 05, 2005

Keeping software up to date and secure can be a nightmare

Today, I have been updating my web page that lists over 100 of the programs that I use. I have been trying to find and install updates for all of them, many of which could be security updates. It has taken all day. It was often a frustrating experience and it indicates to me that much of the software industry is still in its infancy in this area. This blog summarizes some of the problems I came across.

Some of the vendors did not provide access from their main pages to their software without trying to force me to buy products that I did not want (Adobe and Real Networks, for example). Others did not provide information on the latest version of their software until I had looked several pages deep into their sites, and then only in the name of the download itself. I often had to reinstall the whole program, and only after manually uninstalling the previous version. There were multiple reboots to contend with. Some vendors packaged several programs together, so that I had to retrieve all of them when I only wanted part. Some discussed updates without actually providing them ('vapourware', this used to be called). Some of the updates could only be installed if I manually uninstalled the previous version. QuickTime told me that my system could be unstable if I did uninstall the previous version, so I was not able to install the update I needed to deal with a problem running the program. That one, I will just have to do without for now.

Very few vendors email me when new versions are released, even though I have registered my copies with them (and used a key that locks the software to one computer) in the vain hope that they would. Fortunately, some have now added update options to the 'Help' tab in their software, but I have to look for them there. A few check for updates automatically and offer to download them. One of these 'clever' programs keeps telling me, incorrectly, to get an update. Some software has changed its name without making this clear. Many websites have altered their update pages without providing links to the new pages. Some vendors do not provide updates for programs that should have them, so the user has to buy a full version when he or she only wants an update.

It is a disgrace that the software industry is in such disarray when it comes to updates. The problem is not limited to applications: all the operating systems that I use are poor at this, as I know from hours spent with Windows, Linux, Solaris and Irix updates. It is rare for any vendor or software manufacturer to send me an email saying that they have an update. The principal exceptions are individuals with specialized software, but they can offer me updates that are cosmetic or deal only with minor issues that are not listed in the email, so I have to check their website first. Commercial software organizations are happy to send me advertising in emails, but this rarely contains update information and never reflects which software I am using, even though I registered it with them and the software can even check back to their website for other purposes. Some organizations require users to pay through their support sites to find out whether there is an update.

So, here I am, having spent another difficult day, struggling around the net to do something that ought to take place in the background, automatically, while my computers are idle.
Christopher Spry, 5 December 2005

Firefox v 1.5

Firefox v 1.5 from Mozilla.org

I have been using Internet Explorer for many years, but this weekend I transferred to Firefox 1.5. I am delighted with it. I am impressed with how well it works: it is fast, simple to configure and has a host of functions that I use regularly. I particularly like tabbed browsing and bookmarks. It reads RSS feeds and has a simple appearance that I like. I strongly recommend it to Windows users. Be aware that it will take a while for new users to find out all that it can do; I spent some hours configuring it and setting it up with my most useful bookmarks and settings, although my bookmarks from Internet Explorer were imported for me. I shall be interested to see whether the new version of Internet Explorer, which has been several years in preparation, will come up to these standards.

Christopher Spry

Wednesday, November 16, 2005

Blog software review

Three blog software products are reviewed at SitePoint. These are self-hosted, so they require some installation and maintenance. Blogger does it all for you.

Christopher Spry