I thought I was going to describe how to work around Microsoft no longer supporting the automatic extraction of EXIF data within SharePoint Online libraries. However, while writing this article I discovered that my M365 tenant still supports the automatic extraction of EXIF data. As a result, the solution includes “duplicated” fields (columns): one set to allow SharePoint Online to do its thing, and a second set to demonstrate how to create a Cloud Flow to process the EXIF data.
What is EXIF Metadata?
EXIF, or Exif, stands for Exchangeable Image File, and it is a format used by digital cameras and smartphones to add metadata to image files, providing information about the images. This metadata can include details such as the image’s height and width, the camera’s make and model that captured the image, and can encompass settings like aperture and exposure, along with GPS coordinates for devices equipped with GPS capabilities.
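For context, EXIF data can also be inspected outside of SharePoint. The following is a minimal Python sketch (not part of the SharePoint solution) that assumes the Pillow library is installed and that a local file called photo.jpg exists; it simply prints the top-level EXIF tags and any GPS values.

```python
# Minimal sketch: read EXIF metadata (including GPS) from a photo using Pillow.
# Assumes Pillow is installed (pip install Pillow) and photo.jpg is a local file.
from PIL import Image, ExifTags

with Image.open("photo.jpg") as img:
    exif = img.getexif()                     # top-level EXIF tags
    for tag_id, value in exif.items():
        print(f"{ExifTags.TAGS.get(tag_id, tag_id)}: {value}")

    gps = exif.get_ifd(0x8825)               # GPS values live in their own IFD
    for tag_id, value in gps.items():
        print(f"GPS {ExifTags.GPSTAGS.get(tag_id, tag_id)}: {value}")
```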
SharePoint Online “used to have” built-in functionality that automatically extracted an image’s EXIF metadata and saved it to “special” columns, provided that you had previously manually added these fields to your library.
Background / Business Problem
In early 2021, a customer reported that their SharePoint “Photo” Library was no longer extracting and saving GPS coordinates for new photos when adding them to the library. During that period, many individuals had reported that Microsoft had removed this functionality from SharePoint Online. Consequently, I needed to devise a solution to handle this task.
Setting up SharePoint
The next few sections describe the creation of the SharePoint assets (site columns, content type, and library) required for the solution and assume that you already have a site collection. Feel free to skip these sections and return to them later if you wish to replicate the solution.
Create Site Columns
To create site columns, you will need to navigate to the ‘Site Settings’ page. One option is to click on the Cog -> Site Information -> View all site settings. An alternative is to browse directly to the following URL: https://tenant.sharepoint.com/sites/Site/_layouts/15/settings.aspx
Click on the ‘Site columns’ link and create the following site columns in a new group called ‘Sandpit Site Columns’.
TIP: Don’t forget to create each site column using its internal name and then rename it.
Internal Name | Description | Type | Other | Name |
---|---|---|---|---|
wic_System_GPS_Longitude | EXIF data automatically extracted by SharePoint’s metadata service. | Single line of text | | wic_System_GPS_Longitude |
wic_System_GPS_Latitude | EXIF data automatically extracted by SharePoint’s metadata service. | Single line of text | | wic_System_GPS_Latitude |
wic_System_GPS_Altitude | EXIF data automatically extracted by SharePoint’s metadata service. | Single line of text | | wic_System_GPS_Altitude |
Longitude | Data extracted from the photograph using Power Automate and the SharePoint REST API’s RenderListDataAsStream function. | Single line of text | | Longitude |
Latitude | Data extracted from the photograph using Power Automate and the SharePoint REST API’s RenderListDataAsStream function. | Single line of text | | Latitude |
Altitude | Data extracted from the photograph using Power Automate and the SharePoint REST API’s RenderListDataAsStream function. | Single line of text | | Altitude |
RawExifData | Data extracted from the photograph using Power Automate and the SharePoint REST API’s RenderListDataAsStream function. | Multiple lines of text | Allow unlimited length in document libraries: Yes | Raw Exif Data |
UpdateEXIFMetadata | Update the EXIF metadata fields when set to Yes. | Yes/No | Default: Yes | Update EXIF Metadata |
Create Content Type
As with the site columns, navigate to the Site Settings page, but this time click on the ‘Site content types‘ link.
Click on ‘+ Create content type‘ and enter the following information:
- Name: Photograph
- Description: Digital photograph including the EXIF (Exchangeable Image File) Metadata.
- Create a new category: Sandpit Content Types
- Parent category: Document Content Types
- Content type: Picture
Click ‘Create’.
Now add the site columns… click ‘+ Add site columns‘ -> ‘Add from existing site columns‘:
- Select site columns from existing category: Sandpit Site Columns.
- Add the columns, click ‘Save’, and then update the order.
Create a Picture library
Unlike ‘Classic’ SharePoint sites, ‘Modern’ SharePoint Online sites do not have a dedicated Images library template. You could either utilize the ‘Site Assets’ library or, as in this example, create a custom Picture library.
Navigate to ‘Site Contents‘, select ‘+ New‘ -> ‘Document Library‘, and enter the following information:
- Name: Photographs
- Description: Library of digital photographs including their EXIF (Exchangeable Image File) Metadata where available.
Once created, navigate to the library’s ‘Settings‘ -> ‘Advanced settings‘ and set ‘Allow management of content types?‘ to Yes, and click OK. Next:
- Click on ‘Add from existing site content types‘ and add the ‘Photograph‘ content type.
- Next, click on ‘Document‘ content type and delete it.
- Optional: You can then navigate back to ‘Advanced Settings‘ and set ‘Allow management of content types?‘ to No.
Other enhancements I like to make to the library are to:
- Rename the default View to ‘All Photographs‘ and add our new columns.
- Create a new View using the ‘Gallery‘ template and call it ‘Photo Gallery‘
- Set the ‘Photo Gallery‘ View to be the default for the library.
- Create a new View based upon the ‘All Photographs’ View, call it ‘zzzCloudFlow‘, and add all of the columns listed in the table above. The ‘zzzCloudFlow’ View will be used by the Cloud Flow when retrieving data.
Now that we have a basic Picture library, we can focus on extracting an image’s EXIF attributes.
SharePoint Permissions
The service account that your Cloud Flow uses to connect to SharePoint will require the Design permission level on the Photographs library so that it can override files that have been checked out by others.
One option is to navigate to the library’s settings page and:
- Click on ‘Permissions for this document library’ -> ‘Stop Inheriting Permissions’.
- Click on ‘Grant Permissions’, enter your service account’s name:
- Click on ‘Show Options’
- Unselect: Send an email invitation
- Select a permission level: Design
- Click on Share
Create a Power Platform Solution
Before creating a Power Automate Cloud Flow, it is “good practice” to create a Power Platform Solution. Open the Power Apps or Power Automate maker portal, navigate to the Solutions page, and click ‘+ New solution‘.
- Give your solution a name, for example:
- Display name: Sandpit: Manage Image Attributes
- Name: SandpitManageImageAttributes
- Select a Publisher. If you don’t already have a Publishing Profile, see my ‘How to create a Power Platform Publishing Profile‘ article.
- Click Create
Next, add the following objects:
Internal Name | Description | Type |
---|---|---|
Sandpit-MIA-SharePoint | Connection reference for the ‘Sandpit: Manage Image Attributes’ solution. | Connection Reference |
Sandpit-MIA-SPSite | The URL of the site collection. | Environment Variable |
Sandpit-MIA-SPLibrary | The GUID of the Photographs (Image) library. | Environment Variable |
Sandpit-MIA-SPView | The GUID for the ‘zzzCloudFlow’ view. | Environment Variable |
Sandpit-MIA-Tenant | The tenant’s name used within the SharePoint URL. | Environment Variable |
Create a Power Automate Cloud Flow
Open your solution and create a new Cloud Flow:
- Click on: + New -> Automation -> Cloud flow -> Automated.
- Flow name: Apply EXIF Image Attributes
- Choose your flow trigger: ‘SharePoint – When a file is created or modified (properties only)‘ and set the:
- Connection reference to: Sandpit-MIA-SharePoint
- Trigger conditions to: @equals(triggerBody()?['UpdateEXIFMetadata'], true)
- Site address to: Any site – we will update this shortly
- Library name to: Documents – we will update this shortly
- Add a ‘New step’ called Terminate and set the:
- Name to: Terminate | Succeeded
- Status to: Succeeded
- Save the flow
Your Flow should look similar to the image below:
Having saved the Flow, it becomes “solution-aware,” which essentially means that you can select the environment variables created earlier. After saving the Flow, click the ‘<- Go back to previous page‘ button to exit the edit screen, and then re-edit the Flow.
Now that the Flow is solution-aware, the trigger properties can be updated. For example, select the Site Address property -> Enter a custom value -> select: Sandpit-MIA-SPSite.
Follow the same steps for the ‘Library Name’, this time selecting: Sandpit-MIA-SPLibrary, and for the ‘Limit Columns by View,’ choose: Sandpit-MIA-SPView, and save the flow.
The end results should look similar to the image below:
TIP: I like to add ‘notes’ to my triggers and steps: 1) to describe any hidden functions or intentions that the step’s name cannot convey, and 2) to save expressions to make support easier.
The Flow can be tested at any point simply by adding an image to the Photographs library to ensure the Flow is triggering correctly.
Adding the logic to the Cloud Flow
Just below the Cloud Flow’s trigger, add an Array variable called ‘Initialize variable | MetaInfoValues‘ and a Delay step called ‘Delay | 20 seconds‘. Hopefully the settings are self-explanatory.
The MetaInfoValues array will be used later on to store the EXIF data, and the Delay gives SharePoint time to finish saving the file when it is first added to the library.
Parallel Branches
Add a ‘parallel branch’ to the Cloud Flow; while this may not speed up the Flow, it does make the logic easier to read.
One branch will get the ‘List Item Entity Type Full Name‘ and the other branch will retrieve the ‘Render List Data as Stream‘.
List Item Entity Type Full Name
Add a scope to the left branch called ‘Scope | Get List Item Entity Type Full Name‘ and then add the following steps:
Type | Name |
---|---|
Send an HTTP request to SharePoint | Send an HTTP request to SharePoint | GET | ListItemEntityTypeFullName |
Parse JSON | Parse JSON | ListItemEntityTypeFullName |
Compose | Compose | ListItemEntityTypeFullName |
Properties for the ‘Send an HTTP request to SharePoint | GET | ListItemEntityTypeFullName‘ step:
Name | Value |
---|---|
Site Address | Sandpit-MIA-SPSite |
Method | GET |
Uri | /_api/web/lists(guid'@{parameters('Sandpit-MIA-SPLibrary (sjl_SandpitMIASPLibrary)')}')/?$select=ListItemEntityTypeFullName |
Headers #1 | Accept: application/json;odata=verbose |
Headers #2 | Content-Type: application/json |
Properties for the ‘Parse JSON | ListItemEntityTypeFullName‘ step:
Name | Value |
---|---|
Content | @{body('Send_an_HTTP_request_to_SharePoint_|_GET_|_ListItemEntityTypeFullName')} |
Schema | { "type": "object", "properties": { "d": { "type": "object", "properties": { "__metadata": { "type": "object", "properties": { "id": { "type": "string" }, "uri": { "type": "string" }, "etag": { "type": "string" }, "type": { "type": "string" } } }, "ListItemEntityTypeFullName": { "type": "string" } } } } } |
Properties for the ‘Compose | ListItemEntityTypeFullName‘ step:
Name | Value |
---|---|
Note | { "type" : "@{body('Parse_JSON_|_ListItemEntityTypeFullName')?['d']?['__metadata']?['type']}" } |
Inputs | { "type": "@{body('Parse_JSON_|_ListItemEntityTypeFullName')?['d']?['ListItemEntityTypeFullName']}" } |
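For reference, the GET call above can also be exercised outside of Power Automate. Below is a hedged Python sketch using the requests library; the site URL, library GUID, and bearer token are placeholders, and this is not how the Flow’s SharePoint connector authenticates under the hood.

```python
# Sketch only: the REST call the Flow makes to fetch ListItemEntityTypeFullName.
# Assumes the 'requests' package is installed; SITE_URL, LIBRARY_GUID and
# ACCESS_TOKEN are placeholders, not values from the article.
import requests

SITE_URL = "https://tenant.sharepoint.com/sites/Site"        # Sandpit-MIA-SPSite
LIBRARY_GUID = "00000000-0000-0000-0000-000000000000"        # Sandpit-MIA-SPLibrary
ACCESS_TOKEN = "<bearer token>"                               # placeholder

url = (f"{SITE_URL}/_api/web/lists(guid'{LIBRARY_GUID}')/"
       "?$select=ListItemEntityTypeFullName")
headers = {
    "Accept": "application/json;odata=verbose",
    "Content-Type": "application/json",
    "Authorization": f"Bearer {ACCESS_TOKEN}",
}

response = requests.get(url, headers=headers)
response.raise_for_status()

# The same value the 'Parse JSON | ListItemEntityTypeFullName' step extracts.
entity_type = response.json()["d"]["ListItemEntityTypeFullName"]
print(entity_type)
```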
Render List Data as Stream
Add a scope to the right branch called ‘Scope | Get Render List Data As Stream‘, and add the following steps to it:
Type | Name |
---|---|
Compose | Compose | REST Body Inner | CAML Query |
Compose | Compose | REST Body JSON |
Send an HTTP request to SharePoint | Send an HTTP request to SharePoint | POST | RenderListDataAsStream |
Parse JSON | Parse JSON | RenderListDataAsStream Body |
Properties for the ‘Compose | REST Body Inner | CAML Query‘ step:
Name | Value |
---|---|
Note | CAML Query used to select the relevant fields and filter the data by Item ID and UpdateEXIFMetadata. Make sure you use single quotes to avoid issues when combining with the JSON string. |
Input | <View Scope='RecursiveAll'> <ViewFields> <FieldRef Name='ID' /> <FieldRef Name='FileLeafRef' /> <FieldRef Name='FileDirRef' /> <FieldRef Name='FileRef' /> <FieldRef Name='FileLeafRef.Name' /> <FieldRef Name='FileLeafRef.Suffix' /> <FieldRef Name='File_x0020_Type' /> <FieldRef Name='MetaInfo' /> <FieldRef Name='MetaInfo.' /> <FieldRef Name='MediaServiceMetadata' /> <FieldRef Name='MediaServiceFastMetadata' /> <FieldRef Name='MediaServiceDateTaken' /> <FieldRef Name='MediaServiceAutoTags' /> <FieldRef Name='MediaServiceLocation' /> <FieldRef Name='MediaServiceOCR' /> <FieldRef Name='ParentUniqueId' /> <FieldRef Name='wic_System_GPS_Latitude' /> <FieldRef Name='wic_System_GPS_Longitude' /> <FieldRef Name='wic_System_GPS_Altitude' /> </ViewFields> <Query> <Where> <And> <Eq> <FieldRef Name='ID' /> <Value Type='Number'>@{triggerOutputs()?['body/ID']}</Value> </Eq> <Eq> <FieldRef Name='UpdateEXIFMetadata' /> <Value Type='Boolean'>1</Value> </Eq> </And> </Where> </Query> <RowLimit Paged='TRUE'>5000</RowLimit> </View> |
Properties for the ‘Compose | REST Body JSON‘ step:
Name | Value |
---|---|
Note | We need to remove the CAML/XML formatting before embedding in the JSON string: @{uriComponentToString(replace(replace(uriComponent(outputs('Compose_|_REST_Body_Inner_|_CAML_Query')), '%0A', ''), '%09',''))} |
Input | { "parameters": { "RenderOptions": 4103, "ViewXml": "@{uriComponentToString(replace(replace(uriComponent(outputs('Compose_|_REST_Body_Inner_|_CAML_Query')), '%0A', ''), '%09',''))}" } } |
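If it helps to see what that expression is doing, here is a rough Python equivalent (illustrative only, using a hypothetical, trimmed CAML string): URL-encode the CAML, strip the encoded newline (%0A) and tab (%09) characters, then decode it again so the XML sits on a single line inside the JSON body.

```python
# Rough Python equivalent of the Flow expression:
# uriComponentToString(replace(replace(uriComponent(...), '%0A', ''), '%09', ''))
from urllib.parse import quote, unquote

# Hypothetical, trimmed CAML used purely for illustration.
caml = "<View Scope='RecursiveAll'>\n\t<ViewFields>\n\t\t<FieldRef Name='ID' />\n\t</ViewFields>\n</View>"

encoded = quote(caml, safe="")                       # uriComponent(...)
flattened = unquote(encoded.replace("%0A", "")       # strip encoded newlines
                           .replace("%09", ""))      # strip encoded tabs
print(flattened)  # the CAML collapsed onto one line, safe to embed in the JSON body
```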
Properties for the ‘Send an HTTP request to SharePoint | POST | RenderListDataAsStream‘ step:
Name | Value |
---|---|
Note | "RenderOptions": 2 // ListData |
Site Address | Sandpit-MIA-SPSite |
Method | POST |
Uri | _api/web/lists(guid'@{parameters('Sandpit-MIA-SPLibrary (sjl_SandpitMIASPLibrary)')}')/RenderListDataAsStream |
Headers #1 | Accept: application/json;odata=verbose |
Headers #2 | Content-Type: application/json |
Body | @{outputs('Compose_|_REST_Body_JSON')} |
Properties for the ‘Parse JSON | RenderListDataAsStream Body‘ step:
Name | Value |
---|---|
Content | @{body('Send_an_HTTP_request_to_SharePoint_|_POST_|_RenderListDataAsStream')} |
Schema | { "type": "object", "properties": { "wpq": { "type": "string" }, "Templates": { "type": "object", "properties": {} }, "ListData": { "type": "object", "properties": { "Row": { "type": "array", "items": { "type": "object", "properties": { "ID": { "type": "string" }, "MetaInfo.": { "type": "string" } }, "required": [ "ID", "MetaInfo." ] } }, "FirstRow": { "type": "integer" }, "FolderPermissions": { "type": "string" }, "LastRow": { "type": "integer" }, "RowLimit": { "type": "integer" }, "FilterLink": { "type": "string" }, "ForceNoHierarchy": { "type": "string" }, "HierarchyHasIndention": { "type": "string" }, "CurrentFolderSpItemUrl": { "type": "string" } } } } } |
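Outside of the Flow, the equivalent RenderListDataAsStream call can be tested with a short Python sketch like the one below. It is illustrative only: the site URL, library GUID, item ID, and bearer token are placeholders, and the ViewXml has been trimmed to the two fields the later steps actually use.

```python
# Sketch only: calling RenderListDataAsStream the way the Flow does.
# Assumes 'requests' is installed; all identifiers below are placeholders.
import requests

SITE_URL = "https://tenant.sharepoint.com/sites/Site"
LIBRARY_GUID = "00000000-0000-0000-0000-000000000000"
ITEM_ID = 1
ACCESS_TOKEN = "<bearer token>"

# Trimmed ViewXml for illustration; the Flow uses the full field list above.
view_xml = (
    "<View Scope='RecursiveAll'>"
    "<ViewFields><FieldRef Name='ID' /><FieldRef Name='MetaInfo.' /></ViewFields>"
    "<Query><Where><Eq><FieldRef Name='ID' />"
    f"<Value Type='Number'>{ITEM_ID}</Value></Eq></Where></Query>"
    "<RowLimit Paged='TRUE'>5000</RowLimit></View>"
)

url = f"{SITE_URL}/_api/web/lists(guid'{LIBRARY_GUID}')/RenderListDataAsStream"
headers = {
    "Accept": "application/json;odata=verbose",
    "Content-Type": "application/json",
    "Authorization": f"Bearer {ACCESS_TOKEN}",
}
body = {"parameters": {"RenderOptions": 4103, "ViewXml": view_xml}}

response = requests.post(url, headers=headers, json=body)
response.raise_for_status()

rows = response.json()["ListData"]["Row"]        # shape matches the Parse JSON schema
if rows:
    print(rows[0].get("MetaInfo."))              # raw property bag, including the EXIF keys
```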
Join the Branches
Join the branches, add a scope called ‘Scope | Process MetaInfo Data‘, and add the following six steps:
1. ‘Compose | MetaInfo‘:
Name | Value |
---|---|
Note | @{first(array(first(array(body('Parse_JSON_|_RenderListDataAsStream_Body')?['ListData']?['Row']))?['MetaInfo']))?['lookupValue']} |
Input | @{first(array(body('Parse_JSON_|_RenderListDataAsStream_Body')?['ListData']?['Row']))?['MetaInfo.']} |
2. ‘Set variable | MetaInfoValues‘:
Name | Value |
---|---|
Note | @{split(replace(replace(replace(outputs('Compose_|_MetaInfo'), decodeUriComponent('%09'), ' '), decodeUriComponent('%0D%0A'), decodeUriComponent('%0d')), decodeUriComponent('%0A'), decodeUriComponent('%0d')), decodeUriComponent('%0d'))} |
Name | MetaInfoValues |
Value | @{split(replace(replace(replace(outputs('Compose_|_MetaInfo'), decodeUriComponent('%09'), ' '), decodeUriComponent('%0D%0A'), decodeUriComponent('%0d')), decodeUriComponent('%0A'), decodeUriComponent('%0d')), decodeUriComponent('%0d'))} |
3. ‘Filter array | wic System GPS Latitude‘:
Name | Value |
---|---|
Note | @startsWith(item(), 'wic_System_GPS_Latitude:') |
From | @variables('MetaInfoValues') |
By | @startsWith(item(), 'wic_System_GPS_Latitude:') |
4. ‘Filter array | wic System GPS Longitude‘:
Name | Value |
---|---|
Note | @startsWith(item(), 'wic_System_GPS_Longitude:') |
From | @variables('MetaInfoValues') |
By | @startsWith(item(), 'wic_System_GPS_Longitude:') |
5. ‘Filter array | wic System GPS Altitude‘:
Name | Value |
---|---|
Note | @startsWith(item(), 'wic_System_GPS_Altitude:') |
From | @variables('MetaInfoValues') |
By | @startsWith(item(), 'wic_System_GPS_Altitude:') |
6. ‘Compose | Image Columns Body | Replace Linefeed And Tab Chars‘:
Name | Value |
---|---|
Note | Example expression: @{replace(replace(coalesce(first(body('Filter_array_|_wic_System_GPS_Latitude')), last(body('Filter_array_|_wic_System_GPS_Latitude')), ''),'wic_System_GPS_Latitude:',''), 'SW|', '')} |
Inputs | { "__metadata": @{outputs('Compose_|_ListItemEntityTypeFullName')}, "UpdateEXIFMetadata": "false", "Latitude": "@{replace(replace(coalesce(first(body('Filter_array_|_wic_System_GPS_Latitude')), last(body('Filter_array_|_wic_System_GPS_Latitude')), ''),'wic_System_GPS_Latitude:',''), 'SW|', '')}", "Longitude": "@{replace(replace(coalesce(first(body('Filter_array_|_wic_System_GPS_Longitude')), last(body('Filter_array_|_wic_System_GPS_Longitude')), ''),'wic_System_GPS_Longitude:',''), 'SW|', '')}", "Altitude": "@{replace(replace(coalesce(first(body('Filter_array_|_wic_System_GPS_Altitude')), last(body('Filter_array_|_wic_System_GPS_Altitude')), ''),'wic_System_GPS_Altitude:',''), 'SW|', '')}", "RawExifData": "@{variables('MetaInfoValues')}" } |
The result will look similar to the image below:
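Before moving on, here is a plain Python sketch of what the ‘Process MetaInfo Data’ scope is doing: normalise line endings, split the MetaInfo property bag into lines, keep the wic_System_GPS_* entries, and strip the ‘key:SW|’ prefixes to leave the values. The meta_info string shown is a made-up example; the real value comes from the ‘MetaInfo.’ field returned by RenderListDataAsStream.

```python
# Sketch of the 'Process MetaInfo Data' scope logic in plain Python.
meta_info = (
    "vti_title:SW|Holiday photo\r\n"
    "wic_System_GPS_Latitude:SW|51.5007\r\n"
    "wic_System_GPS_Longitude:SW|-0.1246\r\n"
    "wic_System_GPS_Altitude:SW|35.0\r\n"
)   # example value only; the real string comes from the 'MetaInfo.' field

# Mirror of the Set variable expression: tabs -> spaces, CRLF/LF -> CR, split on CR.
lines = (
    meta_info.replace("\t", " ")
    .replace("\r\n", "\r")
    .replace("\n", "\r")
    .split("\r")
)

def extract(prefix: str) -> str:
    """Mirror of the Filter array steps plus the replace/coalesce expressions."""
    matches = [line for line in lines if line.startswith(prefix + ":")]
    value = matches[0] if matches else ""
    return value.replace(prefix + ":", "").replace("SW|", "")

print(extract("wic_System_GPS_Latitude"))    # 51.5007
print(extract("wic_System_GPS_Longitude"))   # -0.1246
print(extract("wic_System_GPS_Altitude"))    # 35.0
```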
File Check-out
Add a Condition step called ‘Condition | IsCheckedOut IS EQUAL TO True‘ and in the “If yes” branch add a ‘Discard check out’ step called ‘Discard check out | Photograph‘:
NOTE: You could change this logic to send the initiator an email instructing them to check the image/file in and abort the Flow.
Now that we know the image is not checked out, we will check the image out. Add a step called ‘Check out file | Photograph‘:
Update the File’s Properties
And now to update the Photograph’s properties. Add a step ‘Send an HTTP request to SharePoint’ called ‘Send an HTTP request to SharePoint | PATCH | Photograph Properties‘:
Name | Value |
---|---|
Note | EXAMPLE: {"__metadata":@{outputs('Compose_|_ListItemEntityTypeFullName')},"Subject":"Testing 456","UpdateEXIFMetadata":false} |
Site Address | Sandpit-MIA-SPSite |
Method | PATCH |
Uri | /_api/web/lists(guid'@{parameters('Sandpit-MIA-SPLibrary (sjl_SandpitMIASPLibrary)')}')/items(@{triggerOutputs()?['body/ID']}) |
Header #1 | Accept: application/json;odata=verbose |
Header #2 | Content-Type: application/json;odata=verbose |
Header #3 | If-Match: * |
Header #4 | X-HTTP-Method: MERGE |
Body | @{outputs('Compose_|_Image_Columns_Body_|_Replace_Linefeed_And_Tab_Chars')} |
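For completeness, the same MERGE-style update can be reproduced outside Power Automate. The Python sketch below is an assumption-laden illustration: the site URL, library GUID, item ID, bearer token, entity type name, and GPS values are all placeholders.

```python
# Sketch only: the item update the Flow performs using the X-HTTP-Method: MERGE header.
# Assumes 'requests' is installed; every value below is a placeholder.
import requests

SITE_URL = "https://tenant.sharepoint.com/sites/Site"
LIBRARY_GUID = "00000000-0000-0000-0000-000000000000"
ITEM_ID = 1
ACCESS_TOKEN = "<bearer token>"
entity_type = "SP.Data.PhotographsItem"      # placeholder ListItemEntityTypeFullName

url = f"{SITE_URL}/_api/web/lists(guid'{LIBRARY_GUID}')/items({ITEM_ID})"
headers = {
    "Accept": "application/json;odata=verbose",
    "Content-Type": "application/json;odata=verbose",
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "If-Match": "*",
    "X-HTTP-Method": "MERGE",                # raw REST uses POST + MERGE for updates
}
body = {
    "__metadata": {"type": entity_type},
    "Latitude": "51.5007",                   # placeholder values
    "Longitude": "-0.1246",
    "Altitude": "35.0",
    "UpdateEXIFMetadata": False,             # stop the Flow re-triggering itself
}

response = requests.post(url, headers=headers, json=body)
response.raise_for_status()                  # a successful MERGE returns 204 No Content
```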
File Check-in
Lastly, we will check the file back in. Add a step called ‘Check in file | Photograph‘:
Flow Overview
After adding all these steps, your flow should look like the following image.
Tips for Testing
Searching the web can provide a wealth of images to test for EXIF data. However, from experience, I would recommend testing the Flow with images or photographs created using your organisation’s cameras or devices, so that you can confirm:
- The camera is generating the data
- The data matches the format expected by the Flow
Setting the ‘Update EXIF Metadata’ field’s default value to Yes will ensure that files saved to the Photographs library are processed by the Cloud Flow.
If, while waiting for the Flow to run, one or more of the ‘wic_System_GPS…’ fields are populated, this may indicate that Microsoft’s EXIF data service is running within your tenant.
Assuming your photograph or image has EXIF data, once the Flow has run you should see one or more of the fields (columns) populated with data, for example: Longitude, Latitude, Altitude, and ‘Raw Exif Data’.
Not all cameras or devices are the same. The ‘Raw Exif Data’ field (column) can be used to review the data that is available and to extend the solution.
Conclusion
SharePoint Online may have retained the ability to process and save EXIF data for photographs and images to specially named custom fields. However, this article demonstrates how you can discover the field names if you choose to use the OOTB (out-of-the-box) functionality, and how to create an alternative solution using Power Automate that could be adapted for Azure Logic Apps.
There are many ways in which this could be turned into an enterprise-grade solution. Here are two examples:
- Incorporate SharePoint Search, using managed properties and refinable strings with the PnP Search web parts, to present a searchable catalogue of photographs.
- Create a Canvas App incorporating a maps service, such as Bing or Google Maps, to filter and display the images by location.
The ‘Sandpit: Manage Image Attributes‘ Power Platform solution is available for download from my GitHub repository.
If you found this article useful, please consider liking and sharing it or contacting me via LinkedIn or the Contact Me form.