Exposing Azure Blobs Through Forest Admin

Andrew Varnon
1 min read · Jan 6, 2021


This is a continuation of Exposing Azure Table Storage Through Forest Admin. After a month of use, I found that the data I was storing in the JsonData column in Table Storage was growing too large and needed to move into a blob. I still wanted the same editor experience through Forest Admin that I had before, so I wrote a wrapper around Blob Storage and routed the JSON data's CRUD operations through it.

1. Install @azure/storage-blob

npm install @azure/storage-blob

2. Create an Azure Blob Storage Service

const { BlobServiceClient } = require('@azure/storage-blob');
const { Readable } = require('stream');

// Resolve a block blob client for the given container and blob, using the
// connection string in the AZURE_STORAGE_CONNECTION_STRING environment variable.
const getBlockBlobClient = (containerName, blobName) => {
  const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
  const containerClient = blobServiceClient.getContainerClient(containerName);
  return containerClient.getBlockBlobClient(blobName);
};

const azureBlobStorageService = {
  deleteIfExistsAsync: async (containerName, blobName) => {
    await getBlockBlobClient(containerName, blobName).deleteIfExists();
  },
  downloadAsync: async (containerName, blobName) => {
    return getBlockBlobClient(containerName, blobName).downloadToBuffer();
  },
  existsAsync: async (containerName, blobName) => {
    return getBlockBlobClient(containerName, blobName).exists();
  },
  uploadAsync: async (containerName, blobName, bufferedContents) => {
    const readableStream = new Readable();
    readableStream._read = () => {}; // _read is required but can be a no-op
    readableStream.push(bufferedContents);
    readableStream.push(null); // signal end of stream
    await getBlockBlobClient(containerName, blobName).uploadStream(readableStream);
  },
};

module.exports = azureBlobStorageService;

3. Integrate with the existing routes

Getting the JSON:

if (await azureBlobStorageService.existsAsync(containerName, blobFileName)) {
  jsonData = (await azureBlobStorageService.downloadAsync(containerName, blobFileName)).toString();
}

Setting the JSON:

await azureBlobStorageService.uploadAsync(containerName, blobFileName, Buffer.from(jsonData));
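To test the route logic without an Azure account, the service can be swapped for an in-memory stand-in with the same method names. This is a sketch, not part of the original post; createInMemoryBlobService and roundTrip are hypothetical helpers, and containerName/blobFileName are assumed to come from the route context:

```javascript
// In-memory stand-in for azureBlobStorageService, mirroring its interface.
function createInMemoryBlobService() {
  const store = new Map();
  const key = (containerName, blobName) => `${containerName}/${blobName}`;
  return {
    existsAsync: async (c, b) => store.has(key(c, b)),
    downloadAsync: async (c, b) => store.get(key(c, b)),
    uploadAsync: async (c, b, buffer) => { store.set(key(c, b), buffer); },
    deleteIfExistsAsync: async (c, b) => { store.delete(key(c, b)); },
  };
}

// Set then get the JSON, mirroring the two route snippets above.
async function roundTrip(service, containerName, blobFileName, jsonData) {
  await service.uploadAsync(containerName, blobFileName, Buffer.from(jsonData));
  if (await service.existsAsync(containerName, blobFileName)) {
    return (await service.downloadAsync(containerName, blobFileName)).toString();
  }
  return undefined;
}
```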
