Refactor code to upload images to Firebase Storage and push URLs to JSON [on hold]
I have been trying for days to refactor this code so that it uploads each image to Firebase Storage and then, once complete, saves a JSON file containing each of the Storage URLs. In its current form I can upload a small number of files to Storage, but the function returns before the URLs have been obtained, so the JSON file is empty. The folder on my disk from which the files are uploaded contains almost 1000 images, which is why I can't upload them and obtain the URLs manually. How can I alter this so that I end up with a JSON file containing each of the URLs?
var fs = require('fs')
const { Storage } = require('@google-cloud/storage')
const projectId = 'PROJECT_ID'

// Creates a client
const storage = new Storage({
  projectId: projectId,
  keyFilename: 'auth.json'
})

// Reference the bucket
var bucket = storage.bucket('BUCKET_NAME')

// This reads the folder where the images are stored on my hard disk
const folderContainingImages = 'FOLDER_PATH'

async function iterate (filePath, file, array) {
  await bucket.upload(filePath, (err, file) => {
    if (err) { return console.error(err) }
    let filename = file.metadata.name
    let publicUrl = `https://firebasestorage.googleapis.com/v0/b/${projectId}.appspot.com/o/${filename}?alt=media`
    console.log(`${publicUrl},`)
    array.push(publicUrl)
  })
}

async function uploadImage () {
  async function fileReadWrite () {
    const storageURLsArray =
      await fs.readdir(folderContainingImages, (err, files) => {
        if (err) {
          console.error(`Could not read the directory`, err)
        }
        files.forEach(function (file) {
          var filePath = `${folderContainingImages}/` + file
          // Upload a local file to a new file to be created in the bucket
          iterate(filePath, file, storageURLsArray)
        })
      })
    return storageURLsArray
  }
  const data = await fileReadWrite()
  return data
}

uploadImage().then((value) => {
  fs.writeFile(
    './jsonData.json',
    JSON.stringify(value, null, 2),
    (err) => err ? console.error('error data not written', err) : console.log('Data written!')
  )
})
javascript node.js async-await firebase google-cloud-platform
put on hold as off-topic by 200_success, vnp, Jamal♦ 2 days ago
This question appears to be off-topic. The users who voted to close gave this specific reason:
- "Code not implemented or not working as intended: Code Review is a community where programmers peer-review your working code to address issues such as security, maintainability, performance, and scalability. We require that the code be working correctly, to the best of the author's knowledge, before proceeding with a review." – 200_success, vnp, Jamal
If this question can be reworded to fit the rules in the help center, please edit the question.
This sounds like a request to rewrite the code to change its behavior, which is beyond the scope of Code Review.
– 200_success
2 days ago
asked Nov 14 at 19:06, edited Nov 14 at 19:26 by pho_pho (1,065)