Firebase-tools: Support mono-repos in deployment

Created on 31 Jan 2018  ·  47 Comments  ·  Source: firebase/firebase-tools

See: https://github.com/firebase/firebase-functions/issues/172

Version info

3.17.3

Steps to reproduce

Expected behavior

Actual behavior

feature request

Most helpful comment

+1 on this. Cloud functions will usually need to share some common code (e.g. interfaces) with other apps, and a nice way to deal with this is a monorepo (e.g. lerna) or using symlinks directly. I took the latter approach and solved it by creating some scripts. The concept is quite simple: I copy what's needed into the functions directory before the deploy and remove it afterwards.

Here's how I did it with this directory structure:
```
- root/
  | - .firebaserc
  | - firebase.json
  | - ...
- functions/
  | - src/
  | - package.json
  | - pre-deploy.js
  | - post-deploy.js
  | - ...
- shared/
  | - src/
  | - package.json
  | - ...
```

content of `pre-deploy.js`

```
const fs = require("fs-extra");

const packageJsonPath = "./package.json";
const packageJson = require(packageJsonPath);

(async () => {
  await fs.remove("./shared");
  await fs.copy("../shared", "./shared");

  packageJson.dependencies["@project/shared"] = "file:./shared";

  await fs.writeFile(packageJsonPath, JSON.stringify(packageJson, null, 2));
})();
```

content of `post-deploy.js`

```
const packageJsonPath = "./package.json";
const packageJson = require(packageJsonPath);
const fs = require("fs-extra");

(async () => {
  await fs.remove("./shared");

  packageJson.dependencies["@project/shared"] = "file:../shared";

  await fs.writeFile(packageJsonPath, JSON.stringify(packageJson, null, 2));
})();
```

Then update `firebase.json` like this (add the build script if you need it; I build earlier in my pipeline):

```
"functions": {
  "source": "functions",
  "predeploy": [
    "npm --prefix \"$RESOURCE_DIR\" run pre-deploy"
  ],
  "postdeploy": [
    "npm --prefix \"$RESOURCE_DIR\" run post-deploy"
  ]
},
```

If you run the build, inside the dist or lib directory you should now have two siblings: functions and shared (this happens because of the shared dependency). Make sure to update the functions package.json `main` to point to lib/functions/src/index.js to make the deploy work.
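
For illustration, the relevant part of the functions package.json after the pre-deploy step might look like this (a sketch based on the paths above, not the author's actual file):

```
{
  "main": "lib/functions/src/index.js",
  "dependencies": {
    "@project/shared": "file:./shared"
  }
}
```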

For now it's solved, but that's a workaround, not a solution. I think firebase-tools should really support symlinks.

All 47 comments

This seems to work in my setup, i.e. deploy picks up packages from the root node_modules, even though package.json for that is located under the api/ workspace (I've used a different folder name instead of functions/). Is there anything else that needs fixing here?

EDIT: Moreover, I copy package.json into api/dist to be used by deploy.

// firebase.json
  ...
  "functions": {
    "source": "api/dist"
  },
  ...

So, 2 levels of nesting still resolve the root node_modules successfully.

@dinvlad could you share a repo?

@orouz unfortunately not yet, it's closed source for now.

Has anyone managed to tackle this problem? Sharing a simple example project would be very useful.

@audkar Currently I just use lerna.js.org and its `run` command to execute an npm script in each subfolder, with this folder structure:

- service1/
|  - .firebaserc
|  - firebase.json
- service2/
|  - .firebaserc
|  - firebase.json
- app1/
|  - .firebaserc
|  - firebase.json
- app2/
|  - .firebaserc
|  - firebase.json
- firestore/
|  - firestore.rules
|  - firestore.indexes.json
- etc...

Ensuring the firebase.json files for each service don't stomp on one another is left up to the user. Simple conventions of using function groups and multi-site name targeting mean this is solved for Cloud Functions and Hosting. I still haven't got a solution for Firestore/GCS rules, though splitting them up may not be ideal...

discussed here previously - https://github.com/firebase/firebase-tools/issues/1116

@jthegedus thank you for your reply. But I think the issue in this ticket is different. I am trying to use yarn workspaces, and it seems that firebase-tools doesn't pick up symlinked dependencies when uploading functions.

Ah fair enough, I've avoided that rabbit hole myself

Could you elaborate what the issue is? As mentioned above, I just use bare Yarn with api and app workspaces in it, and I build them using `yarn workspace api build && yarn workspace app build` (with a build script specific to each workspace). The build scripts:
1) compile the TS code with outDir into api/dist and app/dist respectively
2) copy the corresponding package.json files into the dist directories
3) copy yarn.lock from the _root_ folder into the dist directories
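
For illustration, a minimal sketch of what such a build script could look like in the api workspace's package.json (the paths and tsc flags are my assumptions, not the author's actual scripts):

```
{
  "scripts": {
    "build": "tsc --outDir dist && cp package.json dist/ && cp ../yarn.lock dist/"
  }
}
```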

Then I just run yarn firebase deploy from the _root_ folder, and it picks up both api/dist and app/dist without any hiccups. My firebase.json looks like

  "functions": {
    "source": "api/dist"
  },
  "hosting": {
    "public": "app/dist",

Unfortunately, I still can’t share the full code, but this setup is all that matters, afaik.

Also, I might be wrong but I think the firebase deploy script doesn’t actually use your node_modules directory. I think it just picks up the code, package.json, and yarn.lock from the dist directories, and does the rest.

That's true. The default value of "functions.ignore" in firebase.json is ["node_modules"], so it's not uploaded. I believe you can override that though if you want to ship up some local modules.
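
For example, a sketch of what that override could look like (an explicit "ignore" replaces the default list, so node_modules would then be uploaded):

```
"functions": {
  "source": "functions",
  "ignore": []
},
```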


@dinvlad Yes, it requires the package.json and whichever lock file you use as it installs deps in the cloud post deployment.

I believe the scenario originally outlined in the other issue was using a shared package within the workspace and some issues with scope-hoisting. As I was not using yarn this way I can only speculate from what I have read there.

@samtstern @jthegedus thanks, good to know!

It seems we are all talking about different problems. I will try to describe the yarn workspaces problem.

Problematic project

project layout

- utilities/
|  - package.json
- functions/
|  - package.json
- package.json

_./package.json_

{
  "private": true,
  "workspaces": ["functions", "utilities"]
}

_functions/package.json_

{
  <...>
  "dependencies": {
    "utilities": "1.0.0",
    <...>
  }
}

Problem

Error during function deployment:

Deployment error.
Build failed: {"error": {"canonicalCode": "INVALID_ARGUMENT", "errorMessage": "`gen_package_lock` had stderr output:\nnpm WARN deprecated [email protected]: use String.prototype.padStart()\nnpm ERR! code E404\nnpm ERR! 404 Not Found: [email protected]\n\nnpm ERR! A complete log of this run can be found in:\nnpm ERR!     /builder/home/.npm/_logs/2019-06-18T07_10_42_472Z-debug.log\n\nerror: `gen_package_lock` returned code: 1", "errorType": "InternalError", "errorId": "1971BEF9"}}

Functions work fine locally on the emulator.

Solutions tried

Uploading node_modules (using functions.ignore in _firebase.json_). The result is the same.

My guess is that it is because utilities is created as a symlink in _node_modules_: node_modules/utilities -> ../../utilities

Could it be that firebase-tools doesn't include the content of symlinked modules when uploading (no dereferencing)?

Sorry, could you clarify which folder your firebase.json lives in (and show its configuration section for functions)?

_firebase.json_ was in the root folder. The configuration was standard. Something like this:

  "functions": {
    "predeploy": [
      "yarn --cwd \"$RESOURCE_DIR\" run lint",
      "yarn --cwd \"$RESOURCE_DIR\" run build"
    ],
    "source": "functions",
    "ignore": []
  },
  <...>

Everything was deployed as expected (including _node_modules_) except node_modules/utilities, which is a symlink.


I managed to work around this issue by writing a few scripts which (a rough sketch follows below):

  • create a package for each workspace (yarn pack), e.g. this creates _utilities.tgz_
  • move all output to a specific dir
  • modify _package.json_ to use the tgz files for workspace dependencies, e.g. dependencies { "utilities": "1.0.0" -> dependencies { "utilities": "file:./utilities.tgz"
  • deploy that dir to firebase

output dir content before upload:

- dist
|  - lib
|  |  - index.js
|  - utilities.tgz
|  - package.json <---------- This is modified to use *.tgz for workspaces
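
A rough sketch of what such a script could look like (the package names, paths, and dist layout are assumptions based on the example above, not the actual scripts):

```
#!/usr/bin/env bash
set -e

mkdir -p dist

# build the functions workspace (assumes its build script compiles into dist/lib)
yarn --cwd functions build

# pack the shared "utilities" workspace into a tarball next to the built code
(cd utilities && yarn pack --filename ../dist/utilities.tgz)

# copy package.json and rewrite the workspace dependency to point at the tarball
cp functions/package.json dist/package.json
node -e "
  const fs = require('fs');
  const pkg = JSON.parse(fs.readFileSync('dist/package.json', 'utf8'));
  pkg.dependencies.utilities = 'file:./utilities.tgz';
  fs.writeFileSync('dist/package.json', JSON.stringify(pkg, null, 2));
"

# deploy the prepared directory (firebase.json "source" should point at dist)
firebase deploy --only functions
```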

@audkar Today I ran into the same issue as you.

I am new to both Lerna and Yarn workspaces. As I understand it you can also just use Lerna. Would that help in any way?

Your workaround seems a bit complicated for me 🤔

Also wondering, what is `--cwd "$RESOURCE_DIR"` for?

--cwd stands for "current working directory", and $RESOURCE_DIR holds the value of the source dir (functions in this case). Adding this flag makes yarn execute in the functions dir instead of the root.

@audkar Ah I see. So you could do the same with yarn workspace functions lint and yarn workspace functions build

@dinvlad It is unclear to me why you are targeting the dist folder and copying things over there. If you build to dist, but leave the package.json where it is and point main to dist/index.js then things should work the same no? You should then set source to api instead of api/dist.

@dinvlad I learned the yarn workspace command from your comments, but can't seem to make it work for some reason. See here. Any idea?

Sorry for going a bit off-topic here. Maybe comment in SO, to minimize the noise.

@0x80 I copy package.json to api/dist and point firebase.json to api/dist so only the "built" files are packaged inside the cloud function. I'm not sure what will happen if I point firebase.json to api - perhaps it will still be smart enough to only package what's inside api/dist (based on the main attribute in package.json). But I thought it was cleaner to just point to api/dist.

Re yarn workspace, I responded on SO ;)

@dinvlad it will bundle the root of what you point it to, but you can put everything that you don't want included in the firebase.json ignore list.

I've now used a similar workaround to @audkar.

{
  "functions": {
    "source": "packages/cloud-functions",
    "predeploy": ["./scripts/pre-deploy-cloud-functions"],
    "ignore": [
      "src",
      "node_modules"
    ]
  }
}

Then the pre-deploy-cloud-functions script is:

#!/usr/bin/env bash

set -e

yarn workspace @gemini/common lint
yarn workspace @gemini/common build

cd packages/common
yarn pack --filename gemini-common.tgz
mv gemini-common.tgz ../cloud-functions/
cd -

cp yarn.lock packages/cloud-functions/

yarn workspace @gemini/cloud-functions lint
yarn workspace @gemini/cloud-functions build

And packages/cloud-functions has an extra `.gitignore` file:

yarn.lock
*.tgz

here's what worked for me

- root/
|  - .firebaserc
|  - firebase.json
- packages/
  | - package1/
  | - functions/
    | - dist/
    | - src/
    | - package.json

and in the root/firebase.json:
```
{
  "functions": {
    "predeploy": "npm --prefix \"$RESOURCE_DIR\" run build",
    "source": "packages/functions"
  }
}
```

@kaminskypavel is your packages/functions depending on packages/package1 (or some other sibling package)?

@0x80 positive.

I think there was something fundamental I misunderstood about monorepos. I assumed you can share a package and deploy an app using that package without actually publishing the shared package to NPM.

It seems that this is not possible, because deployments like Firebase or Now.sh will usually upload the code and then in the cloud do an install and build. Am I correct?

@kaminskypavel I tried your approach and it works, but only after publishing my package to NPM first. Because in my case the package is private I initially got a "not found" error, so I also had to add my .npmrc file to the root of the cloud functions package as described here
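
For context, such a .npmrc usually just carries the registry auth token, along the lines of the sketch below (the token placeholder is mine, not taken from the linked description):

```
//registry.npmjs.org/:_authToken=YOUR_NPM_TOKEN
```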

@audkar Are you publishing your common package to NPM, or are you like me trying to deploy with shared code which is not published?

@0x80 I'm with you on this understanding - I think Firebase Function deployments are just (erroneously) assuming that all packages named in package.json will be available on npm, in the name of speeding up deployments.

As yarn workspace setups are becoming more popular, I imagine more folks are going to be surprised that they can't use symlinked packages in Firebase Functions – especially since they work fine until you deploy.

With npm adding support for workspaces, we have an ecosystem standard for how local packages should work.

Since this issue is over a year old, any update from the Firebase side on plans (or lack of plans) here?

I think it's a pretty cool opportunity – Firebase's array of services begs for a good monorepo setup.

+1 on this. Cloud functions will usually need to share some common code (e.g. interfaces) with other apps, and a nice way to deal with this is a monorepo (e.g. lerna) or using symlinks directly. I took the latter approach and solved it by creating some scripts. The concept is quite simple: I copy what's needed into the functions directory before the deploy and remove it afterwards.

Here's how I did it with this directory structure:
```
- root/
  | - .firebaserc
  | - firebase.json
  | - ...
- functions/
  | - src/
  | - package.json
  | - pre-deploy.js
  | - post-deploy.js
  | - ...
- shared/
  | - src/
  | - package.json
  | - ...
```

content of `pre-deploy.js`

```
const fs = require("fs-extra");

const packageJsonPath = "./package.json";
const packageJson = require(packageJsonPath);

(async () => {
  await fs.remove("./shared");
  await fs.copy("../shared", "./shared");

  packageJson.dependencies["@project/shared"] = "file:./shared";

  await fs.writeFile(packageJsonPath, JSON.stringify(packageJson, null, 2));
})();
```

content of `post-deploy.js`

```
const packageJsonPath = "./package.json";
const packageJson = require(packageJsonPath);
const fs = require("fs-extra");

(async () => {
  await fs.remove("./shared");

  packageJson.dependencies["@project/shared"] = "file:../shared";

  await fs.writeFile(packageJsonPath, JSON.stringify(packageJson, null, 2));
})();
```

Then update `firebase.json` like this (add the build script if you need it; I build earlier in my pipeline):

```
"functions": {
  "source": "functions",
  "predeploy": [
    "npm --prefix \"$RESOURCE_DIR\" run pre-deploy"
  ],
  "postdeploy": [
    "npm --prefix \"$RESOURCE_DIR\" run post-deploy"
  ]
},
```

If you run the build, inside the dist or lib directory you should now have two siblings: functions and shared (this happens because of the shared dependency). Make sure to update the functions package.json `main` to point to lib/functions/src/index.js to make the deploy work.

For now it's solved, but that's a workaround, not a solution. I think firebase-tools should really support symlinks.

@michelepatrassi Inspired by what you shared, I created the firelink library for managing this case. Internally it uses rsync to copy files recursively.

https://github.com/rxdi/firelink

npm i -g @rxdi/firelink

Basic usage
Assuming you have a monorepo setup and your packages are located two directory levels away from the current directory where package.json is located (as in the ../../ paths below).

package.json

  "fireDependencies": {
    "@graphql/database": "../../packages/database",
    "@graphql/shared": "../../packages/shared",
    "@graphql/introspection": "../../packages/introspection"
  },

Executing firelink will copy the related package folders, then map the existing packages to a local module install ("@graphql/database": "file:./.packages/database"), then execute the firebase command, passing along the rest of the arguments given to the firelink command.
Basically, firelink is a replacement for the firebase CLI, since it spawns firebase at the end once it has finished copying packages and modifying package.json!
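
To illustrate, based on the mapping described above, the rewritten dependencies in the functions package.json would look something like this (a sketch, not firelink's literal output):

```
"dependencies": {
  "@graphql/database": "file:./.packages/database",
  "@graphql/shared": "file:./.packages/shared",
  "@graphql/introspection": "file:./.packages/introspection"
}
```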

Regards!

We just got bitten by this, I think monorepos will become the standard and this should be supported by default.

Guys, a simple solution to this problem is to use webpack to bundle your cloud functions and deploy from the bundled directory. Attached is a basic webpack config that I'm using. During the build, this will pack all the functions code, including the dependencies resolved within the monorepo, into a top-level folder (in this case webpack/cloud-functions, but it could be anything you configure).

const path = require('path');

module.exports = {
  target: 'node',
  mode: 'production',
  entry: './src/index.ts',
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        use: 'ts-loader',
        exclude: /node_modules/
      }
    ]
  },
  resolve: {
    extensions: ['.tsx', '.ts', '.js', '.json']
  },
  output: {
    filename: 'index.js',
    path: path.resolve(__dirname, '../../webpack/cloud-functions/dist'),
    libraryTarget: 'commonjs'
  },
  externals: {
    'firebase-admin': 'firebase-admin',
    'firebase-functions': 'firebase-functions'
  }
};

And finally in your firebase.json file, refer to this folder for deployment.

{
  "functions": {
    "source": "webpack/cloud-functions"
  }
}

Remember to have a package.json file in the webpack/cloud-functions folder as well.

{
  "name": "cloud-functions",
  "version": "1.0.0",
  "scripts": {
    "deploy": "firebase deploy --only functions"
  },
  "engines": {
    "node": "10"
  },
  "main": "dist/index.js",
  "dependencies": {
    "firebase-admin": "8.9.1",
    "firebase-functions": "3.3.0"
  },
  "devDependencies": {},
  "private": true
}

This is tested and working. I'm using google cloud build. Ask for more info if required.

Thanks,

@sowdri Thanks for sharing! Do you trigger the webpack build from the firebase.json functions.predeploy step, or do you trigger it from the cloud build script before calling firebase deploy?

@0x80 I'm building it in the cloud build step. So according to firebase cli it's just a set of functions built with JS

@sowdri Thanks for the example. We've ended up using the same approach in a past project for AWS Lambdas / Serverless: it was easier to upload a (Webpack) bundle than to force the tools to work with the yarn monorepo.

I'm also stuck with this... everything worked fine (even the firebase emulator) until I tried to deploy functions (the web app built in React deploys OK). :( Hopefully the Firebase team can add support for monorepos soon.

I use babel for other packages in my monorepo, so I would prefer to stay with that. If nothing else works, I may try webpack, which seems to be working for some.

EDIT: I solved this by using the GitHub Package Registry. So I publish all my packages there, and then for Travis and the functions server I have to set up the registry and provide a token for authentication (done via .npmrc). It seems like an elegant solution. I got the idea for this approach here: https://medium.com/gdgeurope/how-to-use-firebase-cloud-functions-and-yarn-workspaces-24ca35e941eb
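
For reference, the .npmrc for the GitHub Package Registry approach typically looks something like this (the scope and token are placeholders, not taken from the comment above):

```
@your-scope:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=YOUR_GITHUB_TOKEN
```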

Yeah, got bitten by this as well. Everything worked perfectly in firebase serve --only functions but when deployed, it couldn't locate a module.

I ended up building a small script to build packages for me. While this reflects particulars of my environment, like using modules and my package names matching my directory names, I hope it's useful for others.

```
import fs from "fs";
import child_process from "child_process";

const internalPackagesFull = new Map();

// Find all the deps for the package
const getDepsForPackage = (packageName) => {
  const packageDir = packageName.split("/")[1]; // THIS MAY NEED TO CHANGE FOR YOU
  const packageSpecFileName = `../${packageDir}/package.json`;
  const packageSpecFile = fs.readFileSync(packageSpecFileName);
  const packageSpec = JSON.parse(packageSpecFile);
  const packageInternalDeps = Object.keys(
    packageSpec.dependencies
  ).filter((key) => key.includes("turing")); // THIS WILL NEED TO CHANGE FOR YOU

  const packageTgzName = `${packageName.replace("@", "").replace("/", "-")}-v${
    packageSpec.version
  }.tgz`;

  internalPackagesFull.set(packageName, {
    packageSpecFileName,
    packageSpec,
    packageDir,
    packageInternalDeps,
    packageTgzName,
  });

  const packagesToProcess = packageInternalDeps.filter(
    (internalDepName) => !internalPackagesFull.has(internalDepName)
  );

  packagesToProcess.forEach((internalPackageName) =>
    getDepsForPackage(internalPackageName)
  );
};

const packageName = JSON.parse(fs.readFileSync("./package.json")).name;
child_process.execSync("cp ./package.json ./package.json.org");
getDepsForPackage(packageName);

// write updated packages - use common js and reference local tgz files
[...internalPackagesFull.values()].forEach((internalDep) => {
  const { packageSpec, packageSpecFileName, packageInternalDeps } = internalDep;

  // change the package type
  packageSpec.type = "commonjs"; // THIS MAY NEED TO CHANGE FOR YOU

  // specify the location of the dep to be the packaged tgz file
  packageInternalDeps.forEach((internalDepOfPackage) => {
    const { packageTgzName } = internalPackagesFull.get(internalDepOfPackage);
    packageSpec.dependencies[internalDepOfPackage] = `./${packageTgzName}`;
  });

  fs.writeFileSync(
    packageSpecFileName,
    JSON.stringify(packageSpec, null, " ")
  );
});

// run yarn build and pack
[...internalPackagesFull.values()].forEach((internalDep) => {
  try {
    console.log(`Building ${internalDep.packageDir}`);
    child_process.execSync("yarn build", {
      cwd: `../${internalDep.packageDir}`,
    });
    console.log(`Packaging ${internalDep.packageDir}`);
    child_process.execSync("yarn pack", {
      cwd: `../${internalDep.packageDir}`,
    });

    if (packageName !== internalDep.packageSpec.name) {
      console.log(`Move to current directory ${internalDep.packageDir}`);
      child_process.execSync(
        `cp ../${internalDep.packageDir}/${internalDep.packageTgzName} .`,
        {
          cwd: ".",
        }
      );
    }
  } catch (e) {
    console.log(e);
  }
});

// move back to the standard packages structure
[...internalPackagesFull.values()]
  .filter((internalDep) => packageName !== internalDep.packageSpec.name)
  .forEach((internalDep) => {
    const {
      packageSpec,
      packageSpecFileName,
      packageInternalDeps,
    } = internalDep;

    // change the package type back
    packageSpec.type = "module"; // THIS MAY NEED TO CHANGE FOR YOU

    // point the dep back at the workspace version
    packageInternalDeps.forEach((internalDepOfPackage) => {
      packageSpec.dependencies[internalDepOfPackage] = "*";
    });

    fs.writeFileSync(
      packageSpecFileName,
      JSON.stringify(packageSpec, null, "  ")
    );
  });
```

I used the solution provided by @sowdri (thanks for that!), with a small tweak so that I can freely delete and regenerate the entire dist/ directory, including the secondary package.json:

const path = require('path');
const CopyPlugin = require('copy-webpack-plugin');

module.exports = {
  ...,
  plugins: [
    new CopyPlugin({
      patterns: [{ from: 'package.dist.json', to: 'package.json' }],
    }),
  ],
};

And then I keep the following package.dist.json in the package root:

{
  "name": "@package/name",
  "version": "0.0.1",
  "engines": {
    "node": "10"
  },
  "main": "index.js",
  "dependencies": {
    "firebase-admin": "8.9.1",
    "firebase-functions": "3.3.0"
  },
  "private": true
}

It might be possible to remove the dependencies from this file and rely on Webpack to bundle those in as well (removing the need to keep these dependencies in sync with your main package.json), but I haven't tried.

I'm using a single repository with a script that copies .firebaserc & firebase.json for all relevant directories/firebase projects, using the npm cpy package and preserving the dir structure into the output directory where my compiled code lives.

I create a package.json from the original package.json for the functions deployment.

here is the copyFiles.js script:

const cpy = require('cpy');
const fs = require('fs');
const package = require('./package.json');
(async () => {
    await cpy(['./../package.json', './**/*.json', './**/.firebaserc'], '../out/', {
        parents: true,
        cwd: 'src'
    });
    const dirs = fs.readdirSync('./out/');
    const newPkg = {
        main: package.main,
        dependencies: package.dependencies,
        engines: package.engines,
    }
    dirs.forEach(dir => {
        fs.writeFileSync(`./out/${dir}/package.json`, JSON.stringify({ name: dir, ...newPkg }));
    })
    console.log('Files copied!', dirs);
})();

For deployment, go to ./out/[project-name]/ and run firebase deploy --only=functions, or write a script for that too.

I'm running into some Webpack warnings like this:

WARNING in /Users/me/Development/myproject/node_modules/firebase-functions/lib/config.js 61:23-42
Critical dependency: the request of a dependency is an expression

Did you find a way to solve these, or are you ignoring/suppressing them?

I managed to get things working without warnings 🥳 I ended up with the configuration below. Especially using the regex patterns for the externals made a difference because if you just have "firebase-functions" in your externals, any import that you make from a submodule will not be matched and the library is still included in your bundle.

As a result of bundling issues, I also ran into some cryptic @grpc errors when deploying. I forgot to keep them for reference.

const path = require("path");
const CopyPlugin = require("copy-webpack-plugin");

module.exports = {
  target: "node",
  mode: "production",
  entry: "./src/index.ts",
  devtool: "inline-source-map",
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        use: "ts-loader",
        exclude: /node_modules/,
      },
    ],
  },
  resolve: {
    extensions: [".tsx", ".ts", ".js", ".json"],
    alias: {
      "~": path.resolve(__dirname, "src"),
    },
  },
  output: {
    filename: "index.js",
    path: path.resolve(__dirname, "dist/bundled"),
    libraryTarget: "commonjs",
  },
  externals: ["express", /^firebase.+$/, /^@google.+$/],
  plugins: [
    new CopyPlugin({
      patterns: [{ from: "package.dist.json", to: "package.json" }],
    }),
  ],
};

It turns out there is no need for a separate package.dist.json. The annoying thing about having this file is, of course, that you need to manually update it every time you update any of the dependencies listed there, so it is very easy to forget.

Instead, move all of the packages that you _wouldn't_ list in your package.dist.json to the devDependencies list, and just use your regular package.json in the webpack copy plugin. Now you only have a single package.json to deal with 🎉
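
A sketch of what that split could look like (the firebase versions are taken from the earlier package.dist.json example; the devDependencies entries and their versions are placeholders):

```
{
  "dependencies": {
    "firebase-admin": "8.9.1",
    "firebase-functions": "3.3.0"
  },
  "devDependencies": {
    "copy-webpack-plugin": "*",
    "ts-loader": "*",
    "typescript": "*",
    "webpack": "*"
  }
}
```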

Also, I don't necessarily want my local Node.js version to be the same as the deployed functions' Node version. I found that you can now specify functions.runtime in the firebase.json file. So take the engines field out of package.json and instead set functions.runtime to "10" or "12" in your firebase config.

This is what it looks like for me:

{
  "functions": {
    "source": "packages/cloud-functions/dist/bundled",
    "runtime": "12"
  },
  "firestore": {
    "rules": "firestore.rules",
    "indexes": "firestore.indexes.json"
  },
  "emulators": {
    "functions": {
      "port": 5001
    },
    "firestore": {
      "port": 8080
    },
    "pubsub": {
      "port": 8085
    }
  }
}

How are you dealing with source maps? If I bundle my functions with webpack using devtool: "inline-source-map" it doesn't get picked up in stackdriver error reporting. @sowdri

I ran into the same thing on my project and this was a pretty big bummer since I don't want to go through publishing every package for a side project. It was annoying keeping up with multiple package.json files or having to build outside of the package. Turns out that if you include your deps as optional Lerna will still pick them up and Firebase won't complain when it's uploading.

The config below will get you symlinked dep support with a single package.json!

package.json

{
  "name": "@your-package-name/functions",
  "version": "0.1.0",
  "scripts": {
    "build": "webpack"
  },
  "engines": {
    "node": "10"
  },
  "main": "dist/index.js",
  "dependencies": {
    "firebase-admin": "^8.10.0",
    "firebase-functions": "^3.6.1"
  },
  "optionalDependencies": {
    "@your-package-name/shared": "^0.1.0",
    "@your-package-name/utils": "^0.1.0"
  }
}

webpack.config.js

const path = require('path')

// The cost of being fancy I suppose
// https://github.com/firebase/firebase-tools/issues/653

module.exports = {
  target: 'node',
  mode: 'production',
  entry: './src/index.ts',
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        loader: 'ts-loader',
        exclude: /node_modules/,
        options: {
          configFile: 'tsconfig.build.json',
        },
      },
    ],
  },
  resolve: {
    extensions: ['.tsx', '.ts', '.js', '.json'],
  },
  output: {
    filename: 'index.js',
    path: path.resolve(__dirname, 'dist'),
    libraryTarget: 'commonjs',
  },
  externals: {
    'firebase-admin': 'firebase-admin',
    'firebase-functions': 'firebase-functions',
  },
}

firebase.json

{
  "firestore": {
    "rules": "firestore.rules",
    "indexes": "firestore.indexes.json"
  },
  "functions": {
    "source": "packages/functions"
  },
  "emulators": {
    "functions": {
      "port": 5476
    },
    "firestore": {
      "port": 4565
    },
    "ui": {
      "enabled": true
    }
  }
}

I'm using yarn workspaces, but I also don't need to mention my local package names in other package.json files. I'm deploying to both Firebase and Vercel successfully without it.

I'm not sure what makes it work. Just have this standard config in my top-level package.json:

"workspaces": {
    "packages": [
      "packages/*"
    ]
  },

I don't mind having a package.json in each of the /packages/* folders, since running yarn upgrade-interactive will just handle them all in one go. In some packages I find it useful to be able to add script specifically for that scope.

---- edit ----

I forgot to mention I'm using TypeScript with project references. That might have something to do with it. For Vercel I'm using next-transpile-modules to include my shared code in the bundle.
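
For completeness, TypeScript project references are wired up roughly like this (a sketch; the package paths are assumptions, not the author's actual config):

```
// packages/shared/tsconfig.json (the referenced project must be composite)
{
  "compilerOptions": { "composite": true, "outDir": "dist" }
}

// packages/functions/tsconfig.json
{
  "compilerOptions": { "outDir": "dist" },
  "references": [{ "path": "../shared" }]
}
```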

I'm using a similar set up to @0x80 and @sowdri to make this work, but rather than moving my local dependencies to devDependencies (which actually didn't work for me, think I was missing a step) and using copy-webpack-plugin, I'm using generate-package-json-webpack-plugin to build my package.json in the dist folder. This builds my dependency list from what the resulting code is actually requiring, so if it's bundled or a dev dependency it's not included in the resulting package.json.

I've also set up my externals using webpack-node-externals to make everything external except my symlinked monorepo packages which I'm using regex to match the project name prefix. I've also added the regex expressions for the firebase packages @0x80 posted as additional externals.

This is my config

/* eslint-disable @typescript-eslint/no-var-requires */
const path = require("path");
const nodeExternals = require("webpack-node-externals");
const GeneratePackageJsonPlugin = require("generate-package-json-webpack-plugin");

const basePackage = {
  name: "@project/functions",
  version: "1.0.0",
  main: "./index.js",
  scripts: {
    start: "yarn run shell"
  },
  engines: {
    node: "12"
  }
};

module.exports = {
  target: "node",
  mode: "production",
  entry: "./src/index.ts",
  devtool: "inline-source-map",
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        use: "ts-loader",
        exclude: /node_modules/
      }
    ]
  },
  resolve: {
    extensions: [".tsx", ".ts", ".js", ".json"],
    alias: {
      "@": path.resolve(__dirname, "src"),
      "@root": path.resolve(__dirname, "./"),
      "@types": path.resolve(__dirname, "src/@types"),
      "@utils": path.resolve(__dirname, "src/utils")
    }
  },
  output: {
    filename: "index.js",
    path: path.resolve(__dirname, "dist"),
    libraryTarget: "commonjs"
  },
  externals: [
    /^firebase.+$/,
    /^@google.+$/,
    nodeExternals({
      allowlist: [/^@project/]
    })
  ],
  plugins: [new GeneratePackageJsonPlugin(basePackage)]
};

The other benefit about using webpack to bundle the code is I can finally use module aliases :)
