Note: You might also find my notes on NodeJS, NPM, and/or cross-env packages helpful.
Building an NPM Package - Helpful Links and Resources
- NPM guide
- Package.json specifics:
- Official NPM: Main Doc Page
- Official NPM: ESM Integration with Package.json
- Official Yarn: Main Doc Page
- Skypack Docs: Package Quality Score
- Scotch.io guide
- BetterStack: Publishing an NPM Package - Best Practices (comprehensive 👍, probably one of the best guides I have ever seen 💯)
- SenseDeep: How to Create a Hybrid NPM Module for ESM and CommonJS
- Codecentric / Bastian Krol: Cross Platform JavaScript with Browserify
- Also covers using Grunt as task-runner
- Webpack: "Authoring Libraries" Guide
  - Example repo: webpack-library-example
- TypeScript Specific
- TS Handbook: Publishing Declaration Files
- ITNext: "Step by Step: Building and Publishing [...]
- Marco Botto: Compiling and Bundling [...] with Webpack
- @terabaud: My First TypeScript Library
- 2ality: Creating CommonJS-based NPM Packages via TypeScript
- loyc: How to publish a NPM Package (Properly)
- loyc: Setting up a TypeScript Project - 3 Approaches
- This is for a React project, but still has useful intro materials for any TS project
- Nokes: The 30 second guide to publishing [...]
- Kamalizade: Publishing to NPM with Rollup & TypeScript
Picking a Module Pattern for your NPM Package
📄 For understanding all the differences between different JS module patterns, I recommend checking out my separate cheatsheet on the topic.
⚠ The state of module support across Node versions, bundlers, and environments is complicated (and seemingly constantly changing)! I don't pretend to have all the answers, and defer to the experts on best practices, but summarize as best as I can the options you have as a package author below.
For writing the source code of your package:
- If you are using a bundler / transpiler, you have an almost endless number of options (see below, on distribution).
- If you are not bundling / transpiling your code, it used to be that you were supposed to hand-code `CommonJS` or `UMD` for maximum compatibility across the most versions of Node. However, ESM (ESM / ES2015 / ES6 module style) is now supported across so many active versions of Node that it is becoming safe to write your source code in ESM and have that be the only distribution method, if you want the latest-and-greatest without legacy support.
For distributing your code:
- The general consensus is / was that your main entry point (aka `main`) should point to either `CommonJS` or `UMD` (especially if trying to support cross-environment / isomorphic usage). This is for maximum compatibility; although ESM is now enabled without a flag in NodeJS, it has not yet reached critical saturation / adoption.
  - This is quickly changing; ESM is becoming well-adopted, and ESM-only packages might become the new normal pretty quickly. See the section below for more details.
- Another option is dual package delivery - distribute both ESM and CJS in the same package. However, this comes with some unique issues of its own (see "Dual package hazard")
ESM Packages in NodeJS
ESM (ES Modules) has a lot of technical benefits over other formats, and many developers prefer the syntax as well. However, for a while it was not well supported in NodeJS, and thus most authors were advised against creating releases that only supplied consumers with ESM-only distribution code.
As of 2021, that era feels like it is about to come to a close and ESM is really becoming the norm. Both the active LTS and the current version of Node at the time of writing this update (March 2021, active LTS being v14, and current being v15 - very soon to be v16) support ESM out of the box, with no experimental warning. Furthermore, even Node v12, which doesn't reach EOL (end-of-life) until 4/30/2022, supports ESM without a CLI flag (assuming v >= 12.17.0, see release notes). So, unless you are trying to support NodeJS versions that are completely dead, you can pretty much* safely use and produce ESM-only packages.
* = "pretty much" is doing some heavy lifting. There is a lot of nuance around interoperability and legacy support that I am not getting into here. Also issues with module resolution that is unique to Node. Plus, losing
require
, which means you can't use(require.main === module)
for checking CLI.
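As a minimal sketch of one common ESM workaround (the file name here is hypothetical, and the check has edge cases around symlinks and bin shims, so treat it as a sketch rather than a drop-in replacement):

```js
// cli.mjs - hypothetical ESM entry file
import { pathToFileURL } from 'node:url';

// Rough ESM equivalent of the CJS `require.main === module` check:
// true only when this file is executed directly (`node cli.mjs`),
// not when it is imported by another module.
const isRunDirectly = import.meta.url === pathToFileURL(process.argv[1]).href;

if (isRunDirectly) {
  console.log('Running as a CLI');
}
```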
In fact, there is the argument that as package authors, we actually have an ethical obligation to push adoption forward by embracing ESM-only packages to help convince consuming projects to upgrade Node versions and get ESM support. Some important reading on the topic by none other than Sindre:
- Sindre: "Get Ready for ESM"
- Sindre: Github Discussion - "The ESM move"
For a historic look at how we got here, some good resources are:
- blog.logrocket.com/es-modules-in-node-js-12-from-experimental-to-release
- NodeJS: "Plan for New Modules Implementation"
Be careful about transpiling to ESM for distribution; NodeJS has some particulars about module resolution that might not play nicely with the default code generation of other systems. For example, TypeScript's compiler (TSC) will not automatically add file extensions to transpiled import paths, which will break imports for consumers if you forget to explicitly use the file extension.
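For example, in a hypothetical TypeScript source file compiled by TSC to ESM for Node (no bundler involved), you have to write the extension that will exist *after* compilation, even though the file on disk is `.ts`:

```ts
// src/index.ts (hypothetical layout)
// Note the explicit `.js` extension - TSC will not add it for you,
// and Node's ESM loader requires full paths including the extension.
import { validate } from './validators.js'; // compiled from src/validators.ts

export function run(input: string): boolean {
  return validate(input);
}
```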
If you are ready to switch to a pure ESM package, Sindre has put out a quick-start guide / FAQ!
Module Pattern - Source Code vs Distribution
The reason I emphasize distribution vs. source code is that, if you are using a transpiler / bundler, it can do the automatic work of taking source code written in a newer but less compatible module pattern (such as ESM) and transpiling it to a more widely accepted pattern (such as UMD). This is a huge advantage, as something like UMD is messy to write by hand and track in source code, but can be auto-generated by your build tool.
In fact, you can even bundle and distribute multiple versions of your code with different module patterns, all in the same package, to support users who want / need legacy support, as well as those that want the newest module pattern. That's right; you can distribute UMD, CJS, and ESM in the same library, with the same `package.json` file.
AFAIK, there are currently just a handful of ways to handle dual-package distribution, at least without additional headaches.
- Use Conditional Export Maps
- This uses a newer feature, NodeJS Conditional Exports, and is probably the most future-proof. This section of the Node docs covers this strategy for delivering side-by-side ESM and CJS entries, as well as general tips on dealing with dual package delivery.
- Support across tooling outside of NodeJS, such as TSC, can be troublesome at the moment
- Use the `module` field
  - This isn't really officially supported by NodeJS (AFAIK), but was supported by some of the most popular bundlers even before ESM landed in NodeJS!
- Rich Harris has a wonderful writeup about this, as part of the Rollup docs.
Note: With `package.exports` / export maps / conditional exports, you still need to be really careful if you are aiming for backwards compatibility, so that CJS exists alongside ESM. For example, `main` should pretty much always point to the `cjs` version if you are distributing one, since older versions of Node don't understand `package.exports` anyway.
NodeJS's docs section on "determining module system" is very helpful for understanding how NodeJS determines and resolves CJS vs ESM in a given imported package.
Breakdown of Different Package Entry Point Patterns (subject to change; these are based on what I'm aware is common):
100% ESM
```json
{
  "type": "module",
  "main": "./index.js"
}
```
100% ESM: using export map
```json
{
  "type": "module",
  "exports": "./index.js"
}
```
This prevents subpaths from being exposed by default, which might actually be something you want - it gives you stronger control over your package's public interface and helps prevent breaking changes caused by consumers relying on internal paths.
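If you do want to expose specific subpaths while keeping that control, you can list them explicitly in the export map (a sketch, with hypothetical paths):

```json
{
  "type": "module",
  "exports": {
    ".": "./index.js",
    "./validators": "./src/validators.js"
  }
}
```

Consumers can then `import ... from 'my-pkg/validators'`, but no other internal file paths are reachable.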
Dual Package via Exports Map
This is probably the most modern way (as of writing this) to write the entry-points / exports for a dual-package delivery system.
```json
{
  "type": "module",
  "main": "./dist/output.cjs",
  "exports": {
    "import": "./dist/output.js", // or "dist/output.mjs"
    "require": "./dist/output.cjs"
  }
}
```
Dual Package: Via "module"
Warning: I believe this option is being phased out, in favor of the exports map approach.
`pkg.module` appears to be a standard that was heavily adopted by bundlers (e.g. Rollup) and not NodeJS core, so although this might work if your consumers all use bundlers, it is probably better to use export maps to future-proof. The quick replacement with export maps is to swap `"module": "dist/output.esm.js"` for an `"exports"` map entry, like the "Dual Package via Exports Map" example above.
```json
{
  "main": "dist/output.cjs.js", // or "dist/output.umd.js"
  "module": "dist/output.esm.js"
}
```
Or:
```json
{
  "type": "module",
  "main": "dist/output.cjs",
  "module": "dist/output.js" // or "dist/output.mjs"
}
```
Module Pattern Continued - Transpiled ESM
As previously mentioned, using `CommonJS` as the module pattern in your source code offers maximum compatibility with the most versions of NodeJS (even dead versions). So, if you are really itching to write your source code with ESM, but also want to support CJS consumers, an option available to you is to deliver a pre-transpiled CJS version as the final version to support all users (alongside the ESM code for those that can use it).

Delivering both CJS and ESM in the same distribution is often referred to as a dual package distribution.
Note: Even if you are already using a bundler, you might have to add another tool / step to transform the ESM (and / or `ES6` stuff); often you need to separately add `transpiling` as a step in-between source code and bundling. Or, your bundler might have an option to specify the use of a tool like `babel` directly in the input options / config.
A very common pattern is to keep all your `ESM` code in a subdirectory, such as `/src`. Then, as your build step, you transpile to `/lib`. So, to summarize, the common folder structure is:
- `/src`
  - Contains your original code, with `ESM` (and / or other less compatible stuff)
  - Tracked in VC, optionally omitted in the NPM upload
- `/lib`
  - Contains the transpiled version of `/src`
  - Usually generated by `babel`, and via your `build` command
  - Probably don't want to track in VC, but should include in distribution
- `/dist`
  - OPTIONAL: Contains a minified / uglified / compressed version of `/lib`
    - If building a cross-environment (Node + Browser) package, this would probably contain the browser-specific final code (see my cross-env package guide for details).
  - Like `/lib`, usually don't want to track in VC, but should be included in distribution
  - The reason why this is usually generated by optimizing `/lib` and not `/src` is that many compressors (such as `uglifyjs`) are not compatible with the newest JS features, so their input has to be transpiled anyway; feeding it `/lib` saves an unnecessary duplicate transpile
    - If your `/src` folder is already compatible with most modern versions of Node (or to where you feel comfortable) and compatible with your minifier / optimizer, then you could just minify directly from `/src` to `/dist`
You can choose to deliver CJS alongside your original ESM code, or only the final CJS version. If you opt to only include the final CJS version, using the folder structure from above, you would simply exclude the `/src` directory from the NPM upload, and make absolutely sure the entry point is inside the `/dist` folder.
If you are delivering both at the same time, see the above section on distribution strategies for details on how to handle your entry-points within `package.json`.
A benefit to delivering ESM is that it works with tree-shaking, and allows ESM consumers to avoid messy CJS interop syntax.
Transpiling
If you want to use bleeding-edge JS features, or other code that might be natively unsupported by most versions of Node but is supported by a transpiler, you can do so. The normal pattern for doing this is to keep all your code in a subdirectory (e.g. `/src`) and transpile to another directory (e.g. `/dist`). You then exclude `/dist` from version control, but include it in the NPM upload and set your entry point inside it.
My instructions here are very similar to my section on using ES Modules in a package's source code, so check that out if you are looking for a more in-depth explanation.
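As a rough sketch of how this can look in `package.json` (the paths and scripts are hypothetical, and this assumes `@babel/cli` plus a Babel config are already set up):

```json
{
  "main": "./dist/index.js",
  "files": ["dist"],
  "scripts": {
    "build": "babel src --out-dir dist",
    "prepublishOnly": "npm run build"
  }
}
```

`prepublishOnly` is a built-in npm lifecycle hook that runs before `npm publish`, so the transpiled output gets regenerated right before each release.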
Distribution Paths and Module Resolution - The Root Directory Issue
Disclaimer: I'll be honest, this part of package development continues to confuse me and always seems convoluted and overly-complicated 😒. It took me considerable effort to gather the resources for this section, which should not have been the case.
In general, it seems like Node and NPM still have a big problem with nested source file resolution, even when `package.main` and/or `package.exports` is explicitly set to point to a nested target. I'm loosely referring to this as the "root directory" issue, but as you'll see in my "background" section, there have been multiple proposals to address this which have fallen under different names. This can actually be a really problematic and frustrating issue, and it makes me angry 😡 that so many guides seem to gloss over this, or leave the issue to bundling without discussing how it gets solved...
For example, assuming this structure for `my-lib`:
- package.json
- /src
  - /index.js
  - /validators.js
  - /constants (dir)
    - /index.js
- /dist (auto-generated)
  - /index.d.ts
  - /index.js
  - /validators.js
  - /validators.d.ts
  - /constants (dir)
    - /index.js
    - /index.d.ts
If you have `package.main` point to `dist/index.js`, you would expect that in the consuming project you could write `validators = require('my-lib/validators')`... but that is not the case. That will throw an error, unless you change it to `require('my-lib/dist/validators')`, even though `main` is pointing inside `dist` already!
Background on Root Directory Issues with NodeJS
The answer to why this is the case is complicated, and mostly has to do with module resolution, and how the default in `node` is to resolve to the root of the package inside `node_modules`, where `package.json` is located. If you are curious, here are some relevant links that help explain this:
- S/O Response to "how to publish [...] without dist in import"
- NodeJS Docs: "Loading from node_modules folder" (explains module resolution algorithm)
Here is the creator (and founder + former CEO) of NPM, writing in 2013 about why this functionality was not added, and why he felt it should not be.
There have also been multiple discussions on either changing the default behavior, or adding a field to `package.json` for specifying / modifying the default root directory.
- nodejs/node: Issue #14970 - Request to add `mainDir` to `package.json`
  - FINALLY, here are a bunch of people talking openly and clearly about this being a common issue and needing better solutions
- nodejs/node: Issue #3953 - Request to rework the `main` entry
  - This exact same request was also communicated on npm/npm: Issue #10506
- nodejs/node: Issue #21787 - Request for a `package.json` root / base directory
- microsoft/TypeScript: Issue #21423 - Request for multiple main entries / `mainField`
Approaches for Working Around NodeJS's Root Directory Issue
Here is probably what you are looking for: how have other developers approached this issue? How can you work around it? Read below:
Different Approaches
- Map the subdirectory(s) via `package.exports` and conditional exports (only for versions of Node supporting ESM or the `--experimental-modules` flag)
  - This is basically the newest solution, and is gaining in adoption. However, this really only solves the issue in scenarios where both the library author and the library consumer are using ES6 / ES Modules.
  - There is now a docs page on this
  - Example implementations:
    - trusktr comment on #14970 (shows exports mixed with main, as fallback)
    - jkrems comment on #21787 (maps to different subdirs)
  - This is definitely the way of the future!
  - I think this is an example of a correct syntax to export all sub-paths (even deep) within `/dist`: `{ "exports": { ".": "./dist/index.js", "./*": "./dist/*.js" } }`
  - WARNING: For TypeScript users, this is currently (as of 4/1/2021) not supported. The main tracking issue is #33079, and support should make it into the `v4.3` release on May 25th.
- Import and re-export everything from the main entry (e.g. `index`)
  - This is a very common pattern; your package has one main export, but might let users access sub-items by either nested methods and/or nested objects (via destructuring).
    - `const {subItemA, subItemB} = require('my-lib');`
    - `const main = require('my-lib'); main.subMethodA(); main.subMethodB();`
  - This allows for destructuring and tree-shaking, but not slashed import / require paths that don't adhere to the source file structure
  - Examples:
- Keep everything in the project root
  - Pros: Keeps everything simple, requires less build tooling
  - Cons: Can make the root of the project messy, source files mixed with config files, etc.
  - It looks like this is how `sindresorhus` builds most of his (non-TS) libraries, for example, `slugify`
- Copy everything to a subdirectory, or send build output there (e.g. `/dist`), copy `package.json` to the subdirectory, ensure proper paths in `package.json`, and run `npm publish` from the subdirectory (see the sketch after this list)
  - This is basically pretending that things are the same as if you had just put all your files in the project root, with a flat structure
  - This could be accomplished with a hand-coded Node script, bash scripting, and/or build tools, like `gulp`
  - Examples:
  - Best practices: How do you avoid accidentally publishing from the root instead of `/dist`?
    - Safest option: Make the root `package.json` have `private: true`, but remove that declaration in the version that is copied to `/dist`
- Add files to the root that simply re-export from the subdirectory
  - This is similar to the "import and re-export everything from index.js" solution
  - If you want to expose `my-lib/subdir/alpha.js`, you would create a file in the root, `my-lib/alpha.js`, which simply re-exports: `module.exports = require('./subdir/alpha.js')`
  - Examples:
- Keep `package.json` directly in the subdirectory instead of the root (or a different `package.json` file than the root), and publish from there
  - Very similar to the "copy everything to subdirectory and publish there" approach, but the difference is that you keep `package.json` in the subdirectory to start with (as well as any other "root" level files)
  - In general, I think this is one of the worst approaches; it is not conventional, is confusing to those browsing NPM vs those browsing the actual source code, and seems error-prone
  - If you are leaving a `package.json` file in the root directory, make sure you have `"private": true`, so you don't accidentally publish the entire thing
  - Examples:
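Here is a minimal sketch of the "copy everything to a subdirectory and publish from there" approach, as a hypothetical Node script (the file name, paths, and the fields being adjusted are all assumptions, not a prescribed setup):

```js
// scripts/prepare-publish.js (hypothetical)
// Copies package.json into ./dist and strips the `private` flag,
// so that `npm publish` can be run from inside ./dist with a flat layout.
const fs = require('fs');
const path = require('path');

const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));
delete pkg.private; // the root package.json stays private; the copied one does not
delete pkg.scripts; // optional: consumers don't need your build scripts
pkg.main = './index.js'; // entry point is now relative to dist/

fs.writeFileSync(path.join('dist', 'package.json'), JSON.stringify(pkg, null, 2));
```

You would then run `npm publish` from inside `/dist` (e.g. via a `cd dist && npm publish` script), so the published package root is the flat build output.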
Relevant resources:
- TypeScript Handbook: Module Resolution
Creating Cross-Environment NPM Packages
I've put together a separate page for some summary information on how to craft and release a cross-environment (aka isomorphic, Node + Browser) package - you can find it here.
Exposing a CLI
You can specify that a specific file will receive CLI commands (instead of `index.js`) by using the `bin` field in your `package.json`.

If you want the command trigger (e.g. what someone types in their CLI to hit your package) to just be your package name, then `bin` can just be the path to the file:
```json
{
  "bin": "cli.js"
}
```
But if you want to expose multiple commands, make `bin` an object with the commands as fields:
```json
{
  "bin": {
    "myapp-foo": "cli-foo.js",
    "myapp-bar": "cli-bar.js"
  }
}
```
Make sure that all files exposed through `bin` start with `#!/usr/bin/env node`, otherwise they are not executed with the `node` executable!
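A minimal sketch of such a file (the file name and output are hypothetical):

```js
#!/usr/bin/env node
// cli.js - hypothetical CLI entry point referenced by the `bin` field

// Everything after the node binary and the script path is a user-supplied argument
const args = process.argv.slice(2);
console.log('CLI called with args:', args);
```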
Warning: If you are using `npm link` for local development, you might need to re-run `npm link` every time after modifying `bin` in order for the changes to be picked up globally. At least, I had to in my case.
See docs for details.
NodeJS - CLI Framework Libraries
Hand-building a CLI can be labor intensive and error-prone, since you end up dealing with a lot of variable inputs, string parsing, and random ordered arguments. There are some reputable libraries to help with building a CLI.
Starting with Node 18.3, there is now a built-in tool for arg parsing - `parseArgs` from `node:util`. It is not as full-featured as a dedicated CLI library like `cmd-ts`, but is good enough for quite a few simpler use-cases.
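A minimal sketch of `parseArgs` usage (the option names here are hypothetical):

```js
#!/usr/bin/env node
// Requires Node >= 18.3 (or >= 16.17)
import { parseArgs } from 'node:util';

const { values, positionals } = parseArgs({
  options: {
    verbose: { type: 'boolean', short: 'v' },
    out: { type: 'string' },
  },
  allowPositionals: true,
});

console.log({ values, positionals });
// `my-cli -v --out dist file.txt` -> { values: { verbose: true, out: 'dist' }, positionals: ['file.txt'] }
```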
If you use TypeScript, I just recently tried out and really enjoyed using `cmd-ts`. It offers a lot more type-safety than competing libraries (such as Oclif).
- Oclif
- Gluegun
- commander.js
- commandpost
- Caporal.js
- cac
- sindresorhus/meow
- ... and many more! Don't re-invent the wheel when you don't have to.
Comparison post: https://www.nexmo.com/legacy-blog/2020/06/12/comparing-cli-building-libraries
NodeJS - CLI UI Libraries
For fancier CLI output (spinners, loaders, task lists, etc.), you will probably want to use a library:
- Checklists: Listr
- Spinners
- Colors
- Multiple:
Returning an Exit Code / Boolean to the CLI
You might have noticed that a lot of build and dev-ops stuff uses multiple commands chained together, often with logical operators and exit codes. For example:
`package.json`:
```json
{
  "scripts": {
    "run": "npm run build && npm run serve",
    "build": "___",
    "serve": "___"
  }
}
```
With the above, if build throws an error (and/or returns anything other than `0` as the exit code), the `npm run serve` part does not run.
To return an exit code from your NodeJS script, simply use `process.exit(errorCode)`. Just remember that `0` means success, and any other number represents an error!
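A small sketch of what that might look like in a hypothetical build script (the `buildProject` helper is made up for illustration):

```js
// build.js (hypothetical) - exits non-zero on failure, so chained
// commands like `npm run build && npm run serve` will stop early
const { buildProject } = require('./lib/build'); // hypothetical helper

try {
  buildProject();
  // Implicit exit code 0 (success) when the script finishes normally
} catch (err) {
  console.error(err);
  process.exit(1); // any non-zero code signals failure to the shell / npm
}
```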
Exporting Types
Important resources:
- TypeScript Handbook: Publishing Declaration Files
- BetterStack: Publishing TypeScript Definitions with an NPM Package
Warning: If you use `declarationDir` to emit types to a separate types folder instead of alongside JS files (e.g. setting it to `./dist/types`), then you are probably going to run into issues on the importing side. It is far easier to just emit the declaration files side-by-side with the generated JS (either omit `declarationDir`, or have it match `outDir`).
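For reference, a minimal `tsconfig.json` sketch that emits declarations side-by-side with the compiled JS (paths are assumptions):

```json
{
  "compilerOptions": {
    "outDir": "./dist",
    "declaration": true
  },
  "include": ["src"]
}
```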
Including Ambient Declarations in Compiled Types
It is a common misconception that TSC will copy your ambient declaration files and/or compile them into your output folder. This is not the case, and intentionally so.
Your options around this are to either:
- Rewrite `*.d.ts` ambient files as regular `*.ts` files that use `export` to share types with other files
  - Or...
- Manually copy your `*.d.ts` file(s) from `src` to `dist` (or whatever your export folder is) - see the sketch after this list
  - If you use this option, use `shx cp` for better cross-OS compatibility in your copy command
  - If you have a flat directory (no nested `src` or `dist`), you could also avoid copying by just explicitly including the files via `package.json -> files` (for example, that is used in sindresorhus/meow).
For more details, check out the "Ambient Declarations" section of my TypeScript notes.
Forcing a Single d.ts Declaration File Bundle
You might be tempted to see if you can create a single `*.d.ts` file for your entire package; it can be a little scary to see TSC literally double the file count in your `dist` output, because it is creating a declaration file for each generated `*.js` file.
However, in general this is either A) not possible, or B) not really necessary.
There is not a huge amount of benefit (that I know of) to having a single declaration file; as long as you point `package.json -> types` to the declaration file that mirrors your JS entry-point, the IDE will be able to traverse all the generated declaration files, no matter how many there are.
I'm not saying there are no benefits to bundling into a single declaration file; just that they are often outweighed by the added complexity and fragility introduced by forcing a non-standard bundling process.
If you really want this, there are some options available:
- Native TSC support:
  - This was kind of implemented in #5090, but **only** for `amd` or `system` modules
    - You literally cannot produce a bundled single `*.d.ts` declaration file with TSC if you are using anything other than `amd` or `system` - this will not work with `commonjs`
  - Possibly could be implemented at some point - proposal: #4433
  - If you are only using TSC for emitting types for JS (i.e., with `emitDeclarationOnly` on), you can actually get this to work; just use `outFile`
- Alternative tools and bundlers:
  - See mentions in the #4433 proposal thread
  - rollup-plugin-dts
  - dts-bundle-generator
  - Manual scripts and string manipulation
Local Package Testing and Loading
To try out your NPM package, you can link it to the project directory where you want to import the local version for trial; below are some options for how to accomplish this, as well as some tips:
- BEST: `yalc` CLI!!! (🙏) (see the example workflow after this list)
  - Fast, syntax is similar to `npm / yarn`, much better emulation of how an actual distributed package works
  - I have a cheat sheet / notes page for it
- `yarn link` or `npm link`
  - Pros: Fast, since it uses symbolic links to point to the source package location
  - Cons: Inconsistencies in how it works, stale data, issues with module resolution
- `install` to `package.json` with a local relative path (`file:` syntax)
  - Under dependencies, this might look something like `"my-lib": "file:../../my-lib"`
  - This works with both `yarn add` and `npm install`
- Using custom webpack resolvers (`resolve` config)
  - Usually done by using `alias`, and then mapping to the local path; see the official docs and this example
Private Distributed Testing and Loading
Symlink based tools work great for local testing, but what if you want to share a private development version of your package with coworkers, or projects hosted in other environments?
Although you can publish packages as private within an organization with a paid account, the easiest way to quickly share a development build of a package is by sharing the generated tarball.
Steps:
1. Generate the tarball archive by using `npm pack` in the root of the package you want to share
   - You can use `npm pack --dry-run` to preview what will be included in the package
   - Make sure the output `.tgz` file(s) are excluded from git and future NPM package builds
   - The filename of the final `.tgz` will be shown in the console after the command is run
2. Get the path of the tarball for sharing
   - A) If you are going to be sharing this across devices / networks, you need to host the tarball somewhere where others can access it. Options are endless in this regard; any host that supports public hotlinking will work.
     - If you are using Github for your project, you could upload the tarball to an issue or release to get a hosted URL
     - Some file sharing sites, like mediafire, support hotlinking
     - You can even upload the tarball directly in your console, using transfer.sh
     - You could also share the tarball with coworkers via Slack / Email / etc. - and they could download it to use locally (see step 3)
   - B) If you are going to be sharing this locally, you just need the filepath of the generated `*.tgz`. Get it, and copy it somewhere for easy reference.
     - Although NPM supports using relative paths, it is safest to just get the full path
3. In the project where you want to use the private package, add the previously generated tarball URL / path as a dependency
   - Neat: These methods work for both URLs and local paths
     - For local paths, use posix style (`/`, forward slashes) to avoid issues
     - For URLs, always include the `http(s)://` prefix
   - You can use the CLI to add it, with `npm install {tarball_URL_or_Path}`, or `yarn add {tarball_URL_or_Path}`
   - Alternatively, you can add the URL or path directly, by simply using:
     - `"my-package": "https://fake-file-host-not-real.com/my-package-1.0.0.tgz"`
     - `"my-package": "C:/temp/my-package/my-package-1.0.0.tgz"`
WARNING: With a hosted public URL (no login), this method is not truly 100% private. If you need something to rival NPM's paid private package hosting (which is completely private), then take a look at using private git repos as dependencies.
This approach is also suitable for local testing, if you need a replacement for the symlink based options.
Publishing
You can use `npm pack --dry-run` or `npm publish --dry-run` to get a preview of what files would be published. Or just `npm pack` to generate a local tarball (`.tgz`).
To quickly get a preview of package size, you can use:
`npm pack --dry-run 2>&1 | grep "size"`
If you want to exclude files from being distributed as part of your package, you have a few options:
- Use a `.gitignore` file
  - Downside: this also prevents files from showing up on Github / being tracked in Git
  - Not a good solution for excluding large files that you still want tracked in Git
- Use a `.npmignore` file
  - NOT RECOMMENDED, for several reasons:
    - It conflicts with the `.gitignore` file - NPM will not even look at that file if `.npmignore` exists.
    - This is dangerous because if you exclude a password file in your `.gitignore` but forget to duplicate the rule in your `.npmignore` file, it will be included in your package!
- Use `package.json` -> `"files": []` (docs)
  - This is the best option
  - You can move all the files you want to include into a separate folder, and then only include that folder. For example: `{ "files": [ "src/" ] }`
  - Syntax can be `folder/` or `folder`, but don't prefix folder names with `./` (e.g. no `./src`)
  - You can also use globs
Warning: Make sure you don't have any entries pointing at files/directories that are excluded from publishing! E.g., if you are only publishing `./dist`, then make sure you don't accidentally have an entry like `bin: "./src/cli.js"`
Misc Tips
- Be aware of built-in hooks, like `install`, as well as all the `pre` / `post` hooks (see the example after this list)
- Know about "scoped packages"
  - You can publish names with slashes in them, like `@joshuatz/right-pad`
  - These by default are scoped to private
    - Use `npm publish --access=public` to publish publicly
    - Thanks to this post for the tip
- Don't forget about `npm link` for local testing
  - In the package folder: `npm link`
  - In the other project you want to test the package with: `npm link {packageName}`
  - I personally prefer (and recommend) yalc as a better alternative for local testing
- If you are making both a CLI and a regular module, don't forget to have both fields:
  - `main` (points to index.js, or whatever your main module file is)
  - `bin` (points to cli.js, or whatever your CLI file(s) are)
- Just because you see a field in someone else's `package.json` file doesn't mean it is natively supported by the spec; it might be something special to be consumed by their build system
  - For example, the `browser` entry field (defines a different entry file for browser vs Node) is not part of the official spec, but is supported natively by many of the most popular build tools, such as Webpack and Browserify
- You can find a hosted JSON Schema file for `package.json` here
- The package `shx` is great for cross-OS commands directly in `package.scripts`
  - e.g. `"clean": "shx rm -rf dist && shx mkdir dist"`
Meta: Packages that Help with Package Building
| Package | What It Does |
| --- | --- |
| yalc | Streamlines local package usage / testing (alternative to `npm link`) |
| agadoo | Verifies that the final module code you generate is tree-shakeable |
| np | CLI (both interactive and non-interactive) to help with crafting a new package release. Does things like checking you are on the correct branch, reinstalling dependencies and running tests, and automatically opening a pre-filled release draft on Github after publish. |
| semantic-release | Tool for bumping versions and helping with the publishing pipeline. |
Also relevant: Elevator Pitches - Bundler
Troubleshooting
- Error: `cannot find module '___'` when trying to import a nested source file from your published / built package
  - Are you trying to import or require a file with a slashed path that does not exactly match the published package structure? Node's module resolution maps the root folder to the package's root in `node_modules`, regardless of where `package.main` points to.
  - See my section on "The Root Directory Issue"
- Error: `File __.d.ts is not a module` when trying to export / consume a declaration file that you generated (TypeScript)
  - Make sure you didn't combine `module: 'commonjs'` with `outFile`; these are incompatible (use `outDir` instead)