# Deno Documentation - Full Content

> This document contains the full content of the Deno documentation website.

# Compressing response bodies

URL: https://docs.deno.com/deploy/api/compression

Compressing the response body to save bandwidth is a common practice. To take some work off your shoulders, we built this capability directly into Deploy. Deno Deploy supports brotli and gzip compression. Compression is applied when all of the following conditions are met:

1. The request to your deployment has the [`Accept-Encoding`][accept-encoding] header set to either `br` (brotli) or `gzip`.
2. The response from your deployment includes the [`Content-Type`][content-type] header.
3. The provided content type is compressible; we use [this database](https://github.com/jshttp/mime-db/blob/master/db.json) to determine if the content type is compressible.
4. The response body size is greater than 20 bytes.

When Deploy compresses the response body, it sets the `Content-Encoding: gzip` or `Content-Encoding: br` header on the response, depending on the compression algorithm used.

### When is compression skipped?

Deno Deploy skips compression if:

- The response has the [`Content-Encoding`][content-encoding] header.
- The response has the [`Content-Range`][content-range] header.
- The response's [`Cache-Control`][cache-control] header includes the [`no-transform`][no-transform] directive (e.g. `cache-control: public, no-transform`).

### What happens to my `Etag` header?

When you set an `Etag` header on the response and we apply compression to your response body, we convert the header value to a weak `Etag`. If it is already a weak `Etag`, we don't touch the header.
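Taken together, the conditions and skip rules above can be sketched as a single predicate. This is a hypothetical illustration, not Deploy's actual implementation: the real compressibility check consults the mime-db database linked above, whereas the `COMPRESSIBLE` set here is a tiny stand-in.

```typescript
// Hypothetical stand-in for the mime-db compressibility lookup.
const COMPRESSIBLE = new Set(["text/html", "text/css", "application/json"]);

function shouldCompress(req: Request, res: Response, bodySize: number): boolean {
  const acceptEncoding = req.headers.get("accept-encoding") ?? "";
  const contentType = res.headers.get("content-type");
  const cacheControl = res.headers.get("cache-control") ?? "";
  return (
    // 1. The client accepts brotli or gzip.
    (acceptEncoding.includes("br") || acceptEncoding.includes("gzip")) &&
    // 2. + 3. A compressible content type is set.
    contentType !== null &&
    COMPRESSIBLE.has(contentType.split(";")[0].trim().toLowerCase()) &&
    // 4. The body is larger than 20 bytes.
    bodySize > 20 &&
    // Skip rules: already encoded, partial response, or marked no-transform.
    !res.headers.has("content-encoding") &&
    !res.headers.has("content-range") &&
    !cacheControl.includes("no-transform")
  );
}
```

Under these rules, a 100-byte `text/html` response to a request carrying `Accept-Encoding: gzip, br` would be compressed, while a 10-byte one, or one already carrying `Content-Encoding`, would not.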
[accept-encoding]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Accept-Encoding
[cache-control]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control
[content-encoding]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Encoding
[content-type]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Type
[no-transform]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control#other
[content-range]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Range

---

# Dynamic import

URL: https://docs.deno.com/deploy/api/dynamic-import

Deno Deploy supports [dynamic import], but with some limitations. This page outlines those limitations.

### Specifiers must be statically determined string literals

In the usual dynamic import, specifiers don't need to be determined at build time, so all of the following forms are valid:

```ts title="Valid dynamic imports in Deno CLI"
// 1. Statically determined string literal
await import("jsr:@std/assert");

// 2. Statically determined, but via variable
const specifier = "jsr:@std/assert";
await import(specifier);

// 3. Statically determined, but template literal
const stdModuleName = "path";
await import(`jsr:@std/${stdModuleName}`);

// 4. Dynamically determined
const rand = Math.random();
const mod = rand < 0.5 ? "npm:cowsay" : "npm:node-emoji";
await import(mod);
```

In Deno Deploy, however, specifiers must be string literals with no string interpolation, so of the four examples above, only the first one works in Deno Deploy.

```ts title="Only static string literals work in Deno Deploy"
// 1. ✅ Works fine on Deno Deploy
await import("jsr:@std/assert");

// 2. ❌ Doesn't work on Deno Deploy
// because what's passed to `import` is a variable
const specifier = "jsr:@std/streams";
await import(specifier);

// 3. ❌ Doesn't work on Deno Deploy
// because this has an interpolation
const stdModuleName = "path";
await import(`jsr:@std/${stdModuleName}`);

// 4. ❌ Doesn't work on Deno Deploy
// because it's dynamic
const rand = Math.random();
const mod = rand < 0.5 ? "npm:cowsay" : "npm:node-emoji";
await import(mod);
```

### One exception - dynamic specifiers work for same project files

Dynamically determined specifiers are supported if the target files (modules) are included in the same project.

```ts title="Dynamic specifiers work for files in the same project"
// ✅ Works fine on Deno Deploy
await import("./my_module1.ts");

// ✅ Works fine on Deno Deploy
const rand = Math.random();
const modPath = rand < 0.5 ? "dir1/moduleA.ts" : "dir2/dir3/moduleB.ts";
await import(`./${modPath}`);
```

Note that template literals starting with `./` tell the module resolver that the target module is in the same project. Conversely, if a specifier does not start with `./`, the possible target modules will not be included in the resulting [eszip], causing dynamic imports to fail at runtime, even if the final evaluated specifier starts with `./`.

```ts
// ❌ Doesn't work because the analyzer can't statically determine whether the
// specifier starts with `./` in this case.
// Compare this to the previous example. The only difference is whether the
// `./` prefix appears in the template literal or in the variable.
const rand = Math.random();
const modPath = rand < 0.5 ? "./dir1/moduleA.ts" : "./dir2/dir3/moduleB.ts";
await import(modPath);
```

We will consider relaxing this constraint in the future.

:::tip What is eszip?
When you do a new deployment on Deno Deploy, the system analyzes your code, constructs the module graph by recursively traversing it, and bundles all the dependencies into a single file. We call this [eszip](https://github.com/denoland/eszip). Since its creation is done completely statically, dynamic import capabilities on Deno Deploy are limited.
:::

### Data URLs

A [Data URL] can be used as a specifier passed to dynamic imports.
```ts title="Static data URL"
// ✅ Works fine on Deno Deploy
const { val } = await import(
  "data:text/javascript,export const val = 42;"
);
console.log(val); // -> 42
```

For data URLs, fully dynamic data is supported.

```ts title="Dynamic data URL"
function generateDynamicDataUrl() {
  const moduleStr = `export const val = ${Math.random()};`;
  return `data:text/javascript,${moduleStr}`;
}

// ✅ Works fine on Deno Deploy
const { val } = await import(generateDynamicDataUrl());
console.log(val); // -> Random value is printed
```

By applying this technique to JavaScript code fetched from the web, you can even simulate a true dynamic import:

```js title="external.js"
export const name = "external.js";
```

```ts title="Dynamic data URL from fetched source"
import { assert } from "jsr:@std/assert/assert";

const res = await fetch(
  "https://gist.githubusercontent.com/magurotuna/1cacb136f9fd6b786eb8bbad92c8e6d6/raw/56a96fd0d246fd3feabbeecea6ea1155bdf5f50d/external.js",
);
assert(res.ok);
const src = await res.text();
const dataUrl = `data:application/javascript,${src}`;

// ✅ Works fine on Deno Deploy
const { name } = await import(dataUrl);
console.log(`Hello from ${name}`); // -> "Hello from external.js"
```

However, note that the data URL given to `import` has to be JavaScript; if TypeScript is passed, a [TypeError] is thrown at runtime.

[dynamic import]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/import
[eszip]: https://github.com/denoland/eszip
[Data URL]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URLs
[TypeError]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypeError

---

# API Reference

URL: https://docs.deno.com/deploy/api/

This is a reference for runtime APIs available on Deno Deploy. This API is very similar to the standard [runtime API](/runtime/manual/runtime), but some APIs are not available in the same way, given that Deno Deploy is a serverless environment.
Please use this section of the documentation to explore the APIs available on Deno Deploy.

### Web APIs

- [`console`](https://developer.mozilla.org/en-US/docs/Web/API/console)
- [`atob`](https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/atob)
- [`btoa`](https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/btoa)
- [Fetch API](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API)
  - `fetch`
  - `Request`
  - `Response`
  - `URL`
  - `File`
  - `Blob`
- [TextEncoder](https://developer.mozilla.org/en-US/docs/Web/API/TextEncoder)
- [TextDecoder](https://developer.mozilla.org/en-US/docs/Web/API/TextDecoder)
- [TextEncoderStream](https://developer.mozilla.org/en-US/docs/Web/API/TextEncoderStream)
- [TextDecoderStream](https://developer.mozilla.org/en-US/docs/Web/API/TextDecoderStream)
- [Performance](https://developer.mozilla.org/en-US/docs/Web/API/Performance)
- [Web Crypto API](https://developer.mozilla.org/en-US/docs/Web/API/Crypto)
  - `randomUUID()`
  - `getRandomValues()`
  - [SubtleCrypto](https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto)
- [WebSocket API](https://developer.mozilla.org/en-US/docs/Web/API/WebSocket)
- [Timers](https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/setTimeout) (`setTimeout`, `clearTimeout`, and `setInterval`)
- [Streams API](https://developer.mozilla.org/en-US/docs/Web/API/Streams_API)
  - `ReadableStream`
  - `WritableStream`
  - `TransformStream`
- [URLPattern API](https://developer.mozilla.org/en-US/docs/Web/API/URLPattern)
- [Import Maps](https://docs.deno.com/runtime/manual/basics/import_maps/)
  - Note: `import maps` are currently only available via [deployctl](https://github.com/denoland/deployctl) or [deployctl GitHub Action](https://github.com/denoland/deployctl/blob/main/action/README.md) workflows.

### Deno APIs

> Note: only stable APIs of Deno are made available in Deploy.
- [`Deno.env`](https://docs.deno.com/api/deno/~/Deno.env) - Interact with environment variables (secrets).
  - `get(key: string): string | undefined` - get the value of an environment variable.
  - `toObject(): { [key: string]: string }` - get all environment variables as an object.
- [`Deno.connect`](https://docs.deno.com/api/deno/~/Deno.connect) - Connect to TCP sockets.
- [`Deno.connectTls`](https://docs.deno.com/api/deno/~/Deno.connectTls) - Connect to TCP sockets using TLS.
- [`Deno.startTls`](https://docs.deno.com/api/deno/~/Deno.startTls) - Start a TLS handshake from an existing TCP connection.
- [`Deno.resolveDns`](https://docs.deno.com/api/deno/~/Deno.resolveDns) - Make DNS queries.
- File system APIs:
  - [`Deno.cwd`](https://docs.deno.com/api/deno/~/Deno.cwd) - Get the current working directory.
  - [`Deno.readDir`](https://docs.deno.com/api/deno/~/Deno.readDir) - Get directory listings.
  - [`Deno.readFile`](https://docs.deno.com/api/deno/~/Deno.readFile) - Read a file into memory.
  - [`Deno.readTextFile`](https://docs.deno.com/api/deno/~/Deno.readTextFile) - Read a text file into memory.
  - [`Deno.open`](https://docs.deno.com/api/deno/~/Deno.open) - Open a file for streaming reading.
  - [`Deno.stat`](https://docs.deno.com/api/deno/~/Deno.stat) - Get file system entry information.
  - [`Deno.lstat`](https://docs.deno.com/api/deno/~/Deno.lstat) - Get file system entry information without following symlinks.
  - [`Deno.realPath`](https://docs.deno.com/api/deno/~/Deno.realPath) - Get the real path of a file after resolving symlinks.
  - [`Deno.readLink`](https://docs.deno.com/api/deno/~/Deno.readLink) - Get the target path for the given symlink.

## Future support

In the future, these APIs will also be added:

- [Cache API](https://developer.mozilla.org/en-US/docs/Web/API/Cache)
- UDP API:
  - `Deno.connectDatagram` for outbound UDP sockets
- Customizable `fetch` options using `Deno.createHttpClient`

## Limitations

Just like the Deno CLI, we do not implement the `__proto__` object field as
specified in ECMAScript Annex B.

---

# BroadcastChannel

URL: https://docs.deno.com/deploy/api/runtime-broadcast-channel

In Deno Deploy, code is run in different data centers around the world in order to reduce latency by servicing requests at the data center nearest to the client. In the browser, the [`BroadcastChannel`](https://developer.mozilla.org/en-US/docs/Web/API/Broadcast_Channel_API) API allows different tabs with the same origin to exchange messages. In Deno Deploy, the `BroadcastChannel` API provides a communication mechanism between the various instances: a simple message bus that connects the various Deploy instances worldwide.

## Constructor

The `BroadcastChannel()` constructor creates a new `BroadcastChannel` instance and connects to (or creates) the provided channel.

```ts
let channel = new BroadcastChannel(channelName);
```

#### Parameters

| name | type | description |
| --- | --- | --- |
| channelName | `string` | The name for the underlying broadcast channel connection. |

The return type of the constructor is a `BroadcastChannel` instance.

## Properties

| name | type | description |
| --- | --- | --- |
| `name` | `string` | The name of the underlying broadcast channel. |
| `onmessage` | `function` (or `null`) | The function that's executed when the channel receives a new message ([`MessageEvent`][messageevent]). |
| `onmessageerror` | `function` (or `null`) | The function that's executed when an arriving message cannot be deserialized to a JavaScript data structure. |

## Methods

| name | description |
| --- | --- |
| `close()` | Close the connection to the underlying channel. After closing, you can no longer post messages to the channel. |
| `postMessage(message)` | Post a message to the underlying channel. The message can be a string, object literal, a number, or any kind of [`Object`][object]. |

`BroadcastChannel` extends [`EventTarget`][eventtarget], which allows you to use `EventTarget` methods like `addEventListener` and `removeEventListener` on an instance of `BroadcastChannel`.

## Example: Update an in-memory cache across instances

One use case for a message bus like the one enabled by `BroadcastChannel` is updating an in-memory cache of data between isolates running in different data centers across the network. In the example below, we show how you can configure a simple server that uses `BroadcastChannel` to synchronize state across all running instances of the server.

```ts
import { Hono } from "jsr:@hono/hono";

// in-memory cache of messages
const messages = [];

// A BroadcastChannel used by all isolates
const channel = new BroadcastChannel("all_messages");

// When a new message comes in from other instances, add it
channel.onmessage = (event: MessageEvent) => {
  messages.push(event.data);
};

// Create a server to add and retrieve messages
const app = new Hono();

// Add a message to the list
app.get("/send", (c) => {
  // New messages can be added by including a "message" query param
  const message = c.req.query("message");
  if (message) {
    messages.push(message);
    channel.postMessage(message);
  }
  return c.redirect("/");
});

// Get a list of messages
app.get("/", (c) => {
  // Return the current list of messages
  return c.json(messages);
});

Deno.serve(app.fetch);
```

You can test this example yourself on Deno Deploy using [this playground](https://dash.deno.com/playground/broadcast-channel-example).
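As a smaller, self-contained illustration of the API shape (channel name, `postMessage`, `onmessage`, `close`), the sketch below sends one structured message between two channel instances in the same process; on Deploy, the receiving instance could just as well be an isolate in another data center. The channel name `cache_updates` and the message payload are illustrative, not from the docs above.

```javascript
// Two instances bound to the same channel name can exchange messages.
const sender = new BroadcastChannel("cache_updates");
const receiver = new BroadcastChannel("cache_updates");

const received = new Promise((resolve) => {
  receiver.onmessage = (event) => resolve(event.data);
});

// Messages are structured-cloned, so plain objects survive the trip intact.
sender.postMessage({ key: "user:42", action: "invalidate" });

const message = await received;
console.log(message.key, message.action);

// Close both ends; after close(), posting to the channel is no longer allowed.
sender.close();
receiver.close();
```

Note that, per the `BroadcastChannel` spec, a channel instance does not receive its own messages; that is why the example uses a separate `receiver` instance.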
[eventtarget]: https://developer.mozilla.org/en-US/docs/Web/API/EventTarget
[messageevent]: https://developer.mozilla.org/en-US/docs/Web/API/MessageEvent
[object]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object

---

# HTTP requests (fetch)

URL: https://docs.deno.com/deploy/api/runtime-fetch

The [Fetch API](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) allows you to make outbound HTTP requests in Deno Deploy. It is a web standard and has the following interfaces:

- `fetch()` - The method that allows you to make outbound HTTP requests
- [`Request`](./runtime-request) - represents a request resource of fetch()
- [`Response`](./runtime-response) - represents a response resource of fetch()
- [`Headers`](./runtime-headers) - represents the HTTP headers of requests and responses

This page shows usage for the fetch() method. You can click on the other interfaces above to learn more about them.

Fetch also supports fetching from file URLs to retrieve static files. For more info on static files, see the [filesystem API documentation](./runtime-fs).

## `fetch()`

The `fetch()` method initiates a network request to the provided resource and returns a promise that resolves once the response is available.

```ts
function fetch(
  resource: Request | string,
  init?: RequestInit,
): Promise<Response>;
```

#### Parameters

| name | type | optional | description |
| --- | --- | --- | --- |
| resource | [`Request`](./runtime-request) or [`USVString`][usvstring] | `false` | The resource can either be a request object or a URL string. |
| init | [`RequestInit`](./runtime-request#requestinit) | `true` | The init object lets you apply optional parameters to the request. |

The return type of `fetch()` is a promise that resolves to a [`Response`](./runtime-response).

## Examples

The Deno Deploy script below makes a `fetch()` request to the GitHub API for each incoming request, and then returns that response from the handler function.

```ts
async function handler(req: Request): Promise<Response> {
  const resp = await fetch("https://api.github.com/users/denoland", {
    // The init object here has a headers object containing a
    // header that indicates what type of response we accept.
    // We're not specifying the method field since by default
    // fetch makes a GET request.
    headers: {
      accept: "application/json",
    },
  });

  return new Response(resp.body, {
    status: resp.status,
    headers: {
      "content-type": "application/json",
    },
  });
}

Deno.serve(handler);
```

[usvstring]: https://developer.mozilla.org/en-US/docs/Web/API/USVString

---

# File system APIs

URL: https://docs.deno.com/deploy/api/runtime-fs

Deno Deploy supports a limited set of the file system APIs available in Deno. These file system APIs can access static files from your deployments. Static files are, for example:

- The files in your GitHub repository, if you deploy via the GitHub integration.
- The entrypoint file in a playground deployment.

The available APIs are:

- [Deno.cwd](#deno.cwd)
- [Deno.readDir](#deno.readdir)
- [Deno.readFile](#deno.readfile)
- [Deno.readTextFile](#deno.readtextfile)
- [Deno.open](#deno.open)
- [Deno.stat](#deno.stat)
- [Deno.lstat](#deno.lstat)
- [Deno.realPath](#deno.realpath)
- [Deno.readLink](#deno.readlink)

## Deno.cwd

`Deno.cwd()` returns the current working directory of your deployment, which is the root directory of your deployment.
For example, if you deployed via the GitHub integration, the current working directory is the root of your GitHub repository.

## Deno.readDir

`Deno.readDir()` allows you to list the contents of a directory. The function is fully compatible with [Deno](https://docs.deno.com/api/deno/~/Deno.readDir).

```ts
function Deno.readDir(path: string | URL): AsyncIterable<Deno.DirEntry>
```

The path can be relative or absolute. It can also be a `file:` URL.

### Example

This example lists the contents of a directory and returns the list as a JSON object in the response body.

```js
async function handler(_req) {
  // List the posts in the `blog` directory located at the root
  // of the repository.
  const posts = [];
  for await (const post of Deno.readDir(`./blog`)) {
    posts.push(post);
  }

  // Return JSON.
  return new Response(JSON.stringify(posts, null, 2), {
    headers: {
      "content-type": "application/json",
    },
  });
}

Deno.serve(handler);
```

## Deno.readFile

`Deno.readFile()` allows you to read a file fully into memory. The function definition is similar to [Deno](https://docs.deno.com/api/deno/~/Deno.readFile), but it doesn't support [`ReadFileOptions`](https://docs.deno.com/api/deno/~/Deno.ReadFileOptions) for the time being. Support will be added in the future.

```ts
function Deno.readFile(path: string | URL): Promise<Uint8Array>
```

The path can be relative or absolute. It can also be a `file:` URL.

### Example

This example reads the contents of a file into memory as a byte array, then returns it as the response body.

```js
async function handler(_req) {
  // Let's read the README.md file available at the root
  // of the repository to explore the available methods.

  // Relative paths are relative to the root of the repository.
  const readmeRelative = await Deno.readFile("./README.md");
  // Absolute paths.
  // The content of the repository is available under Deno.cwd().
  const readmeAbsolute = await Deno.readFile(`${Deno.cwd()}/README.md`);
  // File URLs are also supported.
  const readmeFileUrl = await Deno.readFile(
    new URL(`file://${Deno.cwd()}/README.md`),
  );

  // Decode the Uint8Array as a string.
  const readme = new TextDecoder().decode(readmeRelative);
  return new Response(readme);
}

Deno.serve(handler);
```

> Note: to use this feature, you must link a GitHub repository to your project.

Deno Deploy supports the `Deno.readFile` API to read static assets from the file system. This is useful for serving static assets such as images, stylesheets, and JavaScript files. This guide demonstrates how to use this feature.

Imagine the following file structure in a GitHub repository:

```console
├── mod.ts
└── style.css
```

The contents of `mod.ts`:

```ts
async function handleRequest(request: Request): Promise<Response> {
  const { pathname } = new URL(request.url);

  // This is how the server works:
  // 1. A request comes in for a specific asset.
  // 2. We read the asset from the file system.
  // 3. We send the asset back to the client.

  // Check if the request is for style.css.
  if (pathname.startsWith("/style.css")) {
    // Read the style.css file from the file system.
    const file = await Deno.readFile("./style.css");
    // Respond to the request with the style.css file.
    return new Response(file, {
      headers: {
        "content-type": "text/css",
      },
    });
  }

  return new Response(
    `

<html>
  <head>
    <link rel="stylesheet" href="style.css" />
  </head>
  <body>
    <h1>Example</h1>
  </body>
</html>

`,
    {
      headers: {
        "content-type": "text/html; charset=utf-8",
      },
    },
  );
}

Deno.serve(handleRequest);
```

The path provided to the [`Deno.readFile`](https://docs.deno.com/api/deno/~/Deno.readFile) API is relative to the root of the repository. You can also specify absolute paths, as long as they are inside `Deno.cwd()`.

## Deno.readTextFile

This function is similar to [Deno.readFile](#deno.readfile), except it decodes the file contents as a UTF-8 string.

```ts
function Deno.readTextFile(path: string | URL): Promise<string>
```

### Example

This example reads a text file into memory and returns the contents as the response body.

```js
async function handler(_req) {
  const readme = await Deno.readTextFile("./README.md");
  return new Response(readme);
}

Deno.serve(handler);
```

## Deno.open

`Deno.open()` allows you to open a file, returning a file handle. This file handle can then be used to read the contents of the file. See [`Deno.File`](#deno.file) for information on the methods available on the file handle. The function definition is similar to [Deno](https://docs.deno.com/api/deno/~/Deno.open), but it doesn't support [`OpenOptions`](https://docs.deno.com/api/deno/~/Deno.OpenOptions) for the time being. Support will be added in the future.

```ts
function Deno.open(path: string | URL): Promise<Deno.File>
```

The path can be relative or absolute. It can also be a `file:` URL.

### Example

This example opens a file, and then streams the content as the response body.

```js
async function handler(_req) {
  // Open the README.md file available at the root of the repository.
  const file = await Deno.open("./README.md");

  // Use the `readable` property, which is a `ReadableStream`. This will
  // automatically close the file handle when the response is done sending.
  return new Response(file.readable);
}

Deno.serve(handler);
```

:::note
When you iterate over a file stream as shown below, the file descriptor will be automatically closed at the end of iteration.
There is no need to manually close the file descriptor:
`const iterator = fd.readable[Symbol.asyncIterator]();`
:::

## Deno.File

`Deno.File` is a file handle returned from [`Deno.open()`](#deno.open). It can be used to read chunks of the file using the `read()` method. The file handle can be closed using the `close()` method. The interface is similar to [Deno](https://docs.deno.com/api/deno/~/Deno.File), but it doesn't support writing to the file, or seeking. Support for the latter will be added in the future.

```ts
class File {
  readonly rid: number;

  close(): void;
  read(p: Uint8Array): Promise<number | null>;
}
```

### Deno.File#read()

The read method is used to read a chunk of the file. It should be passed a buffer to read the data into. It returns the number of bytes read, or `null` if the end of the file has been reached.

```ts
function read(p: Uint8Array): Promise<number | null>;
```

### Deno.File#close()

The close method is used to close the file handle. Closing the handle will interrupt all ongoing reads.

```ts
function close(): void;
```

## Deno.stat

`Deno.stat()` reads a file system entry's metadata. It returns a [`Deno.FileInfo`](#deno.fileinfo) object. Symlinks are followed. The function definition is the same as [Deno](https://docs.deno.com/api/deno/~/Deno.stat). It does not return modification time, access time, or creation time values.

```ts
function Deno.stat(path: string | URL): Promise<Deno.FileInfo>
```

The path can be relative or absolute. It can also be a `file:` URL.

### Example

This example gets the size of a file, and returns the result as the response body.

```js
async function handler(_req) {
  // Get file info of the README.md at the root of the repository.
  const info = await Deno.stat("./README.md");

  // Get the size of the file in bytes.
  const size = info.size;

  return new Response(`README.md is ${size} bytes large`);
}

Deno.serve(handler);
```

## Deno.lstat

`Deno.lstat()` is similar to `Deno.stat()`, but it does not follow symlinks. The function definition is the same as [Deno](https://docs.deno.com/api/deno/~/Deno.lstat). It does not return modification time, access time, or creation time values.

```ts
function Deno.lstat(path: string | URL): Promise<Deno.FileInfo>
```

The path can be relative or absolute. It can also be a `file:` URL.

## Deno.FileInfo

The `Deno.FileInfo` interface is used to represent a file system entry's metadata. It is returned by the [`Deno.stat()`](#deno.stat) and [`Deno.lstat()`](#deno.lstat) functions. It can represent either a file, a directory, or a symlink. In Deno Deploy, only the file type and size properties are available. The size property behaves the same way it does on Linux.

```ts
interface FileInfo {
  isDirectory: boolean;
  isFile: boolean;
  isSymlink: boolean;
  size: number;
}
```

## Deno.realPath

`Deno.realPath()` returns the resolved absolute path to a file after following symlinks. The function definition is the same as [Deno](https://docs.deno.com/api/deno/~/Deno.realPath).

```ts
function Deno.realPath(path: string | URL): Promise<string>
```

The path can be relative or absolute. It can also be a `file:` URL.

### Example

This example calls `Deno.realPath()` to get the absolute path of a file in the root of the repository. The result is returned as the response body.

```ts
async function handler(_req) {
  const path = await Deno.realPath("./README.md");

  return new Response(`The fully resolved path for ./README.md is ${path}`);
}

Deno.serve(handler);
```

## Deno.readLink

`Deno.readLink()` returns the target path for a symlink. The function definition is the same as [Deno](https://docs.deno.com/api/deno/~/Deno.readLink).

```ts
function Deno.readLink(path: string | URL): Promise<string>
```

The path can be relative or absolute. It can also be a `file:` URL.
### Example

This example calls `Deno.readLink()` to get the target path of a symlink in the root of the repository. The result is returned as the response body.

```ts
async function handler(_req) {
  const path = await Deno.readLink("./my_symlink");

  return new Response(`The target path for ./my_symlink is ${path}`);
}

Deno.serve(handler);
```

---

# HTTP Headers

URL: https://docs.deno.com/deploy/api/runtime-headers

The [Headers](https://developer.mozilla.org/en-US/docs/Web/API/Headers) interface is part of the Fetch API. It allows you to create and manipulate the HTTP headers of request and response resources of fetch().

- [Constructor](#constructor)
- [Parameters](#parameters)
- [Methods](#methods)
- [Example](#example)

## Constructor

The Headers() constructor creates a new `Headers` instance.

```ts
let headers = new Headers(init);
```

#### Parameters

| name | type | optional | description |
| --- | --- | --- | --- |
| init | `Headers` / `{ [key: string]: string }` | `true` | The init option lets you initialize the headers object with an existing `Headers` instance or an object literal. |

The return type of the constructor is a `Headers` instance.

## Methods

| name | description |
| --- | --- |
| `append(name: string, value: string)` | Appends a new value onto an existing header in the Headers object, or adds the header if it does not already exist. |
| `delete(name: string)` | Deletes a header from the Headers object. |
| `set(name: string, value: string)` | Sets a new value for an existing header in the Headers object (overwriting it), or adds the header if it does not already exist. |
| `get(name: string)` | Gets the value of a header in the Headers object. |
| `has(name: string)` | Checks if the header exists in the Headers object. |
| `entries()` | Gets the headers as key-value pairs. The result is iterable. |
| `keys()` | Gets all the keys of the Headers object. The result is iterable. |

## Example

```ts
// Create a new headers object from an object literal.
const myHeaders = new Headers({
  accept: "application/json",
});

// Append a header to the headers object.
myHeaders.append("user-agent", "Deno Deploy");

// Print the headers of the headers object.
for (const [key, value] of myHeaders.entries()) {
  console.log(key, value);
}

// You can pass the headers instance to Response or Request constructors.
const request = new Request("https://api.github.com/users/denoland", {
  method: "POST",
  headers: myHeaders,
});
```

---

# Node.js built-in APIs

URL: https://docs.deno.com/deploy/api/runtime-node

Deno Deploy natively supports importing built-in Node.js modules like `fs`, `path`, and `http` through `node:` specifiers. This allows code originally written for Node.js to run on Deno Deploy without changes. Here is an example of a Node.js HTTP server running on Deno Deploy:

```js
import { createServer } from "node:http";
import process from "node:process";

const server = createServer((req, res) => {
  const message = `Hello from ${process.env.DENO_REGION} at ${new Date()}`;
  res.end(message);
});

server.listen(8080);
```

_You can see this example live here: https://dash.deno.com/playground/node-specifiers_

When using `node:` specifiers, all other features of Deno Deploy are still available. For example, you can use `Deno.env` to access environment variables even when using Node.js modules. You can also import other ESM modules from external URLs as usual.
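As a smaller, self-contained illustration, the snippet below imports two more built-ins via `node:` specifiers; they behave just as they do in Node.js. The specific modules and values here are illustrative, not taken from the docs above.

```javascript
import { createHash } from "node:crypto";
import { posix } from "node:path";

// `node:crypto` hashing works the same way it does in Node.js.
export function sha256Hex(input) {
  return createHash("sha256").update(input).digest("hex");
}

// `node:path` utilities are available too.
export const postPath = posix.join("blog", "posts", "hello.md");

console.log(sha256Hex("hello").slice(0, 8)); // "2cf24dba"
console.log(postPath); // "blog/posts/hello.md"
```

Because the resolution happens per specifier, such imports can sit alongside `jsr:`/`npm:` imports and Deno globals in the same module.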
The following Node.js modules are available: - `assert` - `assert/strict` - `async_hooks` - `buffer` - `child_process` - `cluster` - `console` - `constants` - `crypto` - `dgram` - `diagnostics_channel` - `dns` - `dns/promises` - `domain` - `events` - `fs` - `fs/promises` - `http` - `http2` - `https` - `module` - `net` - `os` - `path` - `path/posix` - `path/win32` - `perf_hooks` - `process` - `punycode` - `querystring` - `readline` - `stream` - `stream/consumers` - `stream/promises` - `stream/web` - `string_decoder` - `sys` - `timers` - `timers/promises` - `tls` - `tty` - `url` - `util` - `util/types` - `v8` - `vm` - `worker_threads` - `zlib` The behavior of these modules should be identical to Node.js in most cases. Due to the sandboxing behaviour of Deno Deploy, some features are not available: - Executing binaries with `child_process` - Spawning workers using `worker_threads` - Creating contexts and evaluating code with `vm` > Note: the emulation of Node.js modules is sufficient for most use cases, but > it is not yet perfect. If you encounter any issues, please > [open an issue](https://github.com/denoland/deno). --- # HTTP Request URL: https://docs.deno.com/deploy/api/runtime-request The [Request](https://developer.mozilla.org/en-US/docs/Web/API/Request) interface is part of the Fetch API and represents the request of fetch(). - [Constructor](#constructor) - [Parameters](#parameters) - [Properties](#properties) - [Methods](#methods) - [Example](#example) ## Constructor The Request() constructor creates a new Request instance. ```ts let request = new Request(resource, init); ``` #### Parameters | name | type | optional | description | | -------- | ----------------------------- | -------- | ------------------------------------------------------------------------- | | resource | `Request` or `USVString` | `false` | The resource can either be a request object or a URL string. 
| | init | [`RequestInit`](#requestinit) | `true` | The init object lets you set optional parameters to apply to the request. | The return type is a `Request` instance. ##### `RequestInit` | name | type | default | description | | ---- | ---- | ------- | ----------- | | [`method`][method] | `string` | `GET` | The method of the request. | | [`headers`][headers] | `Headers` or `{ [key: string]: string }` | none | The `Headers` for the request. | | [`body`][body] | `Blob`, `BufferSource`, `FormData`, `URLSearchParams`, `USVString`, or `ReadableStream` | none | The body of the request. | | [`cache`][cache] | `string` | none | The cache mode of the request. | | [`credentials`][credentials] | `string` | `same-origin` | The credentials mode of the request. | | [`integrity`][integrity] | `string` | none | The cryptographic hash (subresource integrity value) of the request. | | [`mode`][mode] | `string` | `cors` | The request mode you want to use. | | [`redirect`][redirect] | `string` | `follow` | The mode of how redirects are handled. | | [`referrer`][referrer] | `string` | `about:client` | A `USVString` specifying `no-referrer`, `client`, or a URL. | ## Properties | name | type | description | | ---- | ---- | ----------- | | [`cache`][cache] | `string` | The cache mode (`default`, `no-cache`, etc.) indicating how the request should be cached by the browser. | | [`credentials`][credentials] | `string` | The credentials mode (`omit`, `same-origin`, etc.) indicating whether the user agent should send cookies in the case of CORS requests. | | [`destination`][destination] | [`RequestDestination`][requestdestination] | The string indicates the type of content being requested. 
| | [`body`][body] | [`ReadableStream`][readablestream] | The getter exposes a `ReadableStream` of the body contents. | | [`bodyUsed`][bodyused] | `boolean` | Indicates whether the body content has been read. | | [`url`][url] | `USVString` | The URL of the request. | | [`headers`][headers] | [`Headers`](runtime-headers) | The headers associated with the request. | | [`integrity`][integrity] | `string` | The cryptographic hash (subresource integrity value) of the request. | | [`method`][method] | `string` | The request's method (`POST`, `GET`, etc.). | | [`mode`][mode] | `string` | Indicates the mode of the request (e.g. `cors`). | | [`redirect`][redirect] | `string` | The mode of how redirects are handled. | | [`referrer`][referrer] | `string` | The referrer of the request. | | [`referrerPolicy`][referrerpolicy] | `string` | The referrer policy of the request. | All the above properties are read only. ## Methods | name | description | | ---- | ----------- | | [`arrayBuffer()`][arraybuffer] | Reads the body stream to its completion and returns an `ArrayBuffer` object. | | [`blob()`][blob] | Reads the body stream to its completion and returns a `Blob` object. | | [`formData()`][formdata] | Reads the body stream to its completion and returns a `FormData` object. | | [`json()`][json] | Reads the body stream to its completion, parses it as JSON and returns a JavaScript object. | | [`text()`][text] | Reads the body stream to its completion and returns a `USVString` object (text). | | [`clone()`][clone] | Clones the Request object. 
| ## Example ```ts function handler(_req) { // Create a post request const request = new Request("https://post.deno.dev", { method: "POST", body: JSON.stringify({ message: "Hello world!", }), headers: { "content-type": "application/json", }, }); console.log(request.method); // POST console.log(request.headers.get("content-type")); // application/json return fetch(request); } Deno.serve(handler); ``` [cache]: https://developer.mozilla.org/en-US/docs/Web/API/Request/cache [credentials]: https://developer.mozilla.org/en-US/docs/Web/API/Request/credentials [destination]: https://developer.mozilla.org/en-us/docs/web/api/request/destination [requestdestination]: https://developer.mozilla.org/en-US/docs/Web/API/RequestDestination [body]: https://developer.mozilla.org/en-US/docs/Web/API/Body/body [bodyused]: https://developer.mozilla.org/en-US/docs/Web/API/Body/bodyUsed [url]: https://developer.mozilla.org/en-US/docs/Web/API/Request/url [headers]: https://developer.mozilla.org/en-US/docs/Web/API/Request/headers [method]: https://developer.mozilla.org/en-US/docs/Web/API/Request/method [integrity]: https://developer.mozilla.org/en-US/docs/Web/API/Request/integrity [mode]: https://developer.mozilla.org/en-US/docs/Web/API/Request/mode [redirect]: https://developer.mozilla.org/en-US/docs/Web/API/Request/redirect [referrer]: https://developer.mozilla.org/en-US/docs/Web/API/Request/referrer [referrerpolicy]: https://developer.mozilla.org/en-US/docs/Web/API/Request/referrerpolicy [readablestream]: https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream [arraybuffer]: https://developer.mozilla.org/en-US/docs/Web/API/Body/arrayBuffer [blob]: https://developer.mozilla.org/en-US/docs/Web/API/Body/blob [json]: https://developer.mozilla.org/en-US/docs/Web/API/Body/json [text]: https://developer.mozilla.org/en-US/docs/Web/API/Body/text [formdata]: https://developer.mozilla.org/en-US/docs/Web/API/Body/formdata [clone]: https://developer.mozilla.org/en-US/docs/Web/API/Request/clone 
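One detail the methods table implies but the example above does not show: the body-reading methods consume the request's stream, so a body can only be read once; `clone()` makes a second readable copy. A short sketch using only standard Fetch API objects:

```javascript
// Reading a Request body consumes its stream; clone() first if you
// need to read the body more than once.
const request = new Request("https://post.deno.dev", {
  method: "POST",
  headers: { "content-type": "application/json" },
  body: JSON.stringify({ message: "Hello world!" }),
});

const copy = request.clone(); // must be cloned before the body is read

const data = await request.json(); // parses the JSON body
console.log(data.message); // "Hello world!"
console.log(request.bodyUsed); // true: the original stream is consumed

const text = await copy.text(); // the clone's body is still unread
console.log(JSON.parse(text).message); // "Hello world!"
```

Calling `request.json()` a second time on the original would reject, since `bodyUsed` is already `true`.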
--- # HTTP Response URL: https://docs.deno.com/deploy/api/runtime-response The [Response](https://developer.mozilla.org/en-US/docs/Web/API/Response) interface is part of the Fetch API and represents a response resource of fetch(). - [Constructor](#constructor) - [Parameters](#parameters) - [Properties](#properties) - [Methods](#methods) - [Example](#example) ## Constructor The Response() constructor creates a new Response instance. ```ts let response = new Response(body, init); ``` #### Parameters | name | type | optional | description | | ---- | ---- | -------- | ----------- | | body | `Blob`, `BufferSource`, `FormData`, `ReadableStream`, `URLSearchParams`, or `USVString` | `true` | The body of the response. The default value is `null`. | | init | `ResponseInit` | `true` | An optional object that allows setting status and headers of the response. | The return type is a `Response` instance. ##### `ResponseInit` | name | type | optional | description | | ---- | ---- | -------- | ----------- | | `status` | `number` | `true` | The status code of the response. | | `statusText` | `string` | `true` | The status message representative of the status code. | | `headers` | `Headers`, `string[][]`, or `Record<string, string>` | `true` | The HTTP headers of the response. | ## Properties | name | type | read only | description | | ---- | ---- | --------- | ----------- | | [`body`][body] | `ReadableStream` | `true` | The getter exposes a `ReadableStream` of the body contents. | | [`bodyUsed`][bodyused] | `boolean` | `true` | Indicates whether the body content has been read. | | [`url`][url] | `USVString` | `true` | The URL of the response. 
| | [`headers`][headers] | `Headers` | `true` | The headers associated with the response. | | [`ok`][ok] | `boolean` | `true` | Indicates if the response is successful (200-299 status). | | [`redirected`][redirected] | `boolean` | `true` | Indicates if the response is the result of a redirect. | | [`status`][status] | `number` | `true` | The status code of the response | | [`statusText`][statustext] | `string` | `true` | The status message of the response | | [`type`][type] | `string` | `true` | The type of the response. | ## Methods | name | description | | ---------------------------------------------------- | ------------------------------------------------------------------------------------------- | | [`arrayBuffer()`][arraybuffer] | Reads the body stream to its completion and returns an `ArrayBuffer` object. | | [`blob()`][blob] | Reads the body stream to its completion and returns a `Blob` object. | | [`formData()`][formdata] | Reads the body stream to its completion and returns a `FormData` object. | | [`json()`][json] | Reads the body stream to its completion, parses it as JSON and returns a JavaScript object. | | [`text()`][text] | Reads the body stream to its completion and returns a USVString object (text). | | [`clone()`][clone] | Clones the response object. | | [`error()`][error] | Returns a new response object associated with a network error. | | [`redirect(url: string, status?: number)`][redirect] | Creates a new response that redirects to the provided URL. | ## Example ```ts function handler(_req) { // Create a response with html as its body. 
const response = new Response("<h1>Hello</h1>", { status: 200, headers: { "content-type": "text/html", }, }); console.log(response.status); // 200 console.log(response.headers.get("content-type")); // text/html return response; } Deno.serve(handler); ``` [clone]: https://developer.mozilla.org/en-US/docs/Web/API/Response/clone [error]: https://developer.mozilla.org/en-US/docs/Web/API/Response/error [redirect]: https://developer.mozilla.org/en-US/docs/Web/API/Response/redirect [body]: https://developer.mozilla.org/en-US/docs/Web/API/Body/body [bodyused]: https://developer.mozilla.org/en-US/docs/Web/API/Body/bodyUsed [url]: https://developer.mozilla.org/en-US/docs/Web/API/Request/url [headers]: https://developer.mozilla.org/en-US/docs/Web/API/Request/headers [ok]: https://developer.mozilla.org/en-US/docs/Web/API/Response/ok [redirected]: https://developer.mozilla.org/en-US/docs/Web/API/Response/redirected [status]: https://developer.mozilla.org/en-US/docs/Web/API/Response/status [statustext]: https://developer.mozilla.org/en-US/docs/Web/API/Response/statusText [type]: https://developer.mozilla.org/en-US/docs/Web/API/Response/type [method]: https://developer.mozilla.org/en-US/docs/Web/API/Request/method [readablestream]: https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream [arraybuffer]: https://developer.mozilla.org/en-US/docs/Web/API/Body/arrayBuffer [blob]: https://developer.mozilla.org/en-US/docs/Web/API/Body/blob [json]: https://developer.mozilla.org/en-US/docs/Web/API/Body/json [text]: https://developer.mozilla.org/en-US/docs/Web/API/Body/text [formdata]: https://developer.mozilla.org/en-US/docs/Web/API/Body/formdata --- # TCP sockets and TLS URL: https://docs.deno.com/deploy/api/runtime-sockets Deno Deploy supports outbound TCP and TLS connections. These APIs allow you to use databases like PostgreSQL, SQLite, MongoDB, etc., with Deploy. Looking for information on _serving_ TCP? 
Take a look at the documentation for [`Deno.serve`](/api/deno/~/Deno.serve) including its support for [TCP options](/api/deno/~/Deno.ServeTcpOptions). ## `Deno.connect` Make outbound TCP connections. The function definition is the same as [Deno](https://docs.deno.com/api/deno/~/Deno.connect) with the limitation that the `transport` option can only be `tcp` and `hostname` cannot be localhost or empty. ```ts function Deno.connect(options: ConnectOptions): Promise<TcpConn> ``` ### Example ```js async function handler(_req) { // Make a TCP connection to example.com const connection = await Deno.connect({ port: 80, hostname: "example.com", }); // Send a raw HTTP GET request. const request = new TextEncoder().encode( "GET / HTTP/1.1\r\nHost: example.com\r\n\r\n", ); const _bytesWritten = await connection.write(request); // Read 15 bytes from the connection. const buffer = new Uint8Array(15); await connection.read(buffer); connection.close(); // Return the bytes as plain text. return new Response(buffer, { headers: { "content-type": "text/plain;charset=utf-8", }, }); } Deno.serve(handler); ``` ## `Deno.connectTls` Make outbound TLS connections. The function definition is the same as [Deno](https://docs.deno.com/api/deno/~/Deno.connectTls) with the limitation that hostname cannot be localhost or empty. ```ts function Deno.connectTls(options: ConnectTlsOptions): Promise<TlsConn> ``` ### Example ```js async function handler(_req) { // Make a TLS connection to example.com const connection = await Deno.connectTls({ port: 443, hostname: "example.com", }); // Send a raw HTTP GET request. const request = new TextEncoder().encode( "GET / HTTP/1.1\r\nHost: example.com\r\n\r\n", ); const _bytesWritten = await connection.write(request); // Read 15 bytes from the connection. const buffer = new Uint8Array(15); await connection.read(buffer); connection.close(); // Return the bytes as plain text. 
return new Response(buffer, { headers: { "content-type": "text/plain;charset=utf-8", }, }); } Deno.serve(handler); ``` --- # Deno Deployᴱᴬ changelog > Listing notable progress in the development and evolution of Deno Deploy Early Access URL: https://docs.deno.com/deploy/early-access/changelog :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: ## June 24th, 2025 ### Features - The playground now has live streaming logs and traces panels - Logs and traces for the current revision are displayed for the past hour - Logs and traces can be filtered, just like in the dedicated observability pages - Framework auto-detection now works for more projects out of the box, including many Vite-based projects - The organization dropdown now highlights the currently selected organization more clearly ### Bug fixes - The sparklines in the metrics overview are now working correctly - The error rate metric now functions properly - GitHub-triggered builds no longer run multiple times - Next.js builds now work more reliably on older Next.js versions ## June 12th, 2025 ### Features - Deno DeployEA now supports playgrounds! - Playgrounds can be created and accessed from the playgrounds tab in the organizations overview - Playgrounds can contain multiple files and include build steps - The playground UI features an iframe to preview your deployed app - Three templates are currently available: hello world, Next.js, and Hono - On mobile devices, there is now a floating navbar that doesn't intrude into page content ## June 9th, 2025 ### Features - Deno DeployEA has a new logo! 
- Anyone can now join Early Access by signing up at [dash.deno.com](https://dash.deno.com/account#early-access) - Builds - Builds can now use up to 8 GB of storage, up from 2 GB - Builds can now use environment variables and secrets configured in the organization or app settings (in the new "Build" context) - Builds now have a maximum runtime of 5 minutes - The metrics page has had a complete overhaul, by rewriting the chart rendering: - Dragging on a graph now zooms in on the selected area - Much more data can now be shown without the page becoming slow to load - The tooltip now follows the mouse cursor, together with a new crosshair that allows for precise analysis - Font sizes and colors have been improved for better readability ### Bug fixes - Builds should not get stuck in a pending state anymore - Dashboard pages now load significantly faster - Correctly show spans in traces that have parents that are not exported (yet) - The metrics page correctly refreshes now when switching time ranges - The "Clear search" button in the telemetry search bar now works correctly - Older Next.js versions (such as Next.js 13) build correctly now - The environment variable drawer is now used everywhere, fixing a bug where multiple env vars with the same name but different contexts would conflict - Running `node ` in the builder does not fail anymore when the path is absolute - `npx` is now available in the builder - Astro builds will not sporadically fail with `--unstable-vsock` errors anymore - Svelte projects now deploy correctly when a project explicitly specifies `@deno/svelte-adapter` ## May 26th, 2025 ### Features - When triggering a manual build you can now choose which branch to deploy - You can now deploy Astro static sites without having to manually install the Deno adapter - There are now [reference docs for you to peruse](https://docs.deno.com/deploy/early-access/). 
### Bug fixes - SvelteKit auto detection now works when using `npm` as the package manager - Prewarming does not trigger random POST requests to your app anymore - Visiting a page with a trailing slash will not 404 anymore - Drawers will no longer close if you click inside, hold and drag over the backdrop, and release ## May 22nd, 2025 ### Features - You can now bulk import env vars during app creation by pasting a `.env` file into the env var drawer - SvelteKit now works out of the box without manually installing the Deno adapter - A preset for the Lume static site generator is now available ### Bug fixes - Environment variables now show up correctly on the timelines page - The production timeline page now correctly shows all builds - app.deno.com works on older versions of Firefox now - Page titles across app.deno.com now reflect the page you are on - The "Provision certificate" button does not lock up after DNS verification failures anymore - Domains that had a provisioned certificate or attached application can now be deleted --- # Getting started > Step-by-step guide to creating and configuring your first Deno Deploy Early Access application, including organization setup, build configuration, environment variables, and deployment monitoring. URL: https://docs.deno.com/deploy/early-access/getting_started :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: :::note Deno DeployEA is in private beta. To use Deno Deploy EA you must join the Early Access program from the [Deploy Classic account settings page](https://dash.deno.com/account#early-access). ::: ## Create an organization To get started with Deno DeployEA: 1. Visit [app.deno.com](http://app.deno.com) 2. Create an organization: ![The Deno DeployEA organization creation screen.](./images/create_org.png) Note that you cannot create an organization with the same slug as any existing project in Deploy Classic. 
Organization names and slugs cannot be changed after creation. ## Create an app After creating an organization, you'll be directed to the organization apps page, which shows all your applications and provides access to organization settings and custom domains. To create an app, press the `+ New App` button: ![Screenshot of deploy app creation screen](./images/create_app.png) An application is a single deployed web service with one build configuration, build history, environment variables, attached custom domains, a linked GitHub repository, etc. ## Select a repo 1. Choose the GitHub repository for your application: ![Screenshot of deploy org selection screen](./images/select_org.png) If your repository doesn't appear, use the `Add another GitHub account` or `Configure GitHub App permissions` buttons to grant the Deno Deploy GitHub app access to your repositories. > ⏳ Mono-repos (repositories where the application lives in a subdirectory) are > not yet supported. ## Configure your app Deno DeployEA automatically attempts to detect your application type and configure an appropriate build setup. You can see the detected configuration in the `App Config` box: ![Screenshot of Deploy application configuration screen](./images/app_config.png) To modify this configuration, click `Edit build config`. ![Screenshot of Deploy build configuration screen](./images/build_config.png) ## Configure your build In the build config drawer, you can customize: ### Framework preset Select your framework or choose `No Preset` if using a custom setup. ### Install command Command for installing dependencies (e.g., `npm install`, `deno install`). This can be empty for Deno applications without a `package.json`. ### Build command Command to compile/bundle your application (e.g., `next build`, `deno task build`). Leave empty if your application doesn't require building. 
### Runtime configuration For most frameworks there are no options to configure here, as Deno Deploy EA will figure out the ideal runtime configuration for the app based on the framework preset. When a framework is not configured, you can choose here whether the app is a `Dynamic` app that needs to execute code server side for every request, such as an API server, server-side rendered application, etc., or a `Static` app that consists only of a set of static files that need to be hosted. ### Dynamic Entrypoint The JavaScript or TypeScript file that should be executed to start the application. This is the file path that you would pass locally to `deno run` or `node` to start the app. The path has to be relative to the working directory. ### Dynamic arguments Additional command line arguments to pass to the app on startup, after the entrypoint. These are arguments that are passed to the application, not to Deno itself. ### Static Directory The directory in the working directory that contains the static files to be served. For example, `dist`, `_site`, or `.output`. ### Single Page App mode Whether the application is a single page app that should have the root `index.html` served for any paths that do not exist as files in the static directory, instead of a 404 page. Closing the drawer saves the settings. ### Environment variables To add environment variables: 1. Click `Add/Edit environment variables` 2. Click `+ Add variable` in the drawer 3. Enter the name and value 4. Choose whether it's a plain text variable or secret 5. Select the contexts where it should be available: - **Production**: For requests to production domains - **Development**: For requests to preview/branch domains 6. Click `Save` to apply your changes ![Screenshot of the Deploy env variables config screen](./images/env_var.png) ## Build and deploy your app 1. Click `Create App` to create the application and start the first build 2. 
Watch the build progress through the live logs: ![Screenshot of app build logs](./images/build_logs.png) The build logs show these stages: - **Prepare**: Cloning the repository and restoring caches - **Install**: Running the install command and framework-specific setup - **Build**: Executing the build command and preparing the deployment artifact - **Warm up**: Testing the deployment with a request - **Route**: Deploying the build to global regions You can cancel a build with the button in the top-left corner, or restart failed builds from the same location. After completion, the top-right shows the preview URL, and below that, all timelines where the build is deployed. ## Monitor your application After deploying, use the observability tools to monitor your application: ### Logs View application logs with filtering options for context, revision, and text content: ![Screenshot of the Logs page](./images/logs.png) Use the search bar to filter logs (e.g., `context:production`, `revision:`). The time picker adjusts the displayed time range. If a log is associated with a trace, you can click "View trace" to see the corresponding trace information. ### Traces View request traces with detailed timing information: ![Screenshot of the Traces page](./images/traces.png) Click any trace to open the trace view showing all spans in a waterfall visualization: ![Screenshot of the Trace view](./images/trace.png) The trace view shows: - Timeline of spans with duration - Span details including attributes - Logs emitted during the span 
--- # About Early Access > Guide to Deno Deploy Early Access features, comparison with Deploy Classic, and getting started instructions for deployment. URL: https://docs.deno.com/deploy/early-access/ :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Deno Deploy Early Access (Deno DeployEA) is a complete revamp of the original Deploy, featuring: - Improved NPM compatibility and web framework support - Built-in OpenTelemetry integration - Integrated build system - Significantly enhanced underlying infrastructure :::note Deno DeployEA is in private beta. To use Deno Deploy EA you must join the Early Access program from the [Deploy Classic account settings page](https://dash.deno.com/account#early-access). ::: Deno DeployEA comes with a new dashboard at [app.deno.com](https://app.deno.com). In this dashboard, you can create new Deno DeployEA organizations that contain Deno DeployEA apps. Within a single organization, you cannot mix Deno DeployEA apps with Deploy Classic projects. You can switch between different organizations using the organization picker in the top left of the dashboard. ## What is Deno DeployEA? Deno Deploy is a serverless platform for running JavaScript and TypeScript applications in the cloud (or self-hosted on your own infrastructure). It provides a management plane for deploying and running applications through integrations like GitHub deployment. ## Comparison to Deploy Classic Deno DeployEA is a complete rework of Deploy Classic. It has a new dashboard, and a new execution environment that uses Deno 2.0 and is much more powerful than Deploy Classic. The below table compares the two versions of Deno Deploy. 
| Feature | Deno DeployEA | Deploy Classic | | ------------------------------- | ------------------------------ | --------------------------------------------------------------------------------------------------------------------------------------- | | Web interface | app.deno.com | dash.deno.com | | Dark mode | ✅ Supported | ❌ Not supported | | Builds | ✅ Fully integrated | 🟠 Runs in GitHub Actions, no live streamed logs in the dashboard, caching requires manual setup, changing config requires editing YAML | | Can run Deno apps | ✅ Full support | 🟠 Limited (no FFI, subprocesses, write permission) | | Can run Node apps | ✅ Full support | 🟠 Limited (no FFI, native addons, subprocesses, write permission, and degraded NPM compatibility) | | Can run Next.js/Astro/SvelteKit | ✅ First-class support | 🟠 Framework dependent, requires manual setup | | First class static sites | ✅ Supported | ❌ Not supported | | Environment Variables | ✅ Different dev/prod env vars | 🟠 One set of env vars for all deployments | | CDN caching | ✅ Supported | ❌ Not supported | | Web Cache API | ✅ Supported | ✅ Supported | | Databases | ⏳ Coming soon | 🟠 Deno KV | | Queues | ❌ Not supported | ✅ Supported | | Cron | ❌ Not supported | ✅ Supported | | Deploy from GitHub | ✅ Supported | ✅ Supported | | Deploy from CLI | ⏳ Coming soon | ✅ Supported | | Instant Rollback | ✅ Supported | ✅ Supported | | Logs | ✅ Supported | ✅ Supported | | Tracing | ✅ Supported | ❌ Not supported | | Metrics | ✅ Supported | ❌ Not supported | | OpenTelemetry export | ⏳ Work in progress | ❌ Not supported | | Regions | 2 | 6 | | Self hostable regions | ✅ Supported | ❌ Not supported | ## How to access EA To begin using Deno DeployEA: 1. Visit [app.deno.com](https://app.deno.com) to access the new dashboard 2. Create a new Deno DeployEA organization 3. Create your first application within this organization 4. 
Deploy from your GitHub repository or directly from the dashboard For detailed configuration instructions and framework-specific guides, please refer to our reference documentation. --- # deploy/early-access/reference/accounts.md > Information about user accounts, authentication via GitHub, and managing your profile in Deno Deploy Early Access. URL: https://docs.deno.com/deploy/early-access/reference/accounts :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Deno Deploy accounts are linked to GitHub accounts. You can only sign in to Deno Deploy using GitHub authentication. Your primary contact email address and name are synced from GitHub. Both your username and email address update on every sign in. After changing your email, login, or name on GitHub, sign in again to see these changes reflected in the Deno DeployEA dashboard. Currently, only accounts enrolled in the Early Access program can access Deno DeployEA. To join the program, visit the [account settings in Deploy Classic](https://dash.deno.com/account#early-access) and sign up. To access the Early Access Discord channel, connect your Discord account to your Deno Deploy account through the same Early Access settings. --- # deploy/early-access/reference/apps.md > Guide to managing applications in Deno Deploy Early Access, including app creation, configuration, GitHub integration, and deployment options. URL: https://docs.deno.com/deploy/early-access/reference/apps :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Applications are web services that serve traffic within an organization. Each application contains a history of revisions (previous versions), typically corresponding to Git commits when using the GitHub integration. Applications are identified by a slug, which must be unique within the organization and is used in default domain names. 
## Creating an application To create an application: 1. Click the "+ Create App" button on the organization page 2. Select the GitHub repository to deploy from 3. Configure the app slug (name) 4. Set up build configuration 5. Add any required environment variables > ⚠️ Currently, applications must be linked to a GitHub repository during creation. The build configuration determines how the application is built during the deployment process. Builds are automatically triggered on each push to the linked repository or when manually clicking "Deploy Default Branch". For detailed build configuration information, see the [Builds documentation](/deploy/early-access/reference/builds/). You can add environment variables during app creation by clicking "Edit Environment Variables". For more details on environment variables, see the [Environment Variables and Contexts](/deploy/early-access/reference/env-vars-and-contexts/) documentation. ## Limitations > ⚠️ Apps cannot currently be deleted. > ⚠️ Apps cannot currently be renamed. > ⚠️ Apps cannot currently be transferred to another organization. ## GitHub integration The GitHub integration enables automatic deployments of the app from a GitHub repository. Every push to the repository will trigger a new build of the app. Depending on the branch of the commit, the build will be deployed to different [timelines](/deploy/early-access/reference/timelines/). Apps will generally be linked to a GitHub repository on creation. However, it is possible to unlink the repository after creation, and optionally link it to a new GitHub repository. This can be done from the app settings page. Only accounts that have been authorized with the Deno Deploy GitHub app will be visible in the GitHub repository dropdown. You can authorize new orgs or repos by clicking the "+ Add another GitHub account" button in the user or organization dropdown, or the "Configure GitHub app permissions" button in the repository dropdown. 
This will redirect you to GitHub to authorize the Deno Deploy GitHub app with the selected GitHub account or organization. After authorizing, you will be redirected back to the app settings page, where you can select the new GitHub repository. --- # deploy/early-access/reference/builds.md > Detailed explanation of the build process in Deno Deploy Early Access, covering build triggers, stages, configuration options, caching, and the build environment. URL: https://docs.deno.com/deploy/early-access/reference/builds :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: In Deno DeployEA, each version of your application code is represented as a revision (or build). When deploying from GitHub, revisions generally map one-to-one to git commits in your repository. ## Build triggers Builds can be triggered in two ways: - **Manually**: Using the "Deploy Default Branch" button on the builds page, which deploys the default git branch (usually `main`). The dropdown menu lets you select a different branch. - **Automatically**: When a new commit is pushed to a GitHub repository linked to your app. ## Build stages A revision goes through these stages before becoming available: 1. **Queuing**: The revision waits to be assigned to a builder. 2. **Preparing**: A builder downloads the source code and restores any available build caches. 3. **Install**: The install command executes (if specified), typically downloading dependencies. 4. **Build**: The build command executes (if specified), creating a build artifact that is uploaded to the runtime infrastructure. 5. **Warm up**: A `GET /` request tests that the application boots correctly and can handle HTTP requests. 6. **Route**: The global infrastructure is configured to route requests to the new revision based on its timelines. If any step fails, the build enters a "Failed" state and does not receive traffic. 
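As a sketch of how the Install and Build stages above map to concrete commands: for a repository that defines a build task in `deno.json`, the install command might be `deno install` and the build command `deno task build`. The file contents below are illustrative, not a required layout:

```json
{
  "tasks": {
    "build": "deno run -A scripts/build.ts"
  }
}
```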
Build logs are streamed live to the dashboard during the build process and remain available on the build page after completion. Build caching speeds up builds by reusing files that haven't changed between builds. This happens automatically for framework presets and the `DENO_DIR` dependency cache. You can cancel a running build using the "Cancel" button in the top-right corner of the build page. Builds automatically cancel after running for 5 minutes. ## Build configuration Build configuration defines how to convert source code into a deployable artifact. You can modify build configuration in three places: - During app creation by clicking "Edit build config" - In app settings by clicking "Edit" in the build configuration section - In the retry drawer on a failed build's page When creating an app, build configuration may be automatically detected from your repository if you're using a recognized framework or common build setup. ### Configuration options - **Framework preset**: Optimized configuration for supported frameworks like Next.js or Fresh. [Learn more about framework integrations](./frameworks/). - **Install command**: Shell command for installing dependencies, such as `npm install` or `deno install`. - **Build command**: Shell command for building the project, often a task from `package.json` or `deno.json`, such as `deno task build` or `npm run build`. - **Runtime configuration**: Determines how the application serves traffic: - **Dynamic**: For applications that respond to requests using a server (API servers, server-rendered websites, etc.) 
- **Entrypoint**: The JavaScript or TypeScript file to execute - **Arguments** (optional): Command-line arguments to pass to the application - **Static**: For static websites serving pre-rendered content - **Directory**: Folder containing static assets (e.g., `dist`, `.output`) - **Single page app mode** (optional): Serves `index.html` for paths that don't match static files instead of returning 404 errors ## Build environment The build environment runs on Linux using either x64 or ARM64 architecture. Available tools include: - `deno` (same version as at runtime) - `node` - `npm` - `npx` - `yarn` (v1) - `pnpm` - `git` - `tar` - `gzip` :::info All JavaScript inside of the builder is executed using Deno. The `node` command is actually a shim that translates Node.js invocations to `deno run`. Similarly, `npm`, `npx`, `yarn`, and `pnpm` run through Deno rather than Node.js. ::: Environment variables configured for the "Build" context are available during builds, but variables from "Production" or "Development" contexts are not. [Learn more about environment variables](./env-vars-and-contexts/). Builders have 8 GB of storage available during the build process. --- # deploy/early-access/reference/caching.md > Overview of CDN caching functionality in Deno Deploy Early Access, including cache configuration, directives, and best practices. URL: https://docs.deno.com/deploy/early-access/reference/caching :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Deno DeployEA includes a built-in CDN that can cache responses from your application. This improves performance for: - Static assets (images, CSS, JavaScript files) - API responses and server-rendered pages that don't change frequently Caching is enabled by default for all applications, but only responses with appropriate caching headers are actually cached. 
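For example, a handler can opt a response into CDN caching by setting the `Cache-Control` header. A minimal sketch using the standard `Request`/`Response` APIs:

```typescript
// Minimal sketch: a handler whose response the CDN may cache for 60 s,
// and then serve stale for up to another 60 s while revalidating
// in the background.
function handler(_req: Request): Response {
  return new Response("hello", {
    headers: {
      "Content-Type": "text/plain",
      "Cache-Control": "max-age=60, stale-while-revalidate=60",
    },
  });
}
// In a Deno application this handler would be passed to Deno.serve(handler).
```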
Deno DeployEA integrates with popular frameworks like Next.js to automatically optimize caching for features such as Incremental Static Regeneration (ISR). The CDN cache is tied to both the revision and context. When you deploy a new revision, the cache is automatically invalidated, ensuring users always see the latest version of your application. Note that browser caching may still serve older content if the `Cache-Control` header permits it. ## Caching a resource To cache a resource, set the `Cache-Control` header in your response. This standard HTTP header tells browsers and the CDN how to cache your content. ### Supported caching directives Deno DeployEA supports these caching directives: | Directive | Description | | ------------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | `max-age` | Maximum time (in seconds) the response is considered fresh by both CDN and browsers. After this time, the response is considered stale and revalidated with the server. | | `s-maxage` | Maximum time (in seconds) the response is considered fresh by shared caches (CDNs only, not browsers). After this time, the response is revalidated with the server. | | `stale-while-revalidate` | Maximum time (in seconds) a stale response can be served while a fresh one is fetched in the background. | | `stale-if-error` | Maximum time (in seconds) a stale response can be served if the server returns an error. | | `immutable` | Indicates the response will never change, allowing indefinite caching. Ideal for content-hashed static assets. | | `no-store` | Prevents caching of the response. Use for dynamic content that should never be cached. | | `no-cache` | Requires revalidation with the server before serving from cache. Use for content that changes frequently but can benefit from conditional requests. 
| ### Additional caching headers - `Vary`: Specifies which request headers should be included in the cache key, creating separate cached versions based on those headers. - `Expires`: Sets an absolute expiration date for the response (alternative to `max-age`). --- # deploy/early-access/reference/domains.md > Complete guide to domain management in Deno Deploy Early Access, including organization domains, custom domains, DNS configuration, TLS certificates, and domain assignments. URL: https://docs.deno.com/deploy/early-access/reference/domains :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Every organization has a default domain used for all applications deployed within that organization. For example, an organization with the slug `acme-inc` would have a default domain of `acme-inc.deno.net`. An application named `my-app` would automatically receive the production domain `my-app.acme-inc.deno.net`. In addition to these default domains, you can add custom domains to your applications. Custom domains are domains that you own and control. To use a custom domain, you must: 1. Own the domain (purchased from a domain registrar) 2. Have access to edit its DNS records Custom domains belong to an organization and can be attached to any application within that organization. 
A custom domain can be added as: - A base domain (e.g., `example.com` or a specific subdomain) - A wildcard domain (e.g., `*.example.com`) A base domain works with a single application, while a wildcard domain offers more flexibility. You can either: - Assign the entire wildcard to one application (all subdomains point to the same app) - Partially assign it to multiple applications (different subdomains point to different apps) All custom domains require valid TLS certificates. Deno DeployEA can automatically provision these certificates using Let's Encrypt. ## Adding a custom domain 1. Go to the organization domains page (click your organization name in the top left corner, then the "Domains" tab) 2. Click "Add Domain" 3. Enter your domain (e.g., `example.com`) 4. Select whether to add just this domain or also include the wildcard subdomain 5. Click "Add Domain" This will open the domain configuration drawer. ### DNS configuration The domain configuration drawer shows the DNS records needed to: - Verify domain ownership - Generate TLS certificates - Route traffic to Deno DeployEA There are three possible configuration methods, depending on your domain registrar's capabilities: #### ANAME/ALIAS method (preferred) If your registrar supports `ANAME` or `ALIAS` records, this is the best option: - Add one `ANAME`/`ALIAS` record - Add one `CNAME` record for verification #### CNAME method Works well for subdomains but not for apex domains: - Add two `CNAME` records - Note: This method doesn't allow other DNS records (like `MX` records) on the same domain #### A record method Most compatible but requires more configuration: - Add one `A` record - Add one `CNAME` record for verification > Note: Currently, Deno DeployEA doesn't support IPv6. When using the `ANAME/ALIAS` or `CNAME` methods, your domain will automatically start using IPv6 once support is added. With the `A` method, you'll receive an email when it's time to add an `AAAA` record. 
:::warning When using Cloudflare as your DNS provider, you **MUST** disable the proxying feature (orange cloud) for the `_acme-challenge` CNAME record, or verification and certificate provisioning will fail. ::: ### Verification After adding the DNS records, Deno DeployEA will verify your domain ownership. This process may take a few minutes depending on your DNS provider. You can leave the domain configuration drawer open during verification - it will refresh automatically when complete. You can manually trigger verification by clicking the "Provision Certificate" button. Successful verification also initiates TLS certificate provisioning. ### TLS certificate provisioning After domain verification, click "Provision Certificate" to generate a TLS certificate through Let's Encrypt. This process takes up to 90 seconds. Once provisioned, you'll see certificate details including expiration date and issue time. Certificates are automatically renewed near expiry. You can check the current certificate status in the domain configuration drawer. ## Assigning a custom domain to an application After adding a custom domain to your organization: 1. Go to the organization domains page 2. Click "Assign" next to the custom domain 3. Select the target application 4. If using a wildcard domain, choose whether to attach the base domain, the wildcard, or a specific subdomain 5. Click "Assign Domain" ## Unassigning a custom domain from an application 1. Go to the application settings page 2. Find the "Custom Domains" section 3. Click "Remove" next to the domain you want to unassign This removes the domain from the application but keeps it available in your organization for use with other applications. ## Removing a custom domain 1. Go to the organization domains page 2. Open the domain configuration drawer 3. Click "Delete" and confirm This removes the custom domain from your organization and deletes all domain assignments across all applications. 
--- # deploy/early-access/reference/env-vars-and-contexts.md > Guide to managing environment variables and contexts in Deno Deploy Early Access, including variable types, creation, editing, and accessing them in your code. URL: https://docs.deno.com/deploy/early-access/reference/env-vars-and-contexts :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Environment variables in Deno DeployEA allow you to configure your application with static values such as API keys or database connection strings. 
## Types of environment variables Environment variables can be stored as: - **Plain text**: Visible in the UI and suitable for non-sensitive values like feature flags - **Secrets**: Never visible in the UI after creation, only readable from application code, suitable for sensitive values like API keys Variables can be set at: - **Application level**: Specific to a single application - **Organization level**: Applied to all applications in the organization, but can be overridden by application-level variables ## Contexts Each environment variable applies to one or more contexts. Contexts represent the logical "environments" in which your code runs, each with its own set of variables and secrets. By default, there are two contexts: - **Production**: Used for the production timeline serving production traffic - **Development**: Used for development timelines serving non-production traffic (preview URLs and branch URLs) :::info Need additional contexts? Please contact [support](../support). ::: Additionally, there is a **Build** context used during the build process. Environment variables in the Build context are only available during builds and aren't accessible in Production or Development contexts (and vice versa). This separation enables different configuration for build-time vs. runtime. Within a single application or organization, you cannot have multiple environment variables with the same name in the same context. You can, however, have variables with the same name in different non-overlapping contexts. ## Adding, editing and removing environment variables You can manage environment variables from several locations: - On the "New App" page while creating an application - In the application settings under the "Environment Variables" section - In the organization settings under the "Environment Variables" section In each location, click the relevant edit button to open the environment variables drawer. Changes only apply when you click "Save." 
Clicking "Cancel" discards your changes. To add a variable: 1. Click "Add Environment Variable" 2. Enter the name and value 3. Specify whether it's a secret 4. Select the contexts where it should apply You can also bulk import variables from a `.env` file: 1. Click "+ Add from .env file" 2. Paste the contents of your `.env` file 3. Click "Import variables" Note that lines starting with `#` are treated as comments. To remove a variable, click the "Remove" button next to it. To edit a variable, click the "Edit" button next to it to modify its name, value, secret status, or applicable contexts. ## Using environment variables in your code Access environment variables using the `Deno.env.get` API: ```ts const myEnvVar = Deno.env.get("MY_ENV_VAR"); ``` ## Predefined environment variables Deno DeployEA provides these predefined environment variables in all contexts: - `DENO_DEPLOYMENT_ID`: A unique identifier representing the entire configuration set (application ID, revision ID, context, and environment variables). Changes if any of these components change. - `DENO_REVISION_ID`: The ID of the currently running revision. More predefined variables will be added in the future. Note that you cannot manually set any environment variables starting with `DENO_*` as these are reserved system variables. 
--- # deploy/early-access/reference/frameworks.md > Detailed guide to supported JavaScript and TypeScript frameworks in Deno Deploy Early Access, including Next.js, Astro, Nuxt, SvelteKit, and more. URL: https://docs.deno.com/deploy/early-access/reference/frameworks :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Deno DeployEA supports a number of JavaScript and TypeScript frameworks out of the box. This means that you can use these frameworks without any additional configuration or setup. Natively supported frameworks are tested to work with Deno Deploy EA and are automatically detected when you create a new app. Deno DeployEA automatically tunes the build and runtime configuration for these frameworks. Frameworks not listed here are still likely to work, but may require manually configuring the install and/or build command and the runtime configuration in the build settings. Feel like a framework is missing? Let us know in the [Deno Deploy Discord channel](https://discord.gg/deno) or [contact Deno support](../support). ## Supported frameworks ### Next.js Next.js is a React framework for building full-stack web applications. You use React Components to build user interfaces, and Next.js for additional features and optimizations. Both pages and app router are supported out of the box. ISR, SSG, SSR, and PPR are supported. Caching is supported out of the box, including the new `"use cache"` directive. `next/image` works out of the box. Next.js on Deno DeployEA always builds in standalone mode. 
Tracing is supported out of the box, and Next.js automatically emits some spans for incoming requests, routing, rendering, and other operations. ### Astro Astro is a web framework for building content-driven websites like blogs, marketing, and e-commerce. Astro leverages server rendering over client-side rendering in the browser as much as possible. For static Astro sites, no additional configuration is needed to use Deno Deploy EA. When using SSR in Astro with Deno Deploy EA, you need to install the [`@deno/astro-adapter`](https://github.com/denoland/deno-astro-adapter) package and configure your `astro.config.mjs` file to use the adapter: ```bash $ deno add npm:@deno/astro-adapter # or npm install @deno/astro-adapter # or yarn add @deno/astro-adapter # or pnpm add @deno/astro-adapter ``` ```diff title="astro.config.mjs" import { defineConfig } from 'astro/config'; + import deno from '@deno/astro-adapter'; export default defineConfig({ + output: 'server', + adapter: deno(), }); ``` Sharp image optimization is supported. The `astro:env` API is supported. ### Nuxt Create high-quality web applications with Nuxt, the open source framework that makes full-stack development with Vue.js intuitive. Nuxt requires no additional setup. ### SolidStart SolidStart is an open source meta-framework designed to unify components that make up a web application. It is built on top of Solid. SolidStart requires no additional setup. ### SvelteKit SvelteKit is a framework for rapidly developing robust, performant web applications using Svelte. SvelteKit requires no additional setup. ### Fresh Fresh is a full stack modern web framework for JavaScript and TypeScript developers. Fresh uses Preact as the JSX rendering engine. Fresh requires no additional setup. ### Lume Lume is a static site generator for building fast and modern websites using Deno. Lume requires no additional setup. ### Remix > ⚠️ **Experimental**: Remix is not yet fully supported. 
It is in the process of being integrated into Deno DeployEA. Some features may not work as expected. Please report any issues you encounter to the Deno team. --- # deploy/early-access/reference/index.md > Comprehensive reference guide for Deno Deploy Early Access covering accounts, organizations, applications, builds, observability, environments, and custom domains. URL: https://docs.deno.com/deploy/early-access/reference/ :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Specific terminology is used in Deploy Early Access. Use this reference guide to understand key concepts and details about the platform. ## Topics ### [Accounts](/deploy/early-access/reference/accounts) Information about user accounts, authentication, and personal settings in Deploy Early Access. ### [Organizations](/deploy/early-access/reference/organizations) Learn about creating and managing organizations, team members, roles, and permissions. ### [Applications](/deploy/early-access/reference/apps) Details about application creation, configuration, and lifecycle management. ### [Builds](/deploy/early-access/reference/builds) Understanding the build process, build configurations, and deployment pipelines. ### [Playgrounds](/deploy/early-access/reference/playgrounds) Write and deploy code without needing to create a git repository. ### [Observability](/deploy/early-access/reference/observability) Monitoring applications, accessing logs, metrics, and performance insights. ### [Environments](/deploy/early-access/reference/env-vars-and-contexts/) Managing different deployment environments including development, staging, and production. ### [Custom Domains](/deploy/early-access/reference/domains) Setting up and configuring custom domains for your applications. 
--- # deploy/early-access/reference/observability.md > Comprehensive overview of monitoring features in Deno Deploy Early Access, including logs, traces, metrics, and filtering options. URL: https://docs.deno.com/deploy/early-access/reference/observability :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Deno DeployEA provides comprehensive observability features to help you understand application performance, debug errors, and monitor usage. These features leverage OpenTelemetry and the [built-in OpenTelemetry integration in Deno](/runtime/fundamentals/open_telemetry/). The three main observability features in Deno DeployEA are: - **Logs**: Unstructured debug information emitted by your application code - **Traces**: Structured information about request handling, including execution time for each step and automatic capture of outbound I/O operations - **Metrics**: Structured, high-level data about application performance and usage, such as request count, error count, and latency ## Logs Logs in Deno DeployEA are captured using the standard `console` API and can be queried from the logs page in the dashboard. Logs are organized by application. You can use the search bar to filter logs based on various attributes and message content. When logs are emitted inside the context of a trace, they become associated with that specific trace and span. For such logs, a "View trace" button appears in the logs interface, allowing you to open the relevant trace in an overlay drawer for detailed inspection. ## Traces Traces in Deno DeployEA are captured in three ways: - **Automatically for built-in operations**: Incoming HTTP requests, outbound fetch calls, and other system operations are traced automatically. This cannot be disabled. - **Automatically for supported frameworks**: Frameworks like Next.js, Fresh, and Astro include built-in instrumentation. 
The specific frameworks and operations covered may change over time. - **Manually through custom instrumentation**: Your application code can create new traces or spans using the OpenTelemetry API. Traces are organized by application. The search bar lets you filter based on various attributes and span names. Clicking a trace opens the trace overlay drawer, showing all spans within that trace in a waterfall view. This visualization displays the start time, end time, and duration of each span, grouped by parent span with the root span at the top. Clicking any span shows its details at the bottom of the drawer, including all captured attributes. For example, outbound HTTP requests include the method, URL, and status code. The span details section also includes a "Logs" tab showing all logs emitted within the selected span's context. You can click "View logs" on any trace to open the logs page with the trace ID pre-filled in the search bar, showing all logs related to that trace. ## Metrics Metrics in Deno DeployEA are automatically captured for various operations such as incoming HTTP requests and outbound fetch calls. This automatic capture cannot be disabled. Metrics are organized by application and displayed in time-series graphs showing values over time. You can use the search bar to filter metrics based on various attributes. 
## Filtering Logs, traces, and metrics can be filtered using these general attributes: - **Revision**: The ID of the application revision that emitted the data - **Context**: The context in which the data was emitted ("Production" or "Development") For logs and traces, this additional filter is available: - **Trace**: The ID of the trace containing the log or spans For traces only, these additional filters are available: - **HTTP Method**: The HTTP method of the request that triggered the trace - **HTTP Path**: The path of the request that triggered the trace - **HTTP Status**: The HTTP status code of the response ### Time range filter By default, the observability pages show data for the last hour. You can change this using the time range filter in the top right corner of each page. You can select predefined time ranges like "Last 1 hour," "Last 24 hours," or "Last 7 days," or set a custom time range by clicking the "Custom" button. Custom time ranges can be either absolute (specific start and end times) or relative (e.g., 3 days ago, 1 hour from now). Relative time ranges use the same syntax as Grafana: - `now` - the current time - `now-1h` - 1 hour ago - `now/h` - the start of the current hour - `now-1h/h` - the start of the previous hour - `now/d+3h` - 3 hours from the start of the current day - `now-1d/d` - the start of the previous day 
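These relative expressions can be evaluated with ordinary date arithmetic. An illustrative sketch, not the dashboard's actual parser:

```typescript
// Illustrative evaluator for Grafana-style relative time expressions
// such as "now-1h" or "now-1h/h" (an offset in hours or days, optionally
// truncated to the start of the hour or day). Forms like "now/d+3h" are
// omitted for brevity; the dashboard's real parser may differ.
function resolveRelativeTime(expr: string, now: Date = new Date()): Date {
  const m = expr.match(/^now(?:([+-])(\d+)([hd]))?(?:\/([hd]))?$/);
  if (!m) throw new Error(`unsupported expression: ${expr}`);
  const [, sign, amount, unit, trunc] = m;
  const t = new Date(now);
  if (sign) {
    const n = Number(amount) * (sign === "-" ? -1 : 1);
    if (unit === "h") t.setUTCHours(t.getUTCHours() + n);
    else t.setUTCDate(t.getUTCDate() + n);
  }
  if (trunc === "h") t.setUTCMinutes(0, 0, 0);
  else if (trunc === "d") t.setUTCHours(0, 0, 0, 0);
  return t;
}
```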
--- # deploy/early-access/reference/organizations.md > Guide to creating and managing organizations in Deno Deploy Early Access, including members, permissions, and organization administration. URL: https://docs.deno.com/deploy/early-access/reference/organizations :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Organizations are groups of users that collectively own apps and domains. When signing up for Deno DeployEA, each user can either create an organization or join an existing organization through invitation. All users must belong to an organization to use Deno DeployEA, as all resources are owned at the organization level. Organizations have both a name and a slug. The name is visible only to organization members and appears in the organization dropdown in both Deno Deploy EA and Deploy Classic. The slug forms part of the default domain for all applications in the organization. :::caution Organizations cannot be renamed, nor can their slug be changed after creation. ::: Every organization has a default domain used for production, git branch, and preview URLs for projects in that organization. For example, an organization with the slug `acme-inc` would have a default domain of `acme-inc.deno.net`. Organizations can have multiple members. Currently, all members have owner permissions for the organization, which means they can invite other members, create and delete apps, and manage domains. ## Create an organization Organizations in Deno DeployEA are created from the Deno Deploy Classic dashboard: 1. Visit the [Deploy Classic dashboard](https://dash.deno.com) and sign in with your GitHub account. 2. 
Click the "+" button in the organization dropdown in the top left corner of the screen. 3. Select "Try the new Deno Deploy" option. 4. Click the "Create Early Access organization" button. 5. Enter an organization name and slug, then click "Create". :::info Organization slugs must be unique across all Deno DeployEA organizations and cannot match any existing project name in Deno Deploy Classic. ::: ## Deleting an organization Organizations cannot currently be deleted from the dashboard. Please [contact Deno support](../support) if you need to delete an organization. ## Inviting users to an organization To invite a user: 1. Go to the organization settings page and click "+ Invite User" 2. Enter the user's GitHub account username (e.g., `ry`) 3. Optionally enter an email address to send the invitation to 4. Click "Invite" If you don't specify an email address, we'll attempt to send the invitation to the email in the user's public GitHub profile or another email we may have on record. After inviting a user, they will receive an email with an invite link (if we have their email address). They must click this link and accept the invitation to join the organization. You can also directly share the personalized invite link displayed in the members table after inviting a user. You can cancel an invitation before it's accepted by clicking the delete button next to the invited user in the members table and confirming by clicking "Save". This invalidates the previously sent invitation link. ## Removing users from an organization To remove a member from the organization, find the user in the members table in the organization settings, click the remove button, and confirm by clicking "Delete". --- # deploy/early-access/reference/playgrounds.md > Write and deploy code completely from Deno Deploy, without the need for a git repository. URL: https://docs.deno.com/deploy/early-access/reference/playgrounds :::info You are viewing the documentation for Deno DeployEA.
Looking for Deploy Classic documentation? [View it here](/deploy/). ::: Playground applications enable you to create, edit, and deploy applications entirely from the Deno DeployEA web dashboard, without needing to create a GitHub repository. Playgrounds contain one or more files (JavaScript, TypeScript, TSX, JSON, etc.) that you can edit directly in the playground editor. ## Creating a playground You can create playgrounds from the "Playgrounds" page in your organization. Click the "New Playground" button to create a basic "Hello World" playground. Using the dropdown on the "New Playground" button lets you create playgrounds from other templates, such as Next.js or Hono. ## Editing a playground To edit a playground, open it from the "Playgrounds" page in your organization. The playground editor consists of five main sections: - **Code editor**: The central area where you edit code for the currently selected file. Above the editor is a navbar showing the current file name, which you can click to edit. - **File browser**: Located on the left of the code editor, this panel shows all files in the playground. Click any file to open it in the editor. Create new files by clicking the "New" icon at the top of the file browser. Delete files using the delete button next to each file name. - **Top bar**: Located above the code editor, this contains action buttons for the playground. The "Deploy" button saves current changes and triggers a build. "Build Config" and "Env Variables" buttons open their respective configuration drawers. The left side of the top bar displays the playground URL (unless the playground hasn't been deployed yet). - **Bottom drawer**: Located beneath the code editor, this contains debugging tools including "Build Logs" that show build progress during deployment, and tabs for viewing logs and traces. - **Right drawer**: Located to the right of the code editor, this contains tools for inspecting application output. 
The "Preview" tab displays an iframe showing the deployed application, while "HTTP Explorer" lets you send individual HTTP requests to your deployment. The playground content automatically saves when you click the "Deploy" button or when the editor loses focus. ## Deleting a playground > ⚠️ Playgrounds cannot currently be deleted. ## Renaming a playground > ⚠️ Playgrounds cannot currently be renamed. ## Transferring a playground > ⚠️ Playgrounds cannot currently be transferred to another organization. --- # deploy/early-access/reference/runtime.md > Details about the Deno Deploy Early Access runtime environment, including application lifecycle, startup, shutdown, and cold start optimization. URL: https://docs.deno.com/deploy/early-access/reference/runtime :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: In Deno DeployEA, all applications execute using a standard Deno runtime in a secure, isolated Linux environment. The Deno runtime used in Deno DeployEA is the standard Deno runtime, with full support for all features of the Deno CLI, including JSR and NPM dependencies, reading and writing to the file system, making network requests, spawning subprocesses, and loading FFI and Node native add-ons. The Deno runtime runs using `--allow-all` permissions. Custom flags cannot be passed to the Deno runtime. ## Runtime environment The runtime environment is a Linux-based environment running either x64 or ARM64 architecture. The exact set of tools available in the runtime environment is subject to change and thus cannot be relied upon. Currently Deno DeployEA runs on Deno 2.3.2.
## Lifecycle Deno DeployEA runs applications in a serverless environment. This means that an application is not always running and is only started when a request is received. When no incoming traffic is received for a period of time, the application is stopped. Applications can be started and stopped at any time. They should start quickly to respond to incoming requests without delay. Multiple instances of the same application can run simultaneously. For example, one instance could be running in the US and another in Europe. Each instance is completely isolated from the others and they do not share CPU, memory, or disk resources. Multiple instances can also start in the same region when needed, such as to handle high traffic or during infrastructure updates. ### Startup When the system decides to start an application, it provisions a new sandbox environment for the application. This environment is isolated from all other applications. It then starts the application using the configured entrypoint and waits for the HTTP server to start. If the application crashes before the HTTP server starts, the request that triggered the start will fail with a 502 Bad Gateway error. Once the application is started, incoming requests are routed to it and responses are sent back to the client. ### Shutdown The application remains alive until no new incoming requests are received or responses (including response body bytes) are sent for a period of time. The exact timeout is between 5 seconds and 10 minutes. WebSocket connections that actively transmit data (including ping/pong frames) also keep the application alive. Once the system decides to stop the application, it sends a `SIGINT` signal to the application as a trigger to shut down. From this point on, the application has 5 seconds to shut down gracefully before it will be forcibly killed with a `SIGKILL` signal. ### Eviction Sometimes an isolate may shut down even if the application is actively receiving traffic. 
Some examples of when this can happen are: - An application was scaled up to handle load, but the load has decreased enough to be handled by a single instance again. - The underlying server executing the instance is too resource constrained to continue running this application instance. - The underlying infrastructure is being updated or has experienced a failure. When the system decides to evict an application, it attempts to divert traffic away from the instance being evicted as early as possible. Sometimes this means that a request will wait for a new instance to boot up even though an existing instance is already running. When an application only serves requests that finish quickly, evictions are usually unnoticeable. For applications that serve long-running requests or WebSockets, evictions can be more noticeable because the application may need to be evicted while still processing a request. The system will try to avoid these scenarios, but it is not always possible. After traffic has been diverted away from the old instance, the system sends a `SIGINT` signal to trigger a graceful shutdown. The application should finish processing any remaining requests quickly and shut down websockets and other long-running connections. Clients making long-running requests should be prepared to handle these disruptions and reconnect when disconnected. 5 seconds after the `SIGINT` signal is sent, the old instance will be forcibly killed with a `SIGKILL` signal if it has not already shut down gracefully. ## Cold starts Because applications are not always running, they may need to start when a request is received. This is called a cold start. Cold starts in Deno Deploy EA are highly optimized and complete within 100 milliseconds for hello world applications, and within a few hundred milliseconds for larger applications. 
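Applications can also help themselves here: loading rarely used dependencies lazily with dynamic `import()` keeps them off the startup path. A sketch of the pattern, where `node:path` merely stands in for a heavier module:

```typescript
// Lazily load a dependency the first time it is needed, instead of at
// startup. "node:path" is just a stand-in for a heavy module here.
let pathMod: typeof import("node:path") | undefined;

async function extension(file: string): Promise<string> {
  if (!pathMod) pathMod = await import("node:path"); // runs once, on first use
  return pathMod.extname(file);
}

console.log(await extension("report.pdf")); // ".pdf"
```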
Deno DeployEA uses multiple optimizations to enable fast cold starts: - Sandboxes and the Deno runtime are pre-provisioned to ensure they don't need to be created from scratch when starting an application. - Applications start immediately when the client sends the first TCP packet to establish a TLS connection. For fast-starting applications, depending on the network round trip latency, the application may already be running before the client sends the HTTP request. - File system access is optimized for frequently used startup files. Deno DeployEA analyzes file access patterns during the build step's warmup phase and optimizes the file system for faster access. When cold starts are slow, they can negatively impact user experience. To optimize your application for quick startup: 1. Minimize dependencies used by your application. 2. Load infrequently accessed code and dependencies lazily using dynamic `import()`. 3. Minimize I/O operations during startup, especially top-level `await` operations and network requests. If your application starts slowly, please [contact Deno support](../support) for help investigating the issue. --- # deploy/early-access/reference/timelines.md > Understanding deployment timelines in Deno Deploy Early Access, including production and development contexts, active revisions, rollbacks, and timeline locking. URL: https://docs.deno.com/deploy/early-access/reference/timelines :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: A timeline is a representation of the history of one branch of the application. Each timeline has a set of revisions, which are the individual items in the timeline. One of the revisions (usually the most recent one) is the "active" revision, which is the one that is currently serving traffic. The active revision receives traffic on all URLs that are assigned to the timeline. 
Each timeline is associated with a [context](./env-vars-and-contexts.md), which decides which environment variables are available to the code running in that timeline. By default, there are multiple timelines set up for each application: - **Production**: The production timeline contains all of the revisions from the default git branch. This is the timeline that serves production traffic. This timeline is associated with `https://<app-slug>.<org-slug>.deno.net`, and any custom domains that are mapped to the application. It uses the production context. - **Git Branch / `<branch-name>`**: Each git branch has its own timeline. This timeline contains all of the revisions from that git branch. This timeline is associated with `https://<app-slug>--<branch-name>.<org-slug>.deno.net`. It uses the development context. > There is also one timeline for each revision, that contains only that > revision. This is the timeline that backs the preview URL for that revision. > This timeline is associated with > `https://<app-slug>-<revision-id>.<org-slug>.deno.net`. It uses the > development context. > > Preview timelines are not visible in timeline pages in the UI. You can view > the preview URL for a revision on that revision's build page. You can view the timelines that each revision is associated with on the revision's build page. You can also view the revisions that are associated with a given timeline from the timeline pages. ## Active revision Each timeline has an active revision. The active revision is the revision that is currently serving traffic for that timeline. You can view the active revision for a timeline on the timeline page. Usually, the active revision is the most recently built revision on the timeline. However, a different revision can be manually locked to be the active revision. This enables rollback, and timeline locking: ### Rollback Rollback is the process of reverting the active revision to a previous revision, usually because the newer revision has some sort of bug or issue.
By rolling back to a known good revision, you can restore the application to a working state without having to deploy new code via Git, and waiting for a build to complete. Refer to "changing the active revision" below for more information on how to rollback a timeline. ### Timeline locking Timeline locking is the process of locking a timeline to a specific revision, to ensure that new builds do not automatically become the active revision. This is useful if you are in a feature freeze situation, for example during a big event, and want to de-risk by not allowing new builds to be deployed. When a timeline is locked to a specific revision you can still create new builds by pushing to Git, but they will not automatically become the active revision on the locked timeline. Refer to "changing the active revision" below for more information on how to lock a timeline to a specific revision. ### Changing the active revision On the timelines page you can lock any revision on that timeline to be the active revision. This will lock the timeline to that revision, and new builds will not automatically become the active revision on this timeline anymore. You can then either unlock the revision from the timeline, reverting back to the default behavior of the latest revision being the active revision, or you can lock a different revision to be the active revision. --- # deploy/early-access/support/index.md URL: https://docs.deno.com/deploy/early-access/support/ :::info You are viewing the documentation for Deno DeployEA. Looking for Deploy Classic documentation? [View it here](/deploy/). ::: If you have any questions or feedback about Deno DeployEA, please reach out to us on the [Deno Discord](https://discord.gg/deno) in the `#deploy-ea` channel or [contact us](mailto:deploy@deno.com). We are actively working on improving the platform and would love to hear your thoughts! 
--- # Deno Deployᴱᴬ Usage Guidelines > Important limitations, service level expectations, and terms of use for the Deno Deploy Early Access program. URL: https://docs.deno.com/deploy/early-access/usage As an early access product, Deno DeployEA currently has a number of limitations you should be aware of before using it: - Deno Deploy Pro account features do not yet extend to Deno Deploy EA - CLI deployment and deployment from GitHub Actions are not yet available in Deno DeployEA - Database features such as Deno KV are not yet available in Deno Deploy EA - Queues and Cron are not available in Deno DeployEA :::info Deno DeployEA is an early access product, and as such is not currently covered by our regular service level agreements. ::: The Deno company is now using Deno DeployEA to host our own websites and is putting significant efforts into ensuring service reliability. However, as this is a new system, occasional service interruptions may occur. While Deno DeployEA is in closed beta, we are not charging for usage of the platform. However, the [Acceptable Use Policy](/deploy/manual/acceptable-use-policy/) and [Terms and Conditions](/deploy/manual/terms-and-conditions/) still apply, and we reserve the right to terminate any user, organization, or app that we find to be in violation of these terms. --- # deploy/index.md URL: https://docs.deno.com/deploy/ --- # Backups URL: https://docs.deno.com/deploy/kv/manual/backup KV databases hosted on Deno Deploy can be continuously backed up to your own S3-compatible storage buckets. This is in addition to the replication and backups that we internally perform for all data stored in hosted Deno KV databases to ensure high availability and data durability. This backup happens continuously with very little lag, enabling _[point-in-time-recovery](https://en.wikipedia.org/wiki/Point-in-time_recovery)_ and live replication.
Enabling backup for KV databases unlocks various interesting use-cases: - Retrieving a consistent snapshot of your data at any point in time in the past - Running a read-only data replica independent of Deno Deploy - Pushing data into your favorite data pipeline by piping mutations into streaming platforms and analytical databases like Kafka, BigQuery and ClickHouse ## Configuring backup to Amazon S3 First you must create a bucket on AWS: 1. Go to the [AWS S3 console](https://s3.console.aws.amazon.com/s3/home) 2. Click "Create bucket" 3. Enter a bucket name and choose an AWS region, then scroll down and click "Next" Or, using the AWS CLI: 1. Install the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html) 2. Run `aws s3api create-bucket --bucket <bucket-name> --region <region> --create-bucket-configuration LocationConstraint=<region>` (replace `<bucket-name>` and `<region>` with your own values) Then, create an IAM policy with `PutObject` access to the bucket, attach it to an IAM user, and create access keys for that user: 1. Go to the [AWS IAM console](https://console.aws.amazon.com/iam/home) 2. Click "Policies" in the left sidebar 3. Click on "Create policy" 4. Select the "JSON" tab in the policy editor and paste the following policy: ```json { "Version": "2012-10-17", "Statement": [ { "Sid": "KVBackup", "Effect": "Allow", "Action": "s3:PutObject", "Resource": "arn:aws:s3:::<bucket-name>/*" } ] } ``` Replace `<bucket-name>` with the name of the bucket you created earlier. 5. Click "Review policy" 6. Enter a name for the policy and click "Create policy" 7. Click "Users" in the left sidebar 8. Click "Add user" 9. Enter a name for the user and click "Next" 10. Click "Attach policies directly" 11. Search for the policy you created earlier and click the checkbox next to it 12. Click "Next" 13. Click "Create user" 14. Click on the user you just created 15. Click "Security credentials" and then "Create access key" 16. Select "Other", then click "Next" 17. Enter a description for the access key and click "Create access key" 18.
Copy the access key ID and secret access key and save them somewhere safe. You will need them later, and you will not be able to retrieve them again. Or, using the AWS CLI: 1. Copy the following command to your terminal, and replace `<policy-name>` with a name for the policy and `<bucket-name>` with the name of the bucket you created earlier, then run it: ``` aws iam create-policy --policy-name <policy-name> --policy-document '{"Version":"2012-10-17","Statement":[{"Sid":"KVBackup","Effect":"Allow","Action":"s3:PutObject","Resource":"arn:aws:s3:::<bucket-name>/*"}]}' ``` 2. Copy the following command to your terminal, and replace `<user-name>` with a name for the user you are creating, then run it: ``` aws iam create-user --user-name <user-name> ``` 3. Copy the following command to your terminal, and replace `<policy-arn>` with the ARN of the policy you created in step 1, and `<user-name>` with the name of the user you created in the previous step, then run it: ``` aws iam attach-user-policy --policy-arn <policy-arn> --user-name <user-name> ``` 4. Copy the following command to your terminal, and replace `<user-name>` with the name of the user you created in step 2, then run it: ``` aws iam create-access-key --user-name <user-name> ``` 5. Copy the access key ID and secret access key and save them somewhere safe. You will need them later, and you will not be able to retrieve them again. Now visit the [Deno Deploy dashboard](https://dash.deno.com), and click on the "KV" tab in your project. Scroll to the "Backup" section, and click on "AWS S3". Enter the bucket name, access key ID, and secret access key you created earlier, and the region the bucket is in. Then click "Save". The backup will start immediately. Once the data has been backed up, and continuous backup is active, you will see the status change to "Active". ## Configuring backup to Google Cloud Storage Google Cloud Storage (GCS) is compatible with the S3 protocol, and can also be used as a backup target. First you must create a bucket on GCP: 1. Go to the [GCP Cloud Storage console](https://console.cloud.google.com/storage/browser) 2. Click on "Create" in the top bar 3.
Enter a bucket name, choose a location, and click "Create" Or, using the gcloud CLI: 1. Install the [gcloud CLI](https://cloud.google.com/sdk/docs/install) 2. Run `gcloud storage buckets create gs://<bucket-name> --location <location>` (replace `<bucket-name>` and `<location>` with your own values) Then, create a service account with `Storage Object Admin` access to the bucket, and create an HMAC access key for the service account: 1. Go to the [GCP IAM console](https://console.cloud.google.com/iam-admin/iam) 2. Click on "Service accounts" in the left sidebar 3. Click on "Create service account" 4. Enter a name for the service account and click "Done" 5. Copy the email for the service account you just created. You will need it later. 6. Go to the [GCP Cloud Storage console](https://console.cloud.google.com/storage/browser) 7. Click on the bucket you created earlier 8. Click on "Permissions" in the toolbar 9. Click "Grant access" 10. Paste the email for the service account you copied earlier into the "New principals" field 11. Select "Storage Object Admin" from the "Select a role" dropdown 12. Click "Save" 13. Click on "Settings" in the left sidebar (still in the Cloud Storage console) 14. Click on the "Interoperability" tab 15. Click on "Create a key for a service account" 16. Select the service account you created earlier 17. Click "Create key" 18. Copy the access key and secret access key and save them somewhere safe. You will need them later, and you will not be able to retrieve them again. Or, using the gcloud CLI: 1. Run the following command, replacing `<service-account-name>` with a name for the service account you are creating: ``` gcloud iam service-accounts create <service-account-name> ``` 2. Run the following command, replacing `<bucket-name>` with the name of the bucket you created earlier, and `<service-account-email>` with the email of the service account you created in the previous step: ``` gsutil iam ch serviceAccount:<service-account-email>:objectAdmin gs://<bucket-name> ``` 3. Run the following command, replacing `<service-account-email>` with the email of the service account you created in the previous step: ``` gcloud storage hmac create <service-account-email> ``` 4.
Copy the `accessId` and `secret` and save them somewhere safe. You will need them later, and you will not be able to retrieve them again. Now visit the [Deno Deploy dashboard](https://dash.deno.com), and click on the "KV" tab in your project. Scroll to the "Backup" section, and click on "Google Cloud Storage". Enter the bucket name, access key ID, and secret access key you created earlier, and the region the bucket is in. Then click "Save". The backup will start immediately. Once the data has been backed up, and continuous backup is active, you will see the status change to "Active". ## Using backups S3 backups can be used with the `denokv` tool. Please refer to the [documentation](https://github.com/denoland/denokv) for more details. --- # Scheduling cron tasks URL: https://docs.deno.com/deploy/kv/manual/cron The [`Deno.cron`](https://docs.deno.com/api/deno/~/Deno.cron) interface enables you to configure JavaScript or TypeScript code that executes on a configurable schedule using [cron syntax](https://en.wikipedia.org/wiki/Cron). In the example below, we configure a block of JavaScript code that will execute every minute. ```ts Deno.cron("Log a message", "* * * * *", () => { console.log("This will print once a minute."); }); ``` It's also possible to use JavaScript objects to define the cron schedule. In the example below, we configure a block of JavaScript code that will execute once an hour. ```ts Deno.cron("Log a message", { hour: { every: 1 } }, () => { console.log("This will print once an hour."); }); ``` `Deno.cron` takes three arguments: - A human-readable name for the cron task - A cron schedule string or JavaScript object that defines a schedule on which the cron job will run - a function to be executed on the given schedule If you are new to cron syntax, there are a number of third party modules [like this one](https://www.npmjs.com/package/cron-time-generator) that will help you generate cron schedule strings. 
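For example, a tiny hypothetical helper along those lines (`cronString` and `CronParts` are illustrative names, not a Deno API), assembling a five-field schedule string from named parts:

```typescript
// Hypothetical helper: build a 5-field cron string from named parts,
// defaulting each unspecified field to "*".
interface CronParts {
  minute?: string;
  hour?: string;
  dayOfMonth?: string;
  month?: string;
  dayOfWeek?: string;
}

function cronString(p: CronParts): string {
  return [
    p.minute ?? "*",
    p.hour ?? "*",
    p.dayOfMonth ?? "*",
    p.month ?? "*",
    p.dayOfWeek ?? "*",
  ].join(" ");
}

console.log(cronString({ minute: "0", hour: "*/3" })); // "0 */3 * * *"
```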
## Retrying failed runs Failed cron invocations are automatically retried with a default retry policy. If you would like to specify a custom retry policy, you can use the `backoffSchedule` property to specify an array of wait times (in milliseconds) to wait before retrying the function call again. In the following example, we will attempt to retry failed callbacks three times - after one second, five seconds, and then ten seconds. ```ts Deno.cron("Retry example", "* * * * *", { backoffSchedule: [1000, 5000, 10000], }, () => { throw new Error("Deno.cron will retry this three times, to no avail!"); }); ``` ## Design and limitations Below are some design details and limitations to be aware of when using `Deno.cron`. ### Tasks must be defined at the top level module scope The [`Deno.cron`](https://docs.deno.com/api/deno/~/Deno.cron) interface is designed to support static definition of cron tasks based on pre-defined schedules. All `Deno.cron` tasks must be defined at the top-level of a module. Any nested `Deno.cron` definitions (e.g. inside [`Deno.serve`](https://docs.deno.com/api/deno/~/Deno.serve) handler) will result in an error or will be ignored. If you need to schedule tasks dynamically during your Deno program execution, you can use the [Deno Queues](./queue_overview) APIs. ### Time zone `Deno.cron` schedules are specified using UTC time zone. This helps avoid issues with time zones which observe daylight saving time. ### Overlapping executions It's possible for the next scheduled invocation of your cron task to overlap with the previous invocation. If this occurs, `Deno.cron` will skip the next scheduled invocation in order to avoid overlapping executions. ### Day-of-week numeric representation `Deno.cron` does not use 0-based day-of-week numeric representation. Instead, it uses 1-7 (or SUN-SAT) to represent Sunday through Saturday. This may be different compared to other cron engines which use 0-6 representation. 
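To make the off-by-one concrete, here is a one-line illustrative conversion from JavaScript's `Date#getUTCDay()` numbering (0-6, Sunday first) to the 1-7 numbering described above:

```typescript
// JS Dates use 0-6 (SUN-SAT); Deno.cron uses 1-7 (SUN-SAT).
const toCronDay = (jsDay: number): number => jsDay + 1;

// 2024-01-07 was a Sunday: getUTCDay() reports 0, Deno.cron's numbering is 1.
console.log(toCronDay(new Date("2024-01-07T00:00:00Z").getUTCDay())); // 1
```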
## Usage on Deno Deploy With [Deno Deploy](https://deno.com/deploy), you can run your background tasks on V8 isolates in the cloud. When doing so, there are a few considerations to keep in mind. ### Differences with Deno CLI Like other Deno runtime built-ins (like queues and Deno KV), the `Deno.cron` implementation works slightly differently on Deno Deploy. #### How cron works by default The implementation of `Deno.cron` in the Deno runtime keeps execution state in-memory. If you run multiple Deno programs that use `Deno.cron`, each program will have its own independent set of cron tasks. #### How cron works on Deno Deploy Deno Deploy provides a serverless implementation of `Deno.cron` that is designed for high availability and scale. Deno Deploy automatically extracts your `Deno.cron` definitions at deployment time, and schedules them for execution using on-demand isolates. Your latest production deployment defines the set of active cron tasks that are scheduled for execution. To add, remove, or modify cron tasks, simply modify your code and create a new production deployment. Deno Deploy guarantees that your cron tasks are executed at least once per each scheduled time interval. This generally means that your cron handler will be invoked once per scheduled time. In some failure scenarios, the handler may be invoked multiple times for the same scheduled time. ### Cron dashboard When you make a production deployment that includes a cron task, you can view a list of all your cron tasks in the [Deploy dashboard](https://dash.deno.com/projects) under the `Cron` tab for your project. ![a listing of cron tasks in the Deno dashboard](./images/cron-tasks.png) ### Pricing `Deno.cron` invocations are charged at the same rate as inbound HTTP requests to your deployments. Learn more about pricing [here](https://deno.com/deploy/pricing). 
### Deploy-specific limitations - `Deno.cron` is only available for production deployments (not preview deployments) - The exact invocation time of your `Deno.cron` handler may vary by up to a minute from the scheduled time ## Cron configuration examples Here are a few common cron configurations, provided for your convenience. ```ts title="Run once a minute" Deno.cron("Run once a minute", "* * * * *", () => { console.log("Hello, cron!"); }); ``` ```ts title="Run every fifteen minutes" Deno.cron("Run every fifteen minutes", "*/15 * * * *", () => { console.log("Hello, cron!"); }); ``` ```ts title="Run once an hour on the hour" Deno.cron("Run once an hour on the hour", "0 * * * *", () => { console.log("Hello, cron!"); }); ``` ```ts title="Run every three hours" Deno.cron("Run every three hours", "0 */3 * * *", () => { console.log("Hello, cron!"); }); ``` ```ts title="Run every day at 1am" Deno.cron("Run every day at 1am", "0 1 * * *", () => { console.log("Hello, cron!"); }); ``` ```ts title="Run every Wednesday at midnight" Deno.cron("Run every Wednesday at midnight", "0 0 * * WED", () => { console.log("Hello, cron!"); }); ``` ```ts title="Run on the first of the month at midnight" Deno.cron("Run on the first of the month at midnight", "0 0 1 * *", () => { console.log("Hello, cron!"); }); ``` --- # Data Modeling in TypeScript URL: https://docs.deno.com/deploy/kv/manual/data_modeling_typescript In TypeScript applications, it is usually desirable to create strongly-typed, well-documented objects to contain the data that your application operates on. Using [interfaces](https://www.typescriptlang.org/docs/handbook/2/objects.html) or [classes](https://www.typescriptlang.org/docs/handbook/2/classes.html), you can describe both the shape and behavior of objects in your programs. If you are using Deno KV, however, there is a bit of extra work required to persist and retrieve objects that are strongly typed. 
In this guide, we'll cover strategies for working with strongly typed objects going into and back out from Deno KV. ## Using interfaces and type assertions When storing and retrieving application data in Deno KV, you might want to begin by describing the shape of your data using TypeScript interfaces. Below is an object model which describes some key components of a blogging system: ```ts title="model.ts" export interface Author { username: string; fullName: string; } export interface Post { slug: string; title: string; body: string; author: Author; createdAt: Date; updatedAt: Date; } ``` This object model describes a blog post and an associated author. With Deno KV, you can use these TypeScript interfaces like [data transfer objects (DTOs)](https://martinfowler.com/bliki/LocalDTO.html) - a strongly typed wrapper around the otherwise untyped objects you might send to or receive from Deno KV. Without any additional work, you can happily store the contents of one of these DTOs in Deno KV. ```ts import { Author } from "./model.ts"; const kv = await Deno.openKv(); const a: Author = { username: "acdoyle", fullName: "Arthur Conan Doyle", }; await kv.set(["authors", a.username], a); ``` When retrieving this same object from Deno KV, however, it won't by default have type information associated with it. If you know the shape of the object that was stored for the key, however, you can use [type assertion](https://www.typescriptlang.org/docs/handbook/2/everyday-types.html#type-assertions) to inform the TypeScript compiler about the shape of an object. 
```ts
import { Author } from "./model.ts";

const kv = await Deno.openKv();

const r = await kv.get(["authors", "acdoyle"]);
const ac = r.value as Author;

console.log(ac.fullName);
```

You can also specify an optional
[type parameter](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.get) for
`get`:

```ts
import { Author } from "./model.ts";

const kv = await Deno.openKv();

const r = await kv.get<Author>(["authors", "acdoyle"]);

console.log(r.value.fullName);
```

For simpler data structures, this technique may be sufficient. But often, you
will want or need to apply some business logic when creating or accessing your
domain objects. When this need arises, you can develop a set of pure functions
that can operate on your DTOs.

## Encapsulating business logic with a service layer

When your application's persistence needs become more complex - such as when you
need to create [secondary indexes](./secondary_indexes) to query your data by
different keys, or maintain relationships between objects - you will want to
create a set of functions to sit on top of your DTOs to ensure that the data
being passed around is valid (and not merely typed correctly).

From our business objects above, the `Post` object is complex enough that it is
likely to need a small layer of code to save and retrieve an instance of the
object. Below is an example of two functions that wrap the underlying Deno KV
APIs, and return strongly typed object instances for the `Post` interface.

Notably, we need to store an identifier for an `Author` object, so we can
retrieve author information from KV later.
```ts
import { Author, Post } from "./model.ts";

const kv = await Deno.openKv();

interface RawPost extends Post {
  authorUsername: string;
}

export async function savePost(p: Post): Promise<Post> {
  const postData: RawPost = Object.assign({}, p, {
    authorUsername: p.author.username,
  });

  await kv.set(["posts", p.slug], postData);
  return p;
}

export async function getPost(slug: string): Promise<Post> {
  const postResponse = await kv.get(["posts", slug]);
  const rawPost = postResponse.value as RawPost;

  const authorResponse = await kv.get(["authors", rawPost.authorUsername]);
  const author = authorResponse.value as Author;

  const post = Object.assign({}, postResponse.value, {
    author,
  }) as Post;

  return post;
}
```

This thin layer uses a `RawPost` interface, which extends the actual `Post`
interface, to include some additional data that is used to reference data at
another index (the associated `Author` object).

The `savePost` and `getPost` functions take the place of a direct Deno KV `get`
or `set` operation, so that they can properly serialize and "hydrate" model
objects for us with appropriate types and associations.

---

# Deno KV Quick Start

URL: https://docs.deno.com/deploy/kv/manual/

**Deno KV** is a
[key-value database](https://en.wikipedia.org/wiki/Key%E2%80%93value_database)
built directly into the Deno runtime, available in the
[`Deno.Kv` namespace](https://docs.deno.com/api/deno/~/Deno.Kv). It can be used
for many kinds of data storage use cases, but excels at storing simple data
structures that benefit from very fast reads and writes. Deno KV is available in
the Deno CLI and on [Deno Deploy](./on_deploy).

:::caution
Deno KV is still in development and may change. To use it, you must pass the
`--unstable-kv` flag to Deno.
:::

Let's walk through the key features of Deno KV.

## Opening a database

In your Deno program, you can get a reference to a KV database using
[`Deno.openKv()`](https://docs.deno.com/api/deno/~/Deno.openKv).
You may pass in an optional file system path to where you'd like to store your
database, otherwise one will be created for you based on the current working
directory of your script.

```ts
const kv = await Deno.openKv();
```

## Creating, updating, and reading a key-value pair

Data in Deno KV is stored as key-value pairs, much like properties of a
JavaScript object literal or a
[Map](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map).
[Keys](./key_space) are represented as an array of JavaScript types, like
`string`, `number`, `bigint`, or `boolean`. Values can be arbitrary JavaScript
objects. In this example, we create a key-value pair representing a user's UI
preferences, and save it with
[`kv.set()`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.set).

```ts
const kv = await Deno.openKv();

const prefs = {
  username: "ada",
  theme: "dark",
  language: "en-US",
};

const result = await kv.set(["preferences", "ada"], prefs);
```

Once a key-value pair is set, you can read it from the database with
[`kv.get()`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.get):

```ts
const entry = await kv.get(["preferences", "ada"]);
console.log(entry.key);
console.log(entry.value);
console.log(entry.versionstamp);
```

Both `get` and `list` [operations](./operations) return a
[KvEntry](https://docs.deno.com/api/deno/~/Deno.KvEntry) object with the
following properties:

- `key` - the array key you used to set the value
- `value` - the JavaScript object you set for this key
- `versionstamp` - a generated value used to determine if a key has been
  updated.

The `set` operation is also used to update objects that already exist for a
given key. When a key's value is updated, its `versionstamp` will change to a
new generated value.

## Listing several key-value pairs

To get values for a finite number of keys, you may use
[`kv.getMany()`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.getMany).
Pass in several keys as arguments, and you'll receive an array of values for
each key. Note that **values and versionstamps can be `null`** if no value
exists for the given key(s).

```ts
const kv = await Deno.openKv();
const result = await kv.getMany([
  ["preferences", "ada"],
  ["preferences", "grace"],
]);
result[0].key; // ["preferences", "ada"]
result[0].value; // { ... }
result[0].versionstamp; // "00000000000000010000"
result[1].key; // ["preferences", "grace"]
result[1].value; // null
result[1].versionstamp; // null
```

Often, it is useful to retrieve a list of key-value pairs from all keys that
share a given prefix. This type of operation is possible using
[`kv.list()`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.list). In this
example, we get a list of key-value pairs that share the `"preferences"` prefix.

```ts
const kv = await Deno.openKv();
const entries = kv.list({ prefix: ["preferences"] });
for await (const entry of entries) {
  console.log(entry.key); // ["preferences", "ada"]
  console.log(entry.value); // { ... }
  console.log(entry.versionstamp); // "00000000000000010000"
}
```

Returned keys are ordered lexicographically based on the next component of the
key after the prefix. So KV pairs with these keys:

- `["preferences", "ada"]`
- `["preferences", "bob"]`
- `["preferences", "cassie"]`

Will be returned in that order by `kv.list()`.

Read operations can either be performed in
[**strong or eventual consistency mode**](./operations). Strong consistency mode
guarantees that the read operation will return the most recently written value.
Eventual consistency mode may return a stale value, but is faster. By contrast,
writes are always performed in strong consistency mode.

## Deleting key-value pairs

You can delete a key from the database using
[`kv.delete()`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.delete). No
action is taken if no value is found for the given key.
```ts
const kv = await Deno.openKv();
await kv.delete(["preferences", "alan"]);
```

## Atomic transactions

Deno KV is capable of executing [atomic transactions](./transactions), which
enables you to conditionally execute one or many data manipulation operations at
once. In the following example, we create a new preferences object only if it
hasn't been created already.

```ts
const kv = await Deno.openKv();

const key = ["preferences", "alan"];
const value = {
  username: "alan",
  theme: "light",
  language: "en-GB",
};

const res = await kv.atomic()
  .check({ key, versionstamp: null }) // `null` versionstamps mean 'no value'
  .set(key, value)
  .commit();
if (res.ok) {
  console.log("Preferences did not yet exist. Inserted!");
} else {
  console.error("Preferences already exist.");
}
```

Learn more about transactions in Deno KV [here](./transactions).

## Improve querying with secondary indexes

[Secondary indexes](./secondary_indexes) store the same data by multiple keys,
allowing for simpler queries of the data you need. Let's say that we need to be
able to access user preferences by both username AND email. To enable this, you
could provide a function that wraps the logic to save the preferences to create
two indexes.

```ts
const kv = await Deno.openKv();

async function savePreferences(prefs) {
  const key = ["preferences", prefs.username];

  // Set the primary key
  const r = await kv.set(key, prefs);

  // Set the secondary key's value to be the primary key
  await kv.set(["preferencesByEmail", prefs.email], key);

  return r;
}

async function getByUsername(username) {
  // Use as before...
  const r = await kv.get(["preferences", username]);
  return r;
}

async function getByEmail(email) {
  // Look up the key by email, then second lookup for actual data
  const r1 = await kv.get(["preferencesByEmail", email]);
  const r2 = await kv.get(r1.value);
  return r2;
}
```

Learn more about
[secondary indexes in the manual here](./secondary_indexes).
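To see the indirection in isolation, here is a toy sketch of the same pattern
using plain `Map`s instead of Deno KV (the `Prefs` shape and function names are
illustrative, not part of any API). The key idea is that the secondary index
stores the primary key rather than a copy of the data, so the primary record
remains the single source of truth:

```typescript
interface Prefs {
  username: string;
  email: string;
  theme: string;
}

const primary = new Map<string, Prefs>(); // keyed by username
const byEmail = new Map<string, string>(); // email -> username (the primary key)

function savePreferences(prefs: Prefs): void {
  primary.set(prefs.username, prefs); // primary record
  byEmail.set(prefs.email, prefs.username); // secondary index -> primary key
}

function getByEmail(email: string): Prefs | undefined {
  const username = byEmail.get(email); // first lookup: resolve the primary key
  if (username === undefined) return undefined;
  return primary.get(username); // second lookup: fetch the actual data
}

savePreferences({ username: "ada", email: "ada@example.com", theme: "dark" });
console.log(getByEmail("ada@example.com")?.theme); // "dark"
```

Because the secondary index holds only a reference, updating a user's
preferences through the primary key never leaves a stale copy behind in the
index - the same property the KV version above has.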
## Watching for updates in Deno KV

You can also listen for updates from Deno KV with `kv.watch()`, which will emit
a new value or values of the key or keys you provide. In the below chat example,
we watch for updates on the key `["last_message_id", roomId]`. We retrieve
`messageId`, which we then use with `kv.list()` to grab all the new messages
between `seen` and `messageId`. Note that `kv.watch()` yields an array of
entries (one per watched key), so we destructure the single entry and read its
`value`.

```ts
let seen = "";
for await (const [entry] of kv.watch([["last_message_id", roomId]])) {
  const messageId = entry.value;
  const newMessages = await Array.fromAsync(kv.list({
    start: ["messages", roomId, seen, ""],
    end: ["messages", roomId, messageId, ""],
  }));
  await websocket.write(JSON.stringify(newMessages));
  seen = messageId;
}
```

Learn more about [using Deno KV watch here](./operations#watch).

## Production usage

Deno KV is available for use in live applications on [Deno Deploy](./on_deploy).
In production, Deno KV is backed by
[FoundationDB](https://www.foundationdb.org/), the open source key-value store
created by Apple.

**No additional configuration is necessary** to run your Deno programs that use
KV on Deploy - a new Deploy database will be provisioned for you when required
by your code. Learn more about Deno KV on Deno Deploy [here](./on_deploy).

## Testing

By default, [`Deno.openKv()`](https://docs.deno.com/api/deno/~/Deno.openKv)
creates or opens a persistent store based on the path from which the script that
invoked it was run. This isn't usually desirable for tests, which need to
produce the same behavior when run many times in a row.

To test code that uses Deno KV, you can use the special argument `":memory:"` to
create an ephemeral Deno KV datastore.
```ts
import { assertEquals } from "jsr:@std/assert";

async function setDisplayName(
  kv: Deno.Kv,
  username: string,
  displayname: string,
) {
  await kv.set(["preferences", username, "displayname"], displayname);
}

async function getDisplayName(
  kv: Deno.Kv,
  username: string,
): Promise<string | null> {
  return (await kv.get(["preferences", username, "displayname"]))
    .value as string | null;
}

Deno.test("Preferences", async (t) => {
  const kv = await Deno.openKv(":memory:");

  await t.step("can set displayname", async () => {
    const displayName = await getDisplayName(kv, "example");
    assertEquals(displayName, null);

    await setDisplayName(kv, "example", "Exemplary User");

    const updatedDisplayName = await getDisplayName(kv, "example");
    assertEquals(updatedDisplayName, "Exemplary User");
  });
});
```

This works because Deno KV is backed by SQLite when run for local development.
Just like in-memory SQLite databases, multiple ephemeral Deno KV stores can
exist at once without interfering with one another. For more information about
special database addressing modes, see
[the SQLite docs on the topic](https://www.sqlite.org/inmemorydb.html).

## Next steps

At this point, you're just beginning to scratch the surface with Deno KV. Be
sure to check out our guide on the [Deno KV key space](./key_space), and a
collection of [tutorials and example applications](../tutorials/index.md) here.

---

# Key Expiration (TTL for keys)

URL: https://docs.deno.com/deploy/kv/manual/key_expiration

Since version 1.36.2, Deno KV supports key expiration, allowing developers to
control time to live (TTL) for keys in a KV database. This allows an expiration
timestamp to be associated with a key, after which the key will be automatically
deleted from the database:

```ts
const kv = await Deno.openKv();

// `expireIn` is the number of milliseconds after which the key will expire.
async function addSession(session: Session, expireIn: number) {
  await kv.set(["sessions", session.id], session, { expireIn });
}
```

Key expiration is supported on both Deno CLI and Deno Deploy.
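Note that `expireIn` is a relative duration in milliseconds, not an absolute
timestamp. If you track expiry as a wall-clock `Date`, a small conversion helper
can prevent mixing the two up (an illustrative sketch; `expireInUntil` is not
part of the Deno API):

```typescript
// Convert an absolute expiry time into the relative `expireIn` duration
// (in milliseconds) that `kv.set` expects.
function expireInUntil(expiresAt: Date, now: Date = new Date()): number {
  const ms = expiresAt.getTime() - now.getTime();
  if (ms <= 0) {
    throw new RangeError("expiresAt must be in the future");
  }
  return ms;
}

// Example: a session that should expire at a fixed point in time.
const expiresAt = new Date(Date.now() + 60 * 60 * 1000); // one hour from now
const expireIn = expireInUntil(expiresAt);
// await kv.set(["sessions", sessionId], session, { expireIn });
```

Computing the duration at the last moment before calling `kv.set` keeps the
stored TTL accurate even if your code did other work between deciding on the
expiry time and performing the write.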
## Atomic expiration of multiple keys

If multiple keys are set in the same atomic operation and have the same
`expireIn` value, the expiration of those keys will be atomic. For example:

```ts
const kv = await Deno.openKv();

async function addUnverifiedUser(
  user: User,
  verificationToken: string,
  expireIn: number,
) {
  await kv.atomic()
    .set(["users", user.id], user, { expireIn })
    .set(["verificationTokens", verificationToken], user.id, { expireIn })
    .commit();
}
```

## Caveats

The expire timestamp specifies the _earliest_ time after which the key can be
deleted from the database. An implementation is allowed to expire a key at any
time after the specified timestamp, but not before. If you need to strictly
enforce an expiration time (e.g. for security purposes), please also add it as a
field of your value and do a check after retrieving the value from the database.

---

# Key Space

URL: https://docs.deno.com/deploy/kv/manual/key_space

Deno KV is a key value store. The key space is a flat namespace of
key+value+versionstamp pairs. Keys are sequences of key parts, which allow
modeling of hierarchical data. Values are arbitrary JavaScript objects.
Versionstamps represent when a value was inserted / modified.

## Keys

Keys in Deno KV are sequences of key parts, which can be `string`s, `number`s,
`boolean`s, `Uint8Array`s, or `bigint`s.

Using a sequence of parts, rather than a single string, eliminates the
possibility of delimiter injection attacks, because there is no visible
delimiter.

> A key injection attack occurs when an attacker manipulates the structure of a
> key-value store by injecting delimiters used in the key encoding scheme into a
> user controlled variable, leading to unintended behavior or unauthorized
> access. For example, consider a key-value store using a slash (/) as a
> delimiter, with keys like "users/alice/settings" and "users/bob/settings".
> An attacker could create a new user with the name "alice/settings/hacked" to
> form the key "users/alice/settings/hacked/settings", injecting the delimiter
> and manipulating the key structure. In Deno KV, the injection would result in
> the key `["users", "alice/settings/hacked", "settings"]`, which is not
> harmful.

Between key parts, invisible delimiters are used to separate the parts. These
delimiters are never visible, but ensure that one part can not be confused with
another part. For example, the key parts `["abc", "def"]`, `["ab", "cdef"]`, and
`["abc", "", "def"]` are all different keys.

Keys are case sensitive and are ordered lexicographically by their parts. The
first part is the most significant, and the last part is the least significant.
The order of the parts is determined by both the type and the value of the part.

### Key Part Ordering

Key parts are ordered lexicographically by their type, and within a given type,
they are ordered by their value. The ordering of types is as follows:

1. `Uint8Array`
1. `string`
1. `bigint`
1. `number`
1. `boolean`

Within a given type, the ordering is:

- `Uint8Array`: byte ordering of the array
- `string`: byte ordering of the UTF-8 encoding of the string
- `bigint`: mathematical ordering, largest negative number first, largest
  positive number last
- `number`: -Infinity < -1.0 < -0.5 < -0.0 < 0.0 < 0.5 < 1.0 < Infinity < NaN
- `boolean`: false < true

This means that the part `1.0` (a number) is ordered before the part `2.0` (also
a number), but is greater than the part `0n` (a bigint), because `1.0` is a
number and `0n` is a bigint, and type ordering has precedence over the ordering
of values within a type.
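To make the ordering rules concrete, here is an illustrative comparator for
individual key parts (a sketch only - Deno KV actually compares the encoded key
bytes directly; the cross-type rank below follows the `1.0` vs `0n` example
above, where a bigint part sorts before a number part):

```typescript
type KeyPart = Uint8Array | string | bigint | number | boolean;

// Cross-type rank: Uint8Array < string < bigint < number < boolean.
function typeRank(p: KeyPart): number {
  if (p instanceof Uint8Array) return 0;
  switch (typeof p) {
    case "string":
      return 1;
    case "bigint":
      return 2;
    case "number":
      return 3;
    default:
      return 4; // boolean
  }
}

function compareKeyParts(a: KeyPart, b: KeyPart): number {
  const ra = typeRank(a);
  const rb = typeRank(b);
  if (ra !== rb) return ra - rb; // type ordering takes precedence

  if (a instanceof Uint8Array && b instanceof Uint8Array) {
    // Byte ordering of the arrays.
    const len = Math.min(a.length, b.length);
    for (let i = 0; i < len; i++) {
      if (a[i] !== b[i]) return a[i] - b[i];
    }
    return a.length - b.length;
  }
  if (typeof a === "number" && typeof b === "number") {
    // NaN sorts after every other number. (-0.0 vs 0.0 is not
    // distinguished in this sketch.)
    if (Number.isNaN(a)) return Number.isNaN(b) ? 0 : 1;
    if (Number.isNaN(b)) return -1;
    return a < b ? -1 : a > b ? 1 : 0;
  }
  if (typeof a === "bigint" && typeof b === "bigint") {
    return a < b ? -1 : a > b ? 1 : 0; // mathematical ordering
  }
  if (typeof a === "string" && typeof b === "string") {
    // Approximation: UTF-16 code unit order, which matches UTF-8 byte
    // order for most strings.
    return a < b ? -1 : a > b ? 1 : 0;
  }
  // booleans: false < true
  return Number(a) - Number(b);
}
```

Full keys then compare part by part, most significant part first, just like
comparing words letter by letter.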
### Key Examples

```js
["users", 42, "profile"]; // User with ID 42's profile
["posts", "2023-04-23", "comments"]; // Comments for all posts on 2023-04-23
["products", "electronics", "smartphones", "apple"]; // Apple smartphones in the electronics category
["orders", 1001, "shipping", "tracking"]; // Tracking information for order ID 1001
["files", new Uint8Array([1, 2, 3]), "metadata"]; // Metadata for a file with Uint8Array identifier
["projects", "openai", "tasks", 5]; // Task with ID 5 in the OpenAI project
["events", "2023-03-31", "location", "san_francisco"]; // Events in San Francisco on 2023-03-31
["invoices", 2023, "Q1", "summary"]; // Summary of Q1 invoices for 2023
["teams", "engineering", "members", 1n]; // Member with ID 1n in the engineering team
```

### Universally Unique Lexicographically Sortable Identifiers (ULIDs)

Key part ordering allows keys consisting of timestamps and ID parts to be listed
chronologically. Typically, you can generate a key using
[`Date.now()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/now)
and
[`crypto.randomUUID()`](https://developer.mozilla.org/en-US/docs/Web/API/Crypto/randomUUID):

```js
async function setUser(user) {
  await kv.set(["users", Date.now(), crypto.randomUUID()], user);
}
```

Run multiple times sequentially, this produces the following keys:

```js
["users", 1691377037923, "8c72fa25-40ad-42ce-80b0-44f79bc7a09e"]; // First user
["users", 1691377037924, "8063f20c-8c2e-425e-a5ab-d61e7a717765"]; // Second user
["users", 1691377037925, "35310cea-58ba-4101-b09a-86232bf230b2"]; // Third user
```

However, having the timestamp and ID represented within a single key part may be
more straightforward in some cases. You can use a
[Universally Unique Lexicographically Sortable Identifier (ULID)](https://github.com/ulid/spec)
to do this.
This type of identifier encodes a UTC timestamp, is lexicographically sortable,
and is cryptographically random by default:

```js
import { ulid } from "jsr:@std/ulid";

const kv = await Deno.openKv();

async function setUser(user) {
  await kv.set(["users", ulid()], user);
}
```

```js
["users", "01H76YTWK3YBV020S6MP69TBEQ"]; // First user
["users", "01H76YTWK4V82VFET9YTYDQ0NY"]; // Second user
["users", "01H76YTWK5DM1G9TFR0Y5SCZQV"]; // Third user
```

Furthermore, you can generate monotonically increasing ULIDs using the
`monotonicUlid` function:

```js
import { monotonicUlid } from "jsr:@std/ulid";

async function setUser(user) {
  await kv.set(["users", monotonicUlid()], user);
}
```

```js
// Strict ordering for the same timestamp by incrementing the
// least-significant random bit by 1
["users", "01H76YTWK3YBV020S6MP69TBEQ"]; // First user
["users", "01H76YTWK3YBV020S6MP69TBER"]; // Second user
["users", "01H76YTWK3YBV020S6MP69TBES"]; // Third user
```

## Values

Values in Deno KV can be arbitrary JavaScript values that are compatible with
the [structured clone algorithm][structured clone algorithm]. This includes:

- `undefined`
- `null`
- `boolean`
- `number`
- `string`
- `bigint`
- `Uint8Array`
- `Array`
- `Object`
- `Map`
- `Set`
- `Date`
- `RegExp`

Objects and arrays can contain any of the above types, including other objects
and arrays. `Map`s and `Set`s can contain any of the above types, including
other `Map`s and `Set`s. Circular references within values are supported.

Objects with a non-primitive prototype are not supported (such as class
instances or Web API objects). Functions and symbols can also not be serialized.

### `Deno.KvU64` type

In addition to structured serializable values, the special value `Deno.KvU64` is
also supported as a value. This object represents a 64-bit unsigned integer,
represented as a bigint. It can be used with the `sum`, `min`, and `max` KV
operations. It can not be stored within an object or array. It must be stored as
a top-level value.
It can be created with the `Deno.KvU64` constructor:

```js
const u64 = new Deno.KvU64(42n);
```

### Value Examples

```js,ignore
undefined;
null;
true;
false;
42;
-42.5;
42n;
"hello";
new Uint8Array([1, 2, 3]);
[1, 2, 3];
{ a: 1, b: 2, c: 3 };
new Map([["a", 1], ["b", 2], ["c", 3]]);
new Set([1, 2, 3]);
new Date("2023-04-23");
/abc/;

// Circular references are supported
const a = {};
const b = { a };
a.b = b;

// Deno.KvU64 is supported
new Deno.KvU64(42n);
```

## Versionstamp

All data in the Deno KV key-space is versioned. Every time a value is inserted
or modified, a versionstamp is assigned to it. Versionstamps are monotonically
increasing, non-sequential, 12 byte values that represent the time that the
value was modified. Versionstamps do not represent real time, but rather the
order in which the values were modified.

Because versionstamps are monotonically increasing, they can be used to
determine whether a given value is newer or older than another value. This can
be done by comparing the versionstamps of the two values. If versionstamp A is
greater than versionstamp B, then value A was modified more recently than value
B.

```js
versionstampA > versionstampB;
"000002fa526aaccb0000" > "000002fa526aacc90000"; // true
```

All data modified by a single transaction is assigned the same versionstamp.
This means that if two `set` operations are performed in the same atomic
operation, then the versionstamp of the new values will be the same.

Versionstamps are used to implement optimistic concurrency control. Atomic
operations can contain checks that ensure that the versionstamp of the data they
are operating on matches a versionstamp passed to the operation. If the
versionstamp of the data is not the same as the versionstamp passed to the
operation, then the transaction will fail and the operation will not be applied.
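The check-and-retry pattern that versionstamps enable can be sketched with a toy
in-memory store (an illustration of the idea only, not the `Deno.Kv` API - real
code would use `kv.atomic().check(...)` as shown in the quick start):

```typescript
interface Entry {
  value: number;
  versionstamp: string | null; // null means "no value yet"
}

// Toy store: every successful write assigns a fresh, increasing versionstamp.
class ToyStore {
  private data = new Map<string, { value: number; versionstamp: string }>();
  private counter = 0;

  get(key: string): Entry {
    return this.data.get(key) ?? { value: 0, versionstamp: null };
  }

  // Commit only if the key's versionstamp still matches what the caller read.
  checkAndSet(key: string, expected: string | null, value: number): boolean {
    const current = this.data.get(key)?.versionstamp ?? null;
    if (current !== expected) return false; // a conflicting write happened
    this.counter += 1;
    this.data.set(key, {
      value,
      versionstamp: String(this.counter).padStart(20, "0"),
    });
    return true;
  }
}

// Optimistic update: read, compute, commit; retry if another write won.
function increment(store: ToyStore, key: string): void {
  for (;;) {
    const entry = store.get(key);
    if (store.checkAndSet(key, entry.versionstamp, entry.value + 1)) return;
  }
}
```

In Deno KV, an atomic operation's `check` plays the role of the versionstamp
comparison in `checkAndSet`, and checking against a `null` versionstamp asserts
that the key does not yet exist.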
[structured clone algorithm]: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm

---

# Using KV in Node.js

URL: https://docs.deno.com/deploy/kv/manual/node

Connecting to a Deno KV database in Node.js is supported via our
[official client library on npm](https://www.npmjs.com/package/@deno/kv). You
can find usage instructions for this option below.

## Installation and usage

Use your preferred npm client to install the client library for Node.js using
one of the commands below.

```sh
npm install @deno/kv
```

```sh
pnpm add @deno/kv
```

```sh
yarn add @deno/kv
```

Once you've added the package to your Node project, you can import the `openKv`
function (supports both ESM `import` and CJS `require`-based usage):

```js
import { openKv } from "@deno/kv";

// Connect to a KV instance
const kv = await openKv("<KV Connect URL>");

// Write some data
await kv.set(["users", "alice"], { name: "Alice" });

// Read it back
const result = await kv.get(["users", "alice"]);
console.log(result.value); // { name: "Alice" }
```

By default, the access token used for authentication comes from the
`DENO_KV_ACCESS_TOKEN` environment variable. You can also pass it explicitly:

```js
import { openKv } from "@deno/kv";

const kv = await openKv("<KV Connect URL>", { accessToken: myToken });
```

Once your Deno KV client is initialized, the same API available in Deno may be
used in Node as well.

## KV Connect URLs

Connecting to a KV database outside of Deno requires a
[KV Connect](https://github.com/denoland/denokv/blob/main/proto/kv-connect.md)
URL. A KV Connect URL for a database hosted on Deno Deploy will be in this
format: `https://api.deno.com/databases/<database-id>/connect`. The
`database-id` for your project can be found in the
[Deno Deploy dashboard](https://dash.deno.com/projects), under the project's
"KV" tab.
![Connection string locations in Deploy](./images/kv-connect.png)

## More information

More information about how to use the Deno KV module for Node can be found on
the project's [README page](https://www.npmjs.com/package/@deno/kv).

---

# KV on Deno Deploy

URL: https://docs.deno.com/deploy/kv/manual/on_deploy

Deno Deploy now offers a built-in serverless key-value database called Deno KV.

Additionally, Deno KV is available within Deno itself, utilizing SQLite as its
backend. This feature has been accessible since Deno v1.32 with the `--unstable`
flag. Learn more about [Deno KV](/deploy/kv/manual).

## Consistency

Deno KV, by default, is a strongly-consistent database. It provides the
strictest form of strong consistency called _external consistency_, which
implies:

- **Serializability**: This is the highest level of isolation for transactions.
  It ensures that the concurrent execution of multiple transactions results in a
  system state that would be the same as if the transactions were executed
  sequentially, one after another. In other words, the end result of
  serializable transactions is equivalent to some sequential order of these
  transactions.
- **Linearizability**: This consistency model guarantees that operations, such
  as read and write, appear to be instantaneous and occur in real-time. Once a
  write operation completes, all subsequent read operations will immediately
  return the updated value. Linearizability ensures a strong real-time ordering
  of operations, making the system more predictable and easier to reason about.

Meanwhile, you can choose to relax consistency constraints by setting the
`consistency: "eventual"` option on individual read operations. This option
allows the system to serve the read from global replicas and caches for minimal
latency.
Below are the latency figures observed in our top regions:

| Region                     | Latency (Eventual Consistency) | Latency (Strong Consistency) |
| -------------------------- | ------------------------------ | ---------------------------- |
| North Virginia (us-east4)  | 7ms                            | 7ms                          |
| Frankfurt (europe-west3)   | 7ms                            | 94ms                         |
| Netherlands (europe-west4) | 13ms                           | 95ms                         |
| California (us-west2)      | 72ms                           | 72ms                         |
| Hong Kong (asia-east2)     | 42ms                           | 194ms                        |

## Distributed queues

Serverless distributed queues are available on Deno Deploy. See
[Queues on Deno Deploy](/deploy/kv/manual/queue_overview#queues-on-deno-deploy)
for more details.

## Connect to managed databases from outside of Deno Deploy

You can connect to your Deno Deploy KV database from your Deno application
outside of Deno Deploy. To open a managed database, set the
`DENO_KV_ACCESS_TOKEN` environment variable to a Deno Deploy personal access
token and provide the URL of the database to `Deno.openKv`:

```ts
const kv = await Deno.openKv(
  "https://api.deno.com/databases/<database-id>/connect",
);
```

Please check the
[docs](https://github.com/denoland/deno/tree/main/ext/kv#kv-connect) for the
specification of the protocol for connecting to a remote KV database.

## Data distribution

Deno KV databases are replicated across at least 6 data centers, spanning 3
regions (US, Europe, and Asia). Once a write operation is committed, its
mutations are persistently stored in a minimum of two data centers within the
primary region. Asynchronous replication typically transfers these mutations to
the other two regions in under 10 seconds.

The system is designed to tolerate most data center-level failures without
experiencing downtime or data loss. Recovery Point Objectives (RPO) and Recovery
Time Objectives (RTO) help quantify the system's resilience under various
failure modes.
RPO represents the maximum acceptable amount of data loss measured in time,
whereas RTO signifies the maximum acceptable time required to restore the system
to normal operations after a failure.

- Loss of one data center in the primary region: RPO=0 (no data loss), RTO<5s
  (system restoration in under 5 seconds)
- Loss of any number of data centers in a replica region: RPO=0, RTO<5s
- Loss of two or more data centers in the primary region: RPO<60s (under 60
  seconds of data loss)

---

# Operations

URL: https://docs.deno.com/deploy/kv/manual/operations

The Deno KV API provides a set of operations that can be performed on the key
space.

There are two operations that read data from the store, and five operations that
write data to the store.

Read operations can either be performed in strong or eventual consistency mode.
Strong consistency mode guarantees that the read operation will return the most
recently written value. Eventual consistency mode may return a stale value, but
is faster. Write operations are always performed in strong consistency mode.

## `get`

The `get` operation returns the value and versionstamp associated with a given
key. If a value does not exist, get returns a `null` value and versionstamp.

There are two APIs that can be used to perform a `get` operation. The
[`Deno.Kv.prototype.get(key, options?)`][get] API, which can be used to read a
single key, and the [`Deno.Kv.prototype.getMany(keys, options?)`][getMany] API,
which can be used to read multiple keys at once.

Get operations are performed as a "snapshot read" in all consistency modes. This
means that when retrieving multiple keys at once, the values returned will be
consistent with each other.
```ts
const res = await kv.get(["config"]);
console.log(res); // { key: ["config"], value: "value", versionstamp: "000002fa526aaccb0000" }

const resEventual = await kv.get(["config"], { consistency: "eventual" });
console.log(resEventual); // { key: ["config"], value: "value", versionstamp: "000002fa526aaccb0000" }

const [res1, res2, res3] = await kv.getMany<[string, string, string]>([
  ["users", "sam"],
  ["users", "taylor"],
  ["users", "alex"],
]);
console.log(res1); // { key: ["users", "sam"], value: "sam", versionstamp: "00e0a2a0f0178b270000" }
console.log(res2); // { key: ["users", "taylor"], value: "taylor", versionstamp: "0059e9035e5e7c5e0000" }
console.log(res3); // { key: ["users", "alex"], value: "alex", versionstamp: "00a44a3c3e53b9750000" }
```

## `list`

The `list` operation returns a list of keys that match a given selector. The
associated values and versionstamps for these keys are also returned.

There are two different selectors that can be used to filter the keys matched.

The `prefix` selector matches all keys that start with the given prefix key
parts, but does not include an exact match of the prefix key itself. The prefix
selector may optionally be given a `start` OR `end` key to limit the range of
keys returned. The `start` key is inclusive, and the `end` key is exclusive.

The `range` selector matches all keys that are lexicographically between the
given `start` and `end` keys. The `start` key is inclusive, and the `end` key is
exclusive.

> Note: In the case of the prefix selector, the `prefix` key must consist only
> of full (not partial) key parts. For example, if the key `["foo", "bar"]`
> exists in the store, then the prefix selector `["foo"]` will match it, but the
> prefix selector `["f"]` will not.

The list operation may optionally be given a `limit` to limit the number of keys
returned.

List operations can be performed using the
[`Deno.Kv.prototype.list(selector, options?)`][list] method.
This method returns a `Deno.KvListIterator` that can be used to iterate over the
keys returned. This is an async iterator, and can be used with `for await`
loops.

```ts
// Return all users
const iter = kv.list({ prefix: ["users"] });
const users = [];
for await (const res of iter) users.push(res);

console.log(users[0]); // { key: ["users", "alex"], value: "alex", versionstamp: "00a44a3c3e53b9750000" }
console.log(users[1]); // { key: ["users", "sam"], value: "sam", versionstamp: "00e0a2a0f0178b270000" }
console.log(users[2]); // { key: ["users", "taylor"], value: "taylor", versionstamp: "0059e9035e5e7c5e0000" }
```

```ts
// Return the first 2 users
const iter = kv.list({ prefix: ["users"] }, { limit: 2 });
const users = [];
for await (const res of iter) users.push(res);

console.log(users[0]); // { key: ["users", "alex"], value: "alex", versionstamp: "00a44a3c3e53b9750000" }
console.log(users[1]); // { key: ["users", "sam"], value: "sam", versionstamp: "00e0a2a0f0178b270000" }
```

```ts
// Return all users lexicographically after "taylor"
const iter = kv.list({ prefix: ["users"], start: ["users", "taylor"] });
const users = [];
for await (const res of iter) users.push(res);

console.log(users[0]); // { key: ["users", "taylor"], value: "taylor", versionstamp: "0059e9035e5e7c5e0000" }
```

```ts
// Return all users lexicographically before "taylor"
const iter = kv.list({ prefix: ["users"], end: ["users", "taylor"] });
const users = [];
for await (const res of iter) users.push(res);

console.log(users[0]); // { key: ["users", "alex"], value: "alex", versionstamp: "00a44a3c3e53b9750000" }
console.log(users[1]); // { key: ["users", "sam"], value: "sam", versionstamp: "00e0a2a0f0178b270000" }
```

```ts
// Return all users starting with characters between "a" and "n"
const iter = kv.list({ start: ["users", "a"], end: ["users", "n"] });
const users = [];
for await (const res of iter) users.push(res);

console.log(users[0]); // { key: ["users", "alex"], value: "alex", versionstamp: "00a44a3c3e53b9750000" }
```

The
list operation reads data from the store in batches. The size of each batch can be controlled using the `batchSize` option. The default batch size is 500 keys. Data within a batch is read in a single snapshot read, so the values are consistent with each other. Consistency modes apply to each batch of data read. Across batches, data is not consistent. The borders between batches is not visible from the API as the iterator returns individual keys. The list operation can be performed in reverse order by setting the `reverse` option to `true`. This will return the keys in lexicographically descending order. The `start` and `end` keys are still inclusive and exclusive respectively, and are still interpreted as lexicographically ascending. ```ts // Return all users in reverse order, ending with "sam" const iter = kv.list({ prefix: ["users"], start: ["users", "sam"] }, { reverse: true, }); const users = []; for await (const res of iter) users.push(res); console.log(users[0]); // { key: ["users", "taylor"], value: "taylor", versionstamp: "0059e9035e5e7c5e0000" } console.log(users[1]); // { key: ["users", "sam"], value: "sam", versionstamp: "00e0a2a0f0178b270000" } ``` > Note: in the above example we set the `start` key to `["users", "sam"]`, even > though the first key returned is `["users", "taylor"]`. This is because the > `start` and `end` keys are always evaluated in lexicographically ascending > order, even when the list operation is performed in reverse order (which > returns the keys in lexicographically descending order). ## `set` The `set` operation sets the value of a key in the store. If the key does not exist, it is created. If the key already exists, its value is overwritten. The `set` operation can be performed using the [`Deno.Kv.prototype.set(key, value)`][set] method. This method returns a `Promise` that resolves to a `Deno.KvCommitResult` object, which contains the `versionstamp` of the commit. 
Set operations are always performed in strong consistency mode. ```ts const res = await kv.set(["users", "alex"], "alex"); console.log(res.versionstamp); // "00a44a3c3e53b9750000" ``` ## `delete` The `delete` operation deletes a key from the store. If the key does not exist, the operation is a no-op. The `delete` operation can be performed using the [`Deno.Kv.prototype.delete(key)`][delete] method. Delete operations are always performed in strong consistency mode. ```ts await kv.delete(["users", "alex"]); ``` ## `sum` The `sum` operation atomically adds a value to a key in the store. If the key does not exist, it is created with the value of the sum. If the key already exists, its value is added to the sum. The `sum` operation can only be performed as part of an atomic operation. The [`Deno.AtomicOperation.prototype.mutate({ type: "sum", value })`][mutate] method can be used to add a sum mutation to an atomic operation. The sum operation can only be performed on values of type `Deno.KvU64`. Both the operand and the value in the store must be of type `Deno.KvU64`. If the new value of the key is greater than `2^64 - 1` or less than `0`, the sum operation wraps around. For example, if the value in the store is `2^64 - 1` and the operand is `1`, the new value will be `0`. Sum operations are always performed in strong consistency mode. ```ts await kv.atomic() .mutate({ type: "sum", key: ["accounts", "alex"], value: new Deno.KvU64(100n), }) .commit(); ``` ## `min` The `min` operation atomically sets a key to the minimum of its current value and a given value. If the key does not exist, it is created with the given value. If the key already exists, its value is set to the minimum of its current value and the given value. The `min` operation can only be performed as part of an atomic operation. The [`Deno.AtomicOperation.prototype.mutate({ type: "min", value })`][mutate] method can be used to add a min mutation to an atomic operation. 
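The wraparound arithmetic described for the `sum` operation can be modeled with plain `BigInt` math. This is a sketch of the semantics only, not Deno's implementation:

```typescript
// Sketch: u64 wraparound as applied by the `sum` mutation.
// Values are reduced modulo 2^64, so overflow wraps around to 0.
const U64_MOD = 1n << 64n;

function wrappingSumU64(current: bigint, operand: bigint): bigint {
  return (current + operand) % U64_MOD;
}

console.log(wrappingSumU64(2n ** 64n - 1n, 1n)); // 0n, as in the example above
console.log(wrappingSumU64(10n, 100n)); // 110n
```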
The min operation can only be performed on values of type `Deno.KvU64`. Both the operand and the value in the store must be of type `Deno.KvU64`. Min operations are always performed in strong consistency mode. ```ts await kv.atomic() .mutate({ type: "min", key: ["accounts", "alex"], value: new Deno.KvU64(100n), }) .commit(); ``` ## `max` The `max` operation atomically sets a key to the maximum of its current value and a given value. If the key does not exist, it is created with the given value. If the key already exists, its value is set to the maximum of its current value and the given value. The `max` operation can only be performed as part of an atomic operation. The [`Deno.AtomicOperation.prototype.mutate({ type: "max", value })`][mutate] method can be used to add a max mutation to an atomic operation. The max operation can only be performed on values of type `Deno.KvU64`. Both the operand and the value in the store must be of type `Deno.KvU64`. Max operations are always performed in strong consistency mode. ```ts await kv.atomic() .mutate({ type: "max", key: ["accounts", "alex"], value: new Deno.KvU64(100n), }) .commit(); ``` ## `watch` The `watch` operation accepts an array of keys, and returns a [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream), which emits a new value whenever any of the watched keys change their `versionstamp`. The emitted value is an array of [Deno.KvEntryMaybe](https://docs.deno.com/api/deno/~/Deno.KvEntryMaybe) objects. Note that the returned stream does not return every single intermediate state of the watched keys, but keeps you up to date with the latest state of keys. This means if a key is modified multiple times quickly, you may not receive a notification for every change, but the latest state of the key. 
```ts const db = await Deno.openKv(); const stream = db.watch([["foo"], ["bar"]]); for await (const entries of stream) { entries[0].key; // ["foo"] entries[0].value; // "bar" entries[0].versionstamp; // "00000000000000010000" entries[1].key; // ["bar"] entries[1].value; // null entries[1].versionstamp; // null } ``` [get]: https://docs.deno.com/api/deno/~/Deno.Kv.prototype.get [getMany]: https://docs.deno.com/api/deno/~/Deno.Kv.prototype.getMany [list]: https://docs.deno.com/api/deno/~/Deno.Kv.prototype.list [set]: https://docs.deno.com/api/deno/~/Deno.Kv.prototype.set [delete]: https://docs.deno.com/api/deno/~/Deno.Kv.prototype.delete [mutate]: https://docs.deno.com/api/deno/~/Deno.AtomicOperation.prototype.mutate --- # Using Queues URL: https://docs.deno.com/deploy/kv/manual/queue_overview The Deno runtime includes a queueing API that supports offloading larger workloads for async processing, with guaranteed at-least-once delivery of queued messages. Queues can be used to offload tasks in a web application, or to schedule units of work for a time in the future. The primary APIs you'll use with queues are in the `Deno.Kv` namespace as [`enqueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.enqueue) and [`listenQueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.listenQueue). ## Enqueue a message To enqueue a message for processing, use the `enqueue` method on an instance of [`Deno.Kv`](https://docs.deno.com/api/deno/~/Deno.Kv). In the example below, we show what it might look like to enqueue a notification for delivery. 
```ts title="queue_example.ts" // Describe the shape of your message object (optional) interface Notification { forUser: string; body: string; } // Get a reference to a KV instance const kv = await Deno.openKv(); // Create a notification object const message: Notification = { forUser: "alovelace", body: "You've got mail!", }; // Enqueue the message for immediate delivery await kv.enqueue(message); ``` You can enqueue a message for later delivery by specifying a `delay` option in milliseconds. ```ts // Enqueue the message for delivery in 3 days const delay = 1000 * 60 * 60 * 24 * 3; await kv.enqueue(message, { delay }); ``` You can also specify a key in Deno KV where your message value will be stored if your message isn't delivered for any reason. ```ts // Configure a key where a failed message would be sent const backupKey = ["failed_notifications", "alovelace", Date.now()]; await kv.enqueue(message, { keysIfUndelivered: [backupKey] }); // ... disaster strikes ... // Get the unsent message const r = await kv.get(backupKey); // This is the message that didn't get sent: console.log("Found failed notification for:", r.value?.forUser); ``` ## Listening for messages You can configure a JavaScript function that will process items added to your queue with the `listenQueue` method on an instance of [`Deno.Kv`](https://docs.deno.com/api/deno/~/Deno.Kv). 
```ts title="listen_example.ts"
// Define the shape of the object we expect as a message in the queue
interface Notification {
  forUser: string;
  body: string;
}

// Create a type guard to check the type of the incoming message
function isNotification(o: unknown): o is Notification {
  return (
    ((o as Notification)?.forUser !== undefined &&
      typeof (o as Notification).forUser === "string") &&
    ((o as Notification)?.body !== undefined &&
      typeof (o as Notification).body === "string")
  );
}

// Get a reference to a KV database
const kv = await Deno.openKv();

// Register a handler function to listen for values - this example shows
// how you might send a notification
kv.listenQueue((msg: unknown) => {
  // Use type guard - then TypeScript compiler knows msg is a Notification
  if (isNotification(msg)) {
    console.log("Sending notification to user:", msg.forUser);
    // ... do something to actually send the notification!
  } else {
    // If the message is of an unknown type, it might be an error
    console.error("Unknown message received:", msg);
  }
});
```

## Queue API with KV atomic transactions

You can combine the queue API with [KV atomic transactions](./transactions) to atomically enqueue messages and modify keys in the same transaction.

```ts title="kv_transaction_example.ts"
const kv = await Deno.openKv();

kv.listenQueue(async (msg: unknown) => {
  const { nonce, change } = msg as { nonce: string; change: number };
  const nonceEntry = await kv.get(["nonces", nonce]);
  if (nonceEntry.value === null) {
    // This message was already processed
    return;
  }

  const bob = await kv.get<number>(["balance", "bob"]);
  const liz = await kv.get<number>(["balance", "liz"]);

  const success = await kv.atomic()
    // Ensure this message was not yet processed
    .check({ key: nonceEntry.key, versionstamp: nonceEntry.versionstamp })
    .delete(nonceEntry.key)
    .sum(["processed_count"], 1n)
    .check(bob, liz) // balances did not change
    .set(["balance", "bob"], bob.value! - change)
    .set(["balance", "liz"], liz.value! + change)
    .commit();
});

// Modify keys and enqueue messages in the same KV transaction!
const nonce = crypto.randomUUID();
await kv
  .atomic()
  .check({ key: ["nonces", nonce], versionstamp: null })
  .enqueue({ nonce: nonce, change: 10 })
  .set(["nonces", nonce], true)
  .sum(["enqueued_count"], 1n)
  .commit();
```

## Queue behavior

### Message delivery guarantees

The runtime guarantees at-least-once delivery. This means that, for the majority of enqueued messages, the [`listenQueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.listenQueue) handler will be invoked once for each message. In some failure scenarios, the handler may be invoked multiple times for the same message to ensure delivery. It's important to design your applications such that duplicate messages are handled correctly.

You may use queues in combination with [KV atomic transactions](https://docs.deno.com/deploy/kv/manual/transactions) primitives to ensure that your queue handler's KV updates are performed exactly once per message. See [Queue API with KV atomic transactions](#queue-api-with-kv-atomic-transactions).

### Automatic retries

The [`listenQueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.listenQueue) handler is invoked to process your queued messages when they're ready for delivery. If your handler throws an exception, the runtime will automatically retry calling the handler until it succeeds or until the maximum number of retry attempts is reached. The message is considered successfully processed once the [`listenQueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.listenQueue) handler invocation completes successfully. The message will be dropped if the handler consistently fails on retries.

### Message delivery order

The runtime makes a best effort to deliver messages in the order they were enqueued. However, there is no strict ordering guarantee. Occasionally, messages may be delivered out of order to ensure maximum throughput.
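Because delivery is at-least-once, a handler should treat a repeated delivery of the same message as a no-op. The sketch below illustrates the idea with an in-memory `Set` standing in for the durable KV nonce check shown earlier; the `id` field is a hypothetical message identifier, not part of the queue API:

```typescript
// Sketch: deduplicating at-least-once deliveries by message id.
interface QueueMessage {
  id: string; // hypothetical unique identifier carried in the message
  payload: string;
}

const processed = new Set<string>();
const results: string[] = [];

function handleMessage(msg: QueueMessage) {
  if (processed.has(msg.id)) return; // duplicate delivery: ignore
  processed.add(msg.id);
  results.push(msg.payload); // the actual side effect happens once
}

// The runtime may invoke the handler twice for the same message.
handleMessage({ id: "a1", payload: "send email" });
handleMessage({ id: "a1", payload: "send email" });
console.log(results.length); // 1 — the duplicate was ignored
```

In production the dedup record must live in durable storage (e.g. a KV key checked and deleted in an atomic operation), since an in-memory set does not survive isolate restarts.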
## Queues on Deno Deploy

Deno Deploy offers a global, serverless, distributed implementation of the queueing API, designed for high availability and throughput. You can use it to build applications that scale to handle large workloads.

### Just-in-time isolate spin-up

When using queues with Deno Deploy, isolates are automatically spun up on demand to invoke your [`listenQueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.listenQueue) handler when a message becomes available for processing. Defining a [`listenQueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.listenQueue) handler is the only requirement to enable queue processing in your Deno Deploy application; no additional configuration is needed.

### Queue size limit

The maximum number of undelivered queue messages is limited to 100,000. The [`enqueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.enqueue) method will fail with an error if the queue is full.

### Pricing details and limits

- [`enqueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.enqueue) is treated just like other [`Deno.Kv`](https://docs.deno.com/api/deno/~/Deno.Kv) write operations. Enqueued messages consume KV storage and write units.
- Messages delivered through [`listenQueue`](https://docs.deno.com/api/deno/~/Deno.Kv.prototype.listenQueue) consume requests and KV write units.
- See [Pricing details](https://deno.com/deploy/pricing) for more information.

## Use cases

Queues can be useful in many different scenarios, but there are a few use cases you might see a lot when building web applications.

### Offloading async processes

Sometimes a task that's initiated by a client (like sending a notification or API request) may take long enough that you don't want to make clients wait for that task to be completed before returning a response.
Other times, clients don't actually need a response at all, such as when a client is sending your application a [webhook request](https://en.wikipedia.org/wiki/Webhook), so there's no need to wait for the underlying task to be completed before returning a response.

In these cases, you can offload work to a queue to keep your web application responsive and send immediate feedback to clients. To see an example of this use case in action, check out our [webhook processing example](../tutorials/webhook_processor.md).

### Scheduling work for the future

Another helpful application of queues (and queue APIs like this one) is to schedule work to happen at an appropriate time in the future. For example, maybe you'd like to send a satisfaction survey to a new customer a day after they have placed an order. You can schedule a queue message to be delivered 24 hours into the future, and set up a listener to send out the notification at that time.

To see an example of scheduling a notification to go out in the future, check out our [notification example](../tutorials/schedule_notification.md).

---

# Secondary Indexes

URL: https://docs.deno.com/deploy/kv/manual/secondary_indexes

Key-value stores like Deno KV organize data as collections of key-value pairs, where each unique key is associated with a single value. This structure enables easy retrieval of values based on their keys but does not allow for querying based on the values themselves. To overcome this constraint, you can create secondary indexes, which store the same value under additional keys that include (part of) that value.

Maintaining consistency between primary and secondary keys is crucial when using secondary indexes. If a value is updated at the primary key without updating the secondary key, the data returned from a query targeting the secondary key will be incorrect.
To ensure that primary and secondary keys always represent the same data, use atomic operations when inserting, updating, or deleting data. This approach ensures that the group of mutation actions is executed as a single unit, and either all succeed or all fail, preventing inconsistencies.

## Unique indexes (one-to-one)

Unique indexes have each key in the index associated with exactly one primary key. For example, when storing user data and looking up users by both their unique IDs and email addresses, store user data under two separate keys: one for the primary key (user ID) and another for the secondary index (email). This setup allows querying users based on either their ID or their email. The secondary index can also enforce uniqueness constraints on values in the store. In the case of user data, use the index to ensure that each email address is associated with only one user - in other words, that emails are unique.

To implement a unique secondary index for this example, follow these steps:

1. Create a `User` interface representing the data:

   ```ts
   interface User {
     id: string;
     name: string;
     email: string;
   }
   ```

2. Define an `insertUser` function that stores user data at both the primary and secondary keys:

   ```ts
   async function insertUser(user: User) {
     const primaryKey = ["users", user.id];
     const byEmailKey = ["users_by_email", user.email];
     const res = await kv.atomic()
       .check({ key: primaryKey, versionstamp: null })
       .check({ key: byEmailKey, versionstamp: null })
       .set(primaryKey, user)
       .set(byEmailKey, user)
       .commit();
     if (!res.ok) {
       throw new TypeError("User with ID or email already exists");
     }
   }
   ```

   > This function performs the insert using an atomic operation that checks
   > that no user with the same ID or email already exists. If either of these
   > constraints is violated, the insert fails and no data is modified.

3. Define a `getUser` function to retrieve a user by their ID:

   ```ts
   async function getUser(id: string): Promise<User | null> {
     const res = await kv.get<User>(["users", id]);
     return res.value;
   }
   ```

4. Define a `getUserByEmail` function to retrieve a user by their email address:

   ```ts
   async function getUserByEmail(email: string): Promise<User | null> {
     const res = await kv.get<User>(["users_by_email", email]);
     return res.value;
   }
   ```

   This function queries the store using the secondary key (`["users_by_email", email]`).

5. Define a `deleteUser` function to delete users by their ID:

   ```ts
   async function deleteUser(id: string) {
     let res = { ok: false };
     while (!res.ok) {
       const getRes = await kv.get<User>(["users", id]);
       if (getRes.value === null) return;
       res = await kv.atomic()
         .check(getRes)
         .delete(["users", id])
         .delete(["users_by_email", getRes.value.email])
         .commit();
     }
   }
   ```

   > This function first retrieves the user by their ID to get the user's
   > email address, which is needed to construct the key for the user's
   > secondary index entry. It then performs an atomic operation that checks
   > that the user in the database has not changed, and then deletes both the
   > primary and secondary key pointing to the user value. If this fails (the
   > user has been modified between query and delete), the atomic operation
   > aborts. The entire procedure is retried until the delete succeeds. The
   > check is required to prevent race conditions where the value may have
   > been modified between the retrieve and the delete. This race can occur if
   > an update changes the user's email, because the secondary index moves in
   > this case. The delete of the secondary index then fails, because the
   > delete is targeting the old secondary index key.

## Non-Unique Indexes (One-to-Many)

Non-unique indexes are secondary indexes where a single key can be associated with multiple primary keys, allowing you to query for multiple items based on a shared attribute.
For example, when querying users by their favorite color, implement this using a non-unique secondary index. The favorite color is a non-unique attribute since multiple users can have the same favorite color.

To implement a non-unique secondary index for this example, follow these steps:

1. Define the `User` interface:

   ```ts
   interface User {
     id: string;
     name: string;
     favoriteColor: string;
   }
   ```

2. Define the `insertUser` function:

   ```ts
   async function insertUser(user: User) {
     const primaryKey = ["users", user.id];
     const byColorKey = [
       "users_by_favorite_color",
       user.favoriteColor,
       user.id,
     ];
     await kv.atomic()
       .check({ key: primaryKey, versionstamp: null })
       .set(primaryKey, user)
       .set(byColorKey, user)
       .commit();
   }
   ```

3. Define a function to retrieve users by their favorite color:

   ```ts
   async function getUsersByFavoriteColor(color: string): Promise<User[]> {
     const iter = kv.list<User>({ prefix: ["users_by_favorite_color", color] });
     const users = [];
     for await (const { value } of iter) {
       users.push(value);
     }
     return users;
   }
   ```

This example demonstrates the use of a non-unique secondary index, `users_by_favorite_color`, which allows querying users based on their favorite color. The primary key remains the user `id`.

The primary difference between the implementation of unique and non-unique indexes lies in the structure and organization of the secondary keys. In unique indexes, each secondary key is associated with exactly one primary key, ensuring that the indexed attribute is unique across all records. In the case of non-unique indexes, a single secondary key can be associated with multiple primary keys, as the indexed attribute may be shared among multiple records. To achieve this, non-unique secondary keys are typically structured with an additional unique identifier (e.g., the primary key) as part of the key, allowing multiple records with the same attribute to coexist without conflicts.
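The role of that trailing unique identifier can be illustrated with a small in-memory sketch. The `Map` below stands in for the KV store, and serializing key parts as JSON is for illustration only, not how Deno KV encodes keys:

```typescript
// Sketch: composite secondary keys in a non-unique index.
type KeyPart = string;

const store = new Map<string, unknown>();

function kvSet(key: KeyPart[], value: unknown) {
  store.set(JSON.stringify(key), value);
}

function kvListByPrefix(prefix: KeyPart[]): unknown[] {
  const out: unknown[] = [];
  for (const [k, v] of store) {
    const parts = JSON.parse(k) as KeyPart[];
    if (prefix.every((p, i) => parts[i] === p)) out.push(v);
  }
  return out;
}

// Two users share a favorite color; the trailing user id keeps the
// secondary keys distinct, so neither entry overwrites the other.
kvSet(["users_by_favorite_color", "blue", "u1"], { id: "u1", name: "Alex" });
kvSet(["users_by_favorite_color", "blue", "u2"], { id: "u2", name: "Sam" });

console.log(kvListByPrefix(["users_by_favorite_color", "blue"]).length); // 2
```

Without the trailing id, the second `kvSet` would overwrite the first, which is exactly the collision the composite key structure prevents.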
---

# Transactions

URL: https://docs.deno.com/deploy/kv/manual/transactions

The Deno KV store utilizes _optimistic concurrency control (OCC) transactions_ rather than _interactive transactions_ as found in SQL systems like PostgreSQL or MySQL. This approach employs versionstamps, which represent the current version of a value for a given key, to manage concurrent access to shared resources without using locks. When a read operation occurs, the system returns a versionstamp for the associated key in addition to the value.

To execute a transaction, one performs an atomic operation that can consist of multiple mutation actions (like set or delete). Along with these actions, key+versionstamp pairs are provided as a condition for the transaction's success. The optimistic concurrency control transaction will only commit if the specified versionstamps match the current version for the values in the database for the corresponding keys. This transaction model ensures data consistency and integrity while allowing concurrent interactions within the Deno KV store.

Because OCC transactions are optimistic, they can fail on commit because the version constraints specified in the atomic operation were violated. This occurs when another agent updates a key used within the transaction between read and commit. When this happens, the agent performing the transaction must retry the transaction.

To illustrate how to use OCC transactions with Deno KV, this example shows how to implement a `transferFunds(from: string, to: string, amount: number)` function for an account ledger. The account ledger stores the balance for each account in the key-value store. The keys are prefixed by `"account"`, followed by the account identifier: `["account", "alice"]`. The value stored for each key is a number that represents the account balance.
Here's a step-by-step example of implementing this `transferFunds` function:

```ts
async function transferFunds(sender: string, receiver: string, amount: number) {
  if (amount <= 0) throw new Error("Amount must be positive");

  // Construct the KV keys for the sender and receiver accounts.
  const senderKey = ["account", sender];
  const receiverKey = ["account", receiver];

  // Retry the transaction until it succeeds.
  let res = { ok: false };
  while (!res.ok) {
    // Read the current balance of both accounts.
    const [senderRes, receiverRes] = await kv.getMany<[number, number]>([
      senderKey,
      receiverKey,
    ]);
    if (senderRes.value === null) {
      throw new Error(`Account ${sender} not found`);
    }
    if (receiverRes.value === null) {
      throw new Error(`Account ${receiver} not found`);
    }

    const senderBalance = senderRes.value;
    const receiverBalance = receiverRes.value;

    // Ensure the sender has a sufficient balance to complete the transfer.
    if (senderBalance < amount) {
      throw new Error(
        `Insufficient funds to transfer ${amount} from ${sender}`,
      );
    }

    // Perform the transfer.
    const newSenderBalance = senderBalance - amount;
    const newReceiverBalance = receiverBalance + amount;

    // Attempt to commit the transaction. `res` returns an object with
    // `ok: false` if the transaction fails to commit due to a check failure
    // (i.e. the versionstamp for a key has changed)
    res = await kv.atomic()
      .check(senderRes) // Ensure the sender's balance hasn't changed.
      .check(receiverRes) // Ensure the receiver's balance hasn't changed.
      .set(senderKey, newSenderBalance) // Update the sender's balance.
      .set(receiverKey, newReceiverBalance) // Update the receiver's balance.
      .commit();
  }
}
```

In this example, the `transferFunds` function reads the balances and versionstamps of both accounts, checks that there are sufficient funds in the sender's account, and calculates the new balances after the transfer. It then performs an atomic operation, setting the new balances with the versionstamp constraints. If the transaction is successful, the loop exits.
If the version constraints are violated, the transaction fails, and the loop retries the transaction until it succeeds.

## Limits

In addition to a max key size of 2 KiB and max value size of 64 KiB, there are certain limits with the Deno KV transaction API:

- **Max keys per `kv.getMany()`**: 10
- **Max batch size per `kv.list()`**: 1000
- **Max checks in an atomic operation**: 100
- **Max mutations in an atomic operation**: 1000
- **Max total size of an atomic operation**: 800 KiB. This includes all keys and values in checks and mutations, and encoding overhead counts toward this limit as well.
- **Max total size of keys**: 90 KiB. This includes all keys in checks and mutations, and encoding overhead counts toward this limit as well.
- **Max watched keys per `kv.watch()`**: 10

---

# Deno KV Tutorials & Examples

URL: https://docs.deno.com/deploy/kv/tutorials/

Check out these examples showing real-world usage of Deno KV.

## Use queues to process incoming webhooks

Follow [this tutorial](./webhook_processor.md) to learn how to use queues to offload tasks to a background process, so your web app can remain responsive. This example shows how to enqueue tasks that handle incoming webhook requests from [GitHub](https://www.github.com).

## Use queues to schedule a future notification

Follow [this tutorial](./schedule_notification.md) to learn how to schedule code to execute at some time in the future using queues. This example shows how to schedule a notification with [Courier](https://www.courier.com/).

## CRUD in Deno KV - TODO List

- Zod schema validation
- Built using Fresh
- Real-time collaboration using BroadcastChannel
- [Source code](https://github.com/denoland/showcase_todo)
- [Live preview](https://showcase-todo.deno.dev/)

## Deno SaaSKit

- Modern SaaS template built on Fresh.
- [Product Hunt](https://www.producthunt.com/)-like template entirely built on KV.
- Uses Deno KV OAuth for GitHub OAuth 2.0 authentication
- Use it to launch your next app project faster
- [Source code](https://github.com/denoland/saaskit)
- [Live preview](https://hunt.deno.land/)

## Multi-player Tic-Tac-Toe

- GitHub authentication
- Saved user state
- Real-time sync using BroadcastChannel
- [Source code](https://github.com/denoland/tic-tac-toe)
- [Live preview](https://tic-tac-toe-game.deno.dev/)

## Multi-user pixel art drawing

- Persistent canvas state
- Multi-user collaboration
- Real-time sync using BroadcastChannel
- [Source code](https://github.com/denoland/pixelpage)
- [Live preview](https://pixelpage.deno.dev/)

## GitHub authentication and KV

- Stores drawings in KV
- GitHub authentication
- [Source code](https://github.com/hashrock/kv-sketchbook)
- [Live preview](https://hashrock-kv-sketchbook.deno.dev/)

## Deno KV OAuth 2.0

- High-level OAuth 2.0 powered by Deno KV
- [Source code](https://github.com/denoland/deno_kv_oauth)
- [Live preview](https://kv-oauth.deno.dev/)

---

# Schedule a notification for a future date

URL: https://docs.deno.com/deploy/kv/tutorials/schedule_notification

A common use case for [queues](../manual/queue_overview.md) is scheduling work to be completed at some point in the future. To help demonstrate how this works, we've provided a sample application (described below) that schedules notification messages sent through the [Courier API](https://www.courier.com/). The application runs on [Deno Deploy](https://deno.com/deploy), using the built-in KV and queue API implementations available there with zero configuration.

## Download and configure the sample

⬇️ [**Download or clone the complete sample app here**](https://github.com/kwhinnery/deno_courier_example).

You can run and deploy this sample application yourself using the instructions in the GitHub repo's [`README` file](https://github.com/kwhinnery/deno_courier_example).
To run the example app above, you'll also need to [sign up for Courier](https://app.courier.com/signup). Of course, the techniques you'll see in the application would apply just as easily to any notification service, from [Amazon SNS](https://aws.amazon.com/sns/) to [Twilio](https://www.twilio.com), but Courier provides an easy-to-use notification API that you can use with a personal Gmail account for testing (in addition to all the other neat things it can do).

## Key functionality

After setting up and running the project, we'd like to direct your attention to a few key parts of the code that implement the scheduling mechanics.

### Connecting to KV and adding a listener on app start

Most of the example app's functionality lives in [server.tsx](https://github.com/kwhinnery/deno_courier_example/blob/main/server.tsx) in the top-level directory. When the Deno app process starts, it creates a connection to a Deno KV instance and attaches an event handler which will process messages as they are received from the queue.

```ts title="server.tsx"
// Create a Deno KV database reference
const kv = await Deno.openKv();

// Create a queue listener that will process enqueued messages
kv.listenQueue(async (message) => {
  /* ... implementation of listener here ... */
});
```

### Creating and scheduling a notification

After a new order is submitted through the form in this demo application, the `enqueue` function is called with a delay of five seconds before a notification email is sent out.

```ts title="server.tsx"
app.post("/order", async (c) => {
  const { email, order } = await c.req.parseBody();
  const n: Notification = {
    email: email as string,
    body: `Order received for: "${order as string}"`,
  };

  // Select a time in the future - for now, just wait 5 seconds
  const delay = 1000 * 5;

  // Enqueue the message for processing!
  kv.enqueue(n, { delay });

  // Redirect back home with a success message!
setCookie(c, "flash_message", "Order created!"); return c.redirect("/"); }); ``` ### Defining the notification data type in TypeScript Often, it is desirable to work with strongly typed objects when pushing data into or out of the queue. While queue messages are an [`unknown`](https://www.typescriptlang.org/docs/handbook/2/functions.html#unknown) TypeScript type initially, we can use [type guards](https://www.typescriptlang.org/docs/handbook/2/narrowing.html) to tell the compiler the shape of the data we expect. Here's the source code for the [notification module](https://github.com/kwhinnery/deno_courier_example/blob/main/notification.ts), which we use to describe the properties of a notification in our system. ```ts title="notification.ts" // Shape of a notification object export default interface Notification { email: string; body: string; } // Type guard for a notification object export function isNotification(o: unknown): o is Notification { return ( ((o as Notification)?.email !== undefined && typeof (o as Notification).email === "string") && ((o as Notification)?.body !== undefined && typeof (o as Notification).body === "string") ); } ``` In `server.tsx`, we use the exported type guard to ensure we are responding to the right message types. ```ts title="server.tsx" kv.listenQueue(async (message) => { // Use type guard to short circuit early if the message is of the wrong type if (!isNotification(message)) return; // Grab the relevant data from the message, which TypeScript now knows // is a Notification interface const { email, body } = message; // Create an email notification with Courier // ... }); ``` ### Sending a Courier API request To send an email as scheduled, we use the Courier REST API. More information about the Courier REST API can be found in [their reference docs](https://www.courier.com/docs/reference/send/message/). 
```ts title="server.tsx" const response = await fetch("https://api.courier.com/send", { method: "POST", headers: { Authorization: `Bearer ${COURIER_API_TOKEN}`, }, body: JSON.stringify({ message: { to: { email }, content: { title: "New order placed by Deno!", body: "notification body goes here", }, }, }), }); ``` --- # Offload webhook processing to a queue URL: https://docs.deno.com/deploy/kv/tutorials/webhook_processor In a web application, it is often desirable to offload to a queue the processing of async tasks for which a client doesn't need an immediate response. Doing so can keep your web app fast and responsive, instead of taking up valuable resources waiting for long-running processes to complete. One instance where you might want to deploy this technique is when [handling webhooks](https://en.wikipedia.org/wiki/Webhook). Immediately upon receiving the webhook request from a non-human client that doesn't need a response, you can offload that work to a queue where it can be handled more efficiently. In this tutorial, we'll show you how to execute this technique when [handling webhook requests for a GitHub repo](https://docs.github.com/en/webhooks/about-webhooks-for-repositories). ## Try in a playground ✏️ [**Check out this playground, which implements a GitHub repo webhook handler**](https://dash.deno.com/playground/github-webhook-example). Using Deno Deploy [playgrounds](/deploy/manual/playgrounds), you can instantly deploy your own GitHub webhook handler that uses both queues and Deno KV. We'll walk through what this code does in a moment. ## Configuring GitHub webhooks for a repository To try out the webhook you just launched in a playground, set up a new webhook configuration for a GitHub repository you control. You can find webhook configuration under "Settings" for your repository.
![configure a github webhook](./images/github_webhook.png) ## Code walkthrough Our webhook handler function is relatively simple - without comments, it's only 23 lines of code total. It connects to a Deno KV database, sets up a queue listener to process incoming messages, and sets up a simple server with [`Deno.serve`](https://docs.deno.com/api/deno/~/Deno.serve) which responds to incoming webhook requests. Read along with the comments below to see what's happening at each step. ```ts title="server.ts" // Get a handle for a Deno KV database instance. KV is built in to the Deno // runtime, and is available with zero config both locally and on Deno Deploy const kv = await Deno.openKv(); // Set up a listener that will handle work that is offloaded from our server. // In this case, it's just going to add incoming webhook payloads to a KV // database, with a timestamp. kv.listenQueue(async (message) => { await kv.set(["github", Date.now()], message); }); // This is a simple HTTP server that will handle incoming POST requests from // GitHub webhooks. Deno.serve(async (req: Request) => { if (req.method === "POST") { // GitHub sends webhook requests as POST requests to your server. You can // configure GitHub to send JSON in the POST body, which you can then parse // from the request object. const payload = await req.json(); await kv.enqueue(payload); return new Response("", { status: 200 }); } else { // If the server is handling a GET request, this will just list out all the // webhook events that have been recorded in our KV database. const iter = kv.list({ prefix: ["github"] }); const github = []; for await (const res of iter) { github.push({ timestamp: res.key[1], payload: res.value, }); } return new Response(JSON.stringify(github, null, 2)); } }); ``` --- # Acceptable use policy URL: https://docs.deno.com/deploy/manual/acceptable-use-policy The Deno Deploy service includes resources (CPU time, request counts) that are subject to this Acceptable Use policy. 
This document gives a rough outline of what we consider "Acceptable Use", and what we do not. ### Examples of Acceptable Use - ✅ Server-side rendered websites - ✅ Jamstack sites and apps - ✅ Single page applications - ✅ APIs that query a DB or external API - ✅ A personal blog - ✅ A company website - ✅ An e-commerce site - ✅ Reverse proxy ### Not Acceptable Use - ❌ Crypto mining - ❌ Highly CPU-intensive load (e.g. machine learning) - ❌ Media hosting for external sites - ❌ Scrapers - ❌ Forward proxy - ❌ VPN ## Guidelines We expect most projects to fall well within the usage limits. We will notify you if your project's usage significantly deviates from the norm. We will reach out to you where possible before taking any action to address unreasonable burdens on our infrastructure. --- # CI and GitHub Actions URL: https://docs.deno.com/deploy/manual/ci_github Deno Deploy's Git integration enables deployment of code changes that are pushed to a GitHub repository. Commits on the production branch will be deployed as a production deployment. Commits on all other branches will be deployed as a preview deployment. There are two modes of operation for the Git integration: - **Automatic**: Deno Deploy will automatically pull code and assets from your repository source every time you push, and deploy it. This mode is very fast, but does not allow for a build step. _This is the recommended mode for most users._ - **GitHub Actions**: In this mode, you push your code and assets to Deno Deploy from a GitHub Actions workflow. This allows you to perform a build step before deploying. Deno Deploy will select an appropriate mode based on your custom deployment configuration. Below, we go into more detail about the different configurations for **Automatic** and **GitHub Actions** mode. ## Automatic If your project doesn't require any additional build steps, then the system chooses **Automatic** mode. The entrypoint file is simply the file that Deno Deploy will run.
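To make Automatic mode concrete, here is a minimal sketch of what such an entrypoint file might contain (the file name `main.ts`, the handler, and the response text are hypothetical, not taken from the Deno Deploy docs). The request handler is kept as an exported plain function so it can be exercised without starting a server; the entrypoint itself would hand it to `Deno.serve`:

```typescript
// Hypothetical main.ts: in Automatic mode, Deno Deploy runs this file directly.
// The handler is a plain function from Request to Response, which keeps it
// trivial to unit test; the entrypoint would pass it to Deno.serve:
//
//   Deno.serve(handler);

export function handler(_req: Request): Response {
  return new Response("Hello from Deno Deploy!", {
    headers: { "content-type": "text/plain" },
  });
}
```

Because the handler is a pure function, calling `handler(new Request("https://example.com/"))` directly returns the `Response`, which is handy for unit tests.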
## GitHub Actions If you enter a command in **Install Step** and/or **Build Step** in the **Project Configuration**, Deno Deploy will create the necessary GitHub Actions workflow file and push it into your repository. In this workflow file, we leverage the `deployctl` [GitHub Action][deploy-action] to deploy your project. You can do whatever you need to do, such as running a build command, before deploying it to Deno Deploy. To configure the preprocessing commands you want to run, click the **Show advanced options** button that appears after choosing your Git repository, then enter values into the input boxes as needed. :::tip For example, if you want to enable [ahead-of-time builds] for a Fresh project, you will enter `deno task build` in the **Build Step** box. See also [the Fresh doc][Deploy to production] for deploying a Fresh project to Deno Deploy. ::: The GitHub Actions workflow file that Deno Deploy generates and pushes to your repository looks as follows. ```yml title=".github/workflows/deploy.yml" name: Deploy on: push: branches: main pull_request: branches: main jobs: deploy: name: Deploy runs-on: ubuntu-latest permissions: id-token: write # Needed for auth with Deno Deploy contents: read # Needed to clone the repository steps: - name: Clone repository uses: actions/checkout@v4 - name: Install Deno uses: denoland/setup-deno@v2 with: deno-version: v2.x - name: Build step run: "deno task build" - name: Upload to Deno Deploy uses: denoland/deployctl@v1 with: project: "" entrypoint: "main.ts" root: "." ``` See [deployctl README](https://github.com/denoland/deployctl/blob/main/action/README.md) for more details.
[fileserver]: https://jsr.io/@std/http#file-server [ghapp]: https://github.com/apps/deno-deploy [deploy-action]: https://github.com/denoland/deployctl/blob/main/action/README.md [ahead-of-time builds]: https://fresh.deno.dev/docs/concepts/ahead-of-time-builds [Deploy to production]: https://fresh.deno.dev/docs/getting-started/deploy-to-production --- # Custom domains URL: https://docs.deno.com/deploy/manual/custom-domains By default a project can be reached at its preview URL, which is `$PROJECT_ID.deno.dev`, e.g. `dead-clam-55.deno.dev`. You can also add a custom domain by following the instructions below. ## **Step 1:** Add your custom domain in the Deno Deploy dashboard 1. Click the "Settings" button on the project page, then select "Domains" from the sidebar. 2. Enter the domain name you wish to add to the project and press "Add." Note that you must own the domain that you want to add to a project. If you do not own a domain yet, you can register one at a domain registrar like Google Domains, Namecheap, or gandi.net. ![add_custom_domain](../docs-images/add_custom_domain.png) 3. The domain is added to the domains list and will have a "setup" badge. 4. Click on the "setup" badge to visit the domain setup page, which will display the list of DNS records that need to be created/updated for your domain. ![dns_records_modal](../docs-images/dns_records_modal.png) ## **Step 2:** Update your custom domain's DNS records Go to the DNS configuration panel of your domain registrar (or the service you're using to manage DNS) and enter the records as described on the domain setup page. ![change_dns_records](../docs-images/change_dns_records.png) ## **Step 3:** Validate that the DNS records have been updated Go back to the Deno Deploy dashboard and click the **Validate** button on the domain setup page. It will check if the DNS records are correctly set and if so, update the status to "Validated, awaiting certificate provisioning." 
![get_certificates](../docs-images/get_certificates.png) ## **Step 4:** Provision a certificate for your custom domain At this point you have two options. 99% of the time, you should choose the first option. 1. Let us automatically provision a certificate using Let's Encrypt. To do this, press the **Get automatic certificates** button. Provisioning a TLS certificate can take up to a minute. It is possible that the provisioning fails if your domain specifies a CAA record that prevents [Let's Encrypt](https://letsencrypt.org/) from provisioning certificates. Certificates will be automatically renewed around 30 days before the certificate expires. Once certificates have been issued successfully, you will see a green checkmark like this: ![green_check](../docs-images/green_check.png) 2. Manually upload a certificate and private key. To manually upload a certificate chain and private key, press the **Upload your own certificates** button. You will be prompted to upload a certificate chain and private key. The certificate chain needs to be complete and valid, and your leaf certificate needs to be at the top of the chain. --- # Using deployctl on the command line URL: https://docs.deno.com/deploy/manual/deployctl `deployctl` is a command line tool (CLI) that lets you operate the Deno Deploy platform without leaving your terminal. With it you can deploy your code, create and manage your projects and their deployments, and monitor their usage and logs. ## Dependencies The only dependency for `deployctl` is the Deno runtime. You can install it by running the following command: ```sh curl -fsSL https://deno.land/install.sh | sh ``` You don't need to set up a Deno Deploy account beforehand. It will be created along the way when you deploy your first project.
## Install `deployctl` With the Deno runtime installed, you can install the `deployctl` utility with the following command: ```sh deno install -gArf jsr:@deno/deployctl ``` The `-A` option in the deno install command grants all permissions to the installed script. You can opt not to use it, in which case you will be prompted to grant the necessary permissions when needed during the execution of the tool. ## Deploy To perform a new deployment of your code, navigate to the root directory of your project and execute: ```shell deployctl deploy ``` ### Project and Entrypoint If this is the first deployment of the project, `deployctl` will guess the project name based on the Git repo or directory it is in. Similarly, it will guess the entrypoint by looking for files with common entrypoint names (main.ts, src/main.ts, etc). After the first deployment, the settings used will be stored in a config file (by default deno.json). You can specify the project name and/or the entrypoint using the `--project` and `--entrypoint` arguments respectively. If the project does not exist, it will be created automatically. By default it is created in the personal organization of the user, but it can also be created in a custom organization by specifying the `--org` argument. If the organization does not exist yet, it will also be created automatically. ```shell deployctl deploy --project=helloworld --entrypoint=src/entrypoint.ts --org=my-team ``` ### Include and Exclude Files By default, deployctl deploys all the files in the current directory (recursively, except `node_modules` directories). You can customize this behavior using the `--include` and `--exclude` arguments (also supported in the config file). These arguments accept specific files, whole directories and globs. 
Here are some examples: - Include only source and static files: ```shell deployctl deploy --include=./src --include=./static ``` - Include only TypeScript files: ```shell deployctl deploy --include=**/*.ts ``` - Exclude local tooling and artifacts: ```shell deployctl deploy --exclude=./tools --exclude=./benches ``` A common pitfall is not including the source code modules that need to be run (entrypoint and dependencies). The following example will fail because `main.ts` is not included: ```shell deployctl deploy --include=./static --entrypoint=./main.ts ``` The entrypoint can also be a remote script. A common use case for this is to deploy a static site using `std/http/file_server.ts` (more details in [Static Site Tutorial](https://docs.deno.com/deploy/tutorials/static-site)): ```shell deployctl deploy --include=dist --entrypoint=jsr:@std/http/file-server ``` ### Environment variables You can set env variables using `--env` (to set individual environment variables) or `--env-file` (to load one or more environment files). These options can be combined and used multiple times: ```shell deployctl deploy --env-file --env-file=.other-env --env=DEPLOYMENT_TS=$(date +%s) ``` The deployment will have access to these variables using `Deno.env.get()`. Be aware that the env variables set with `--env` and `--env-file` are specific to the deployment being created and are not added to the list of [env variables configured for the project](./environment-variables.md). ### Production Deployments Each deployment you create has a unique URL. In addition, a project has a "production URL" and custom domains that route traffic to its "production" deployment. Deployments can be promoted to production at any time, or created directly as production using the `--prod` flag: ```shell deployctl deploy --prod ``` Learn more about production deployments in the [Deployments](./deployments) docs. ## Deployments The `deployments` subcommand groups all the operations around deployments.
### List You can list the deployments of a project with: ```shell deployctl deployments list ``` Output: ``` ✔ Page 1 of the list of deployments of the project 'my-project' is ready ┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐ │ Deployment │ Date │ Status │ Database │ Domain │ Entrypoint │ Branch │ Commit │ ├───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤ │ kcbxc4xwe4mc │ 12/3/2024 13:21:40 CET (2 days) │ Preview │ Preview │ https://my-project-kcbxc4xwe4mc.deno.dev │ main.ts │ main │ 4b6c506 │ │ c0ph5xa9exb3 │ 12/3/2024 13:21:25 CET (2 days) │ Production │ Production │ https://my-project-c0ph5xa9exb3.deno.dev │ main.ts │ main │ 4b6c506 │ │ kwkbev9er4h2 │ 12/3/2024 13:21:12 CET (2 days) │ Preview │ Preview │ https://my-project-kwkbev9er4h2.deno.dev │ main.ts │ main │ 4b6c506 │ │ dxseq0jc8402 │ 6/3/2024 23:16:51 CET (8 days) │ Preview │ Production │ https://my-project-dxseq0jc8402.deno.dev │ main.ts │ main │ 099359b │ │ 7xr5thz8yjbz │ 6/3/2024 22:58:32 CET (8 days) │ Preview │ Preview │ https://my-project-7xr5thz8yjbz.deno.dev │ main.ts │ another │ a4d2953 │ │ 4qr4h5ac3rfn │ 6/3/2024 22:57:05 CET (8 days) │ Failed │ Preview │ n/a │ main.ts │ another │ 56d2c88 │ │ 25wryhcqmb9q │ 6/3/2024 22:56:41 CET (8 days) │ Preview │ Preview │ https://my-project-25wryhcqmb9q.deno.dev │ main.ts │ another │ 4b6c506 │ │ 64tbrn8jre9n │ 6/3/2024 8:21:33 CET (8 days) │ Preview │ Production │ https://my-project-64tbrn8jre9n.deno.dev │ main.ts │ main │ 4b6c506 │ │ hgqgccnmzg04 │ 6/3/2024 8:17:40 CET (8 days) │ Failed │ Production │ n/a │ main.ts │ main │ 8071902 │ │ rxkh1w3g74e8 │ 6/3/2024 8:17:28 CET (8 days) │ Failed │ Production │ n/a │ main.ts │ main │ b142a59 │ │ wx6cw9aya64c │ 6/3/2024 8:02:29 CET (8 days) │ Preview │ Production │ 
https://my-project-wx6cw9aya64c.deno.dev │ main.ts │ main │ b803784 │ │ a1qh5fmew2yf │ 5/3/2024 16:25:29 CET (9 days) │ Preview │ Production │ https://my-project-a1qh5fmew2yf.deno.dev │ main.ts │ main │ 4bb1f0f │ │ w6pf4r0rrdkb │ 5/3/2024 16:07:35 CET (9 days) │ Preview │ Production │ https://my-project-w6pf4r0rrdkb.deno.dev │ main.ts │ main │ 6e487fc │ │ nn700gexgdzq │ 5/3/2024 13:37:11 CET (9 days) │ Preview │ Production │ https://my-project-nn700gexgdzq.deno.dev │ main.ts │ main │ c5b1d1f │ │ 98crfqxa6vvf │ 5/3/2024 13:33:52 CET (9 days) │ Preview │ Production │ https://my-project-98crfqxa6vvf.deno.dev │ main.ts │ main │ 090146e │ │ xcdcs014yc5p │ 5/3/2024 13:30:58 CET (9 days) │ Preview │ Production │ https://my-project-xcdcs014yc5p.deno.dev │ main.ts │ main │ 5b78c0f │ │ btw43kx89ws1 │ 5/3/2024 13:27:31 CET (9 days) │ Preview │ Production │ https://my-project-btw43kx89ws1.deno.dev │ main.ts │ main │ 663452a │ │ 62tg1ketkjx7 │ 5/3/2024 13:27:03 CET (9 days) │ Preview │ Production │ https://my-project-62tg1ketkjx7.deno.dev │ main.ts │ main │ 24d1618 │ │ 07ag6pt6kjex │ 5/3/2024 13:19:11 CET (9 days) │ Preview │ Production │ https://my-project-07ag6pt6kjex.deno.dev │ main.ts │ main │ 4944545 │ │ 4msyne1rvwj1 │ 5/3/2024 13:17:16 CET (9 days) │ Preview │ Production │ https://my-project-4msyne1rvwj1.deno.dev │ main.ts │ main │ dda85e1 │ └───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘ Press enter to fetch the next page [Enter] ``` This command outputs pages of 20 deployments by default. You can iterate over the pages with the enter key, and use the `--page` and `--limit` options to query a specific page and page size. Like with the rest of commands, you can use the `--project` option to specify the project of which to list deployments, if you are not in a project directory or want to list deployments from a different project. 
### Show Get all the details of a particular deployment using: ```shell deployctl deployments show ``` Output: ``` ✔ The production deployment of the project 'my-project' is 'c0ph5xa9exb3' ✔ The details of the deployment 'c0ph5xa9exb3' are ready: c0ph5xa9exb3 ------------ Status: Production Date: 2 days, 12 hours, 29 minutes, 46 seconds ago (12/3/2024 13:21:25 CET) Project: my-project (e54f23b5-828d-4b7f-af12-706d4591062b) Organization: my-team (d97822ac-ee20-4ce9-b942-5389330b57ee) Domain(s): https://my-project.deno.dev https://my-project-c0ph5xa9exb3.deno.dev Database: Production (0efa985f-3793-48bc-8c05-f740ffab4ca0) Entrypoint: main.ts Env Vars: HOME Git Ref: main [4b6c506] Message: change name Author: John Doe @johndoe [mailto:johndoe@deno.com] Url: https://github.com/arnauorriols/my-project/commit/4b6c50629ceeeb86601347732d01dc7ed63bf34f Crons: another cron [*/10 * * * *] succeeded at 15/3/2024 1:50:00 CET after 2 seconds (next at 15/3/2024 2:00:00 CET) newest cron [*/10 * * * *] n/a yet another cron [*/10 * * * *] failed at 15/3/2024 1:40:00 CET after 2 seconds (next at 15/3/2024 1:51:54 CET) ``` If no deployment is specified, the command shows the details of the current production deployment of the project. To see the details of the last deployment, use `--last`, and to see the details of a particular deployment, use `--id` (or positional argument). You can also use `--next` or `--prev` to navigate the deployments chronologically. For example, to see the details of the second to last deployment, you can do: ```shell deployctl deployments show --last --prev ``` And to see the details of 2 deployments after a specific deployment: ```shell deployctl deployments show 64tbrn8jre9n --next=2 ``` ### Redeploy The redeploy command creates a new deployment reusing the build of an existing deployment, for the purpose of changing the resources associated with it. This includes production domains, environment variables and KV databases. 
:::info The semantics of selecting the deployment to redeploy are the same as those of the [show subcommand](#show), including `--last`, `--id`, `--next` and `--prev`. ::: #### Production Domains If you want to change the routing of the production domains of the project to a particular deployment, you can redeploy it with the `--prod` option: ```shell deployctl deployments redeploy --prod 64tbrn8jre9n ``` This will create a new deployment with the same code and environment variables as the specified deployment, but with the production domains of the project pointing to it. For those projects with preview/prod databases (i.e. projects linked to GitHub), this will also set the production database for the new deployment. :::note This feature is similar to the "promote to production" button found in the Deno Deploy web application, with the exception that the "promote to production" button does not create a new deployment; instead, it changes the domain routing in place. However, it is restricted to deployments already using the production database. ::: #### KV Database If this is a GitHub deployment, it will have two databases, one for prod deployments and one for preview deployments. You can change the database of a deployment by redeploying it with the `--db` option: ```shell deployctl deployments redeploy --db=prod --id=64tbrn8jre9n ``` :::note When redeploying a deployment to prod, by default it will automatically be configured to use the prod database. You can combine the `--prod` and `--db` options to opt out of this behavior. For example, the following command will redeploy the current production deployment (given the lack of a positional argument, `--id` or `--last`).
The new deployment will become the new production deployment, but it will use the preview database instead of the production database: ```shell deployctl deployments redeploy --prod --db=preview ``` ::: If your organization has custom databases, you can also set them by UUID: ```shell deployctl deployments redeploy --last --db=5261e096-f9aa-4b72-8440-1c2b5b553def ``` #### Environment Variables When a deployment is created, it inherits the environment variables of the project. Given that deployments are immutable, their environment variables can never be changed. To set new environment variables in a deployment, you need to redeploy it using `--env` (to set individual variables) and `--env-file` (to load one or more environment files). The following command redeploys the current production deployment with the env variables defined in the `.env` and `.other-env` files, plus the `DEPLOYMENT_TS` variable set to the current timestamp. The resulting deployment will be a preview deployment (i.e. the production domains won't route traffic to it, given the lack of `--prod`). ```shell deployctl deployments redeploy --env-file --env-file=.other-env --env=DEPLOYMENT_TS=$(date +%s) ``` :::note Be aware that when changing env variables, only the env variables set in the redeploy command will be used by the new deployment. The project env variables and the env variables of the deployment being redeployed are ignored. If this does not suit your needs, please report your feedback at https://github.com/denoland/deploy_feedback/issues/ ::: :::note When you change the project environment variables in the Deno Deploy web application, the current production deployment is redeployed with the new environment variables, and the new deployment becomes the new production deployment.
::: ### Delete You can delete a deployment using the `delete` subcommand: ```shell deployctl deployments delete 64tbrn8jre9n ``` Like `show` and `redeploy`, `delete` can also use `--last`, `--next` and `--prev` to select the deployment to delete. Here's an example command that deletes all the deployments of a project except the last (use with caution!): ```shell while deployctl deployments delete --project=my-project --last --prev; do :; done ``` ## Projects The `projects` subcommand groups all the operations against projects as a whole. This includes `list`, `show`, `rename`, `create` and `delete`. ### List `deployctl projects list` outputs all the projects your user has access to, grouped by organization: ``` Personal org: blog url-shortener 'my-team' org: admin-site main-site analytics ``` You can filter by organization using `--org`: ```shell deployctl projects list --org=my-team ``` ### Show To see the details of a particular project, use `projects show`. If you are inside a project, it will pick up the project id from the config file.
You can also specify the project using `--project` or the positional argument: ```shell deployctl projects show main-site ``` Output: ``` main-site --------- Organization: my-team (5261e096-f9aa-4b72-8440-1c2b5b553def) Domain(s): https://my-team.com https://main-site.deno.dev Dash URL: https://dash.deno.com/projects/8422c515-f68f-49b2-89f3-157f4b144611 Repository: https://github.com/my-team/main-site Databases: [main] dd28e63e-f495-416b-909a-183380e3a232 [*] e061c76e-4445-409a-bc36-a1a9040c83b3 Crons: another cron [*/10 * * * *] succeeded at 12/3/2024 14:40:00 CET after 2 seconds (next at 12/3/2024 14:50:00 CET) newest cron [*/10 * * * *] n/a yet another cron [*/10 * * * *] failed at 12/3/2024 14:40:00 CET after 2 seconds (next at 12/3/2024 14:50:00 CET) Deployments: kcbxc4xwe4mc c0ph5xa9exb3* kwkbev9er4h2 dxseq0jc8402 7xr5thz8yjbz 4qr4h5ac3rfn 25wryhcqmb9q 64tbrn8jre9n hgqgccnmzg04 rxkh1w3g74e8 wx6cw9aya64c a1qh5fmew2yf w6pf4r0rrdkb nn700gexgdzq 98crfqxa6vvf xcdcs014yc5p btw43kx89ws1 62tg1ketkjx7 07ag6pt6kjex 4msyne1rvwj1 ``` ### Rename Projects can be renamed easily with the `rename` subcommand. Similarly to the other commands, if you run the command from within a project's directory, you don't need to specify the current name of the project: ```shell deployctl projects rename my-personal-blog ``` Output: ``` ℹ Using config file '/private/tmp/blog/deno.json' ✔ Project 'blog' (8422c515-f68f-49b2-89f3-157f4b144611) found ✔ Project 'blog' renamed to 'my-personal-blog' ``` :::note Keep in mind that the name of the project is part of the preview domains (https://my-personal-blog-kcbxc4xwe4mc.deno.dev) and the default production domain (https://my-personal-blog.deno.dev). Therefore, when changing the project name, the URLs with the previous name will no longer route to the project's corresponding deployments. 
::: ### Create You can create an empty project with: ```shell deployctl projects create my-new-project ``` ### Delete You can delete a project with: ```shell deployctl projects delete my-new-project ``` ## Top The `top` subcommand is used to monitor the resource usage of a project in real-time: ```shell deployctl top ``` Output: ``` ┌────────┬────────────────┬────────────────────────┬─────────┬───────┬─────────┬──────────┬─────────────┬────────────┬─────────┬─────────┬───────────┬───────────┐ │ (idx) │ deployment │ region │ Req/min │ CPU% │ CPU/req │ RSS/5min │ Ingress/min │ Egress/min │ KVr/min │ KVw/min │ QSenq/min │ QSdeq/min │ ├────────┼────────────────┼────────────────────────┼─────────┼───────┼─────────┼──────────┼─────────────┼────────────┼─────────┼─────────┼───────────┼───────────┤ │ 6b80e8 │ "kcbxc4xwe4mc" │ "asia-northeast1" │ 80 │ 0.61 │ 4.56 │ 165.908 │ 11.657 │ 490.847 │ 0 │ 0 │ 0 │ 0 │ │ 08312f │ "kcbxc4xwe4mc" │ "asia-northeast1" │ 76 │ 3.49 │ 27.58 │ 186.278 │ 19.041 │ 3195.288 │ 0 │ 0 │ 0 │ 0 │ │ 77c10b │ "kcbxc4xwe4mc" │ "asia-south1" │ 28 │ 0.13 │ 2.86 │ 166.806 │ 7.354 │ 111.478 │ 0 │ 0 │ 0 │ 0 │ │ 15e356 │ "kcbxc4xwe4mc" │ "asia-south1" │ 66 │ 0.97 │ 8.93 │ 162.288 │ 17.56 │ 4538.371 │ 0 │ 0 │ 0 │ 0 │ │ a06817 │ "kcbxc4xwe4mc" │ "asia-southeast1" │ 126 │ 0.44 │ 2.11 │ 140.087 │ 16.504 │ 968.794 │ 0 │ 0 │ 0 │ 0 │ │ d012b6 │ "kcbxc4xwe4mc" │ "asia-southeast1" │ 119 │ 2.32 │ 11.72 │ 193.704 │ 23.44 │ 8359.829 │ 0 │ 0 │ 0 │ 0 │ │ 7d9a3d │ "kcbxc4xwe4mc" │ "australia-southeast1" │ 8 │ 0.97 │ 75 │ 158.872 │ 10.538 │ 3.027 │ 0 │ 0 │ 0 │ 0 │ │ 3c21be │ "kcbxc4xwe4mc" │ "australia-southeast1" │ 1 │ 0.04 │ 90 │ 105.292 │ 0.08 │ 1.642 │ 0 │ 0 │ 0 │ 0 │ │ b75dc7 │ "kcbxc4xwe4mc" │ "europe-west2" │ 461 │ 5.43 │ 7.08 │ 200.573 │ 63.842 │ 9832.936 │ 0 │ 0 │ 0 │ 0 │ │ 33607e │ "kcbxc4xwe4mc" │ "europe-west2" │ 35 │ 0.21 │ 3.69 │ 141.98 │ 9.438 │ 275.788 │ 0 │ 0 │ 0 │ 0 │ │ 9be3d2 │ "kcbxc4xwe4mc" │ "europe-west2" │ 132 │ 0.92 │ 4.19 │ 180.654 │ 15.959 │ 
820.513 │ 0 │ 0 │ 0 │ 0 │ │ 33a859 │ "kcbxc4xwe4mc" │ "europe-west3" │ 1335 │ 7.57 │ 3.4 │ 172.032 │ 178.064 │ 10967.918 │ 0 │ 0 │ 0 │ 0 │ │ 3f54ce │ "kcbxc4xwe4mc" │ "europe-west4" │ 683 │ 4.76 │ 4.19 │ 187.802 │ 74.696 │ 7565.017 │ 0 │ 0 │ 0 │ 0 │ │ cf881c │ "kcbxc4xwe4mc" │ "europe-west4" │ 743 │ 3.95 │ 3.19 │ 177.213 │ 86.974 │ 6087.454 │ 0 │ 0 │ 0 │ 0 │ │ b4565b │ "kcbxc4xwe4mc" │ "me-west1" │ 3 │ 0.21 │ 55 │ 155.46 │ 2.181 │ 0.622 │ 0 │ 0 │ 0 │ 0 │ │ b97970 │ "kcbxc4xwe4mc" │ "southamerica-east1" │ 3 │ 0.08 │ 25 │ 186.049 │ 1.938 │ 0.555 │ 0 │ 0 │ 0 │ 0 │ │ fd7a08 │ "kcbxc4xwe4mc" │ "us-east4" │ 3 │ 0.32 │ 80 │ 201.101 │ 0.975 │ 58.495 │ 0 │ 0 │ 0 │ 0 │ │ 95d68a │ "kcbxc4xwe4mc" │ "us-east4" │ 133 │ 1.05 │ 4.77 │ 166.052 │ 28.107 │ 651.737 │ 0 │ 0 │ 0 │ 0 │ │ c473e7 │ "kcbxc4xwe4mc" │ "us-east4" │ 0 │ 0 │ 0 │ 174.154 │ 0.021 │ 0 │ 0 │ 0 │ 0 │ 0 │ │ ebabfb │ "kcbxc4xwe4mc" │ "us-east4" │ 19 │ 0.15 │ 4.78 │ 115.732 │ 7.764 │ 67.054 │ 0 │ 0 │ 0 │ 0 │ │ eac700 │ "kcbxc4xwe4mc" │ "us-south1" │ 114 │ 2.37 │ 12.54 │ 183.001 │ 18.401 │ 22417.397 │ 0 │ 0 │ 0 │ 0 │ │ cd2194 │ "kcbxc4xwe4mc" │ "us-south1" │ 35 │ 0.33 │ 5.68 │ 145.871 │ 8.142 │ 91.236 │ 0 │ 0 │ 0 │ 0 │ │ 140fec │ "kcbxc4xwe4mc" │ "us-west2" │ 110 │ 1.43 │ 7.84 │ 115.298 │ 18.093 │ 977.993 │ 0 │ 0 │ 0 │ 0 │ │ 51689f │ "kcbxc4xwe4mc" │ "us-west2" │ 1105 │ 7.66 │ 4.16 │ 187.277 │ 154.876 │ 14648.383 │ 0 │ 0 │ 0 │ 0 │ │ c5806e │ "kcbxc4xwe4mc" │ "us-west2" │ 620 │ 4.38 │ 4.24 │ 192.291 │ 109.086 │ 9685.688 │ 0 │ 0 │ 0 │ 0 │ └────────┴────────────────┴────────────────────────┴─────────┴───────┴─────────┴──────────┴─────────────┴────────────┴─────────┴─────────┴───────────┴───────────┘ ⠼ Streaming... ``` The columns are defined as follows: | Column | Description | | ----------- | -------------------------------------------------------------------------------------------------- | | idx | Instance discriminator. Opaque id to discriminate different executions running in the same region. 
| | deployment | The id of the deployment running in the executing instance. | | Req/min | Requests per minute received by the project. | | CPU% | Percentage of CPU used by the project. | | CPU/req | CPU time per request, in milliseconds. | | RSS/5min | Max RSS used by the project during the last 5 minutes, in MB. | | Ingress/min | Data received by the project per minute, in KB. | | Egress/min | Data output by the project per minute, in KB. | | KVr/min | KV reads performed by the project per minute. | | KVw/min | KV writes performed by the project per minute. | | QSenq/min | Queues enqueues performed by the project per minute. | | QSdeq/min | Queues dequeues performed by the project per minute. | You can filter by region using `--region`, which accepts substrings and can be used multiple times: ```shell deployctl top --region=asia --region=southamerica ``` ## Logs You can fetch the logs of your deployments with `deployctl logs`. It supports both live logs, where logs are streamed to the console as they are generated, and querying persisted logs, where logs generated in the past are fetched. To show the live logs of the current production deployment of a project: ```shell deployctl logs ``` :::note Unlike in the Deno Deploy web application, at the moment the logs subcommand does not automatically switch to the new production deployment when it changes.
:::

To show the live logs of a particular deployment:

```shell
deployctl logs --deployment=1234567890ab
```

Logs can be filtered by level, region, and text using the `--levels`, `--regions`, and `--grep` options:

```shell
deployctl logs --levels=error,info --regions=region1,region2 --grep='unexpected'
```

To show persisted logs, use the `--since` and/or `--until` options. With BSD `date` (macOS):

```sh
deployctl logs --since=$(date -Iseconds -v-2H) --until=$(date -Iseconds -v-30M)
```

With GNU `date` (Linux):

```sh
deployctl logs --since=$(date -Iseconds --date='2 hours ago') --until=$(date -Iseconds --date='30 minutes ago')
```

## API

If you use the [subhosting API](../../subhosting/manual/index.md), `deployctl api` will help you interact with the API by handling the authentication and headers for you:

```shell
deployctl api /projects/my-personal-blog/deployments
```

Use `--method` and `--body` to specify the HTTP method and the request body:

```shell
deployctl api --method=POST --body='{"name": "main-site"}' organizations/5261e096-f9aa-4b72-8440-1c2b5b553def/projects
```

## Local Development

For local development you can use the `deno` CLI. To install `deno`, follow the instructions in the [Deno manual](https://deno.land/manual/getting_started/installation). After installation, you can run your scripts locally:

```shell
$ deno run --allow-net=:8000 ./main.ts
Listening on http://localhost:8000
```

To watch for file changes, add the `--watch` flag:

```shell
$ deno run --allow-net=:8000 --watch ./main.ts
Listening on http://localhost:8000
```

For more information about the Deno CLI, and how to configure your development environment and IDE, visit the Deno Manual's [Getting Started][manual-gs] section.

[manual-gs]: https://deno.land/manual/getting_started

## JSON output

All the commands that output data have a `--format=json` option that outputs the data as JSON objects. This output mode is the default when stdout is not a TTY, notably when piping to another command.
Together with `jq`, this mode enables the programmatic use of all the data provided by `deployctl`.

Get the id of the current production deployment:

```shell
deployctl deployments show | jq .build.deploymentId
```

Get a CSV stream of the CPU time per request on each isolate of each region:

```shell
deployctl top | jq -r '[.id,.region,.cpuTimePerRequest] | @csv'
```

---

# Deployments

URL: https://docs.deno.com/deploy/manual/deployments

A deployment is a snapshot of the code and environment variables required to run an application. A new deployment can be created [via `deployctl`](./deployctl.md#deploy) or automatically via Deploy's GitHub integration, if configured. Deployments are immutable after they have been created. To deploy a new version of the code for an application, a new deployment must be created. Once created, deployments remain accessible.

All available deployments are listed on your project page under the `Deployments` tab, pictured below. Old deployments can be deleted [via `deployctl`](./deployctl.md#delete) and [via API](https://apidocs.deno.com/#delete-/deployments/-deploymentId-).

![showing the deployments tab in the project dashboard](./images/project_deployments.png)

## Custom domains

Other URLs, such as [custom domains](custom-domains), can also point to a deployment.

## Branch domains

`{project_name}--{branch_name}.deno.dev` is also supported.

## Production vs. preview deployments

All deployments have a preview URL that can be used to view this specific deployment. Preview URLs have the format `{project_name}-{deployment_id}.deno.dev`.

![image](../docs-images/preview_deployment.png)

A deployment can be either a production or a preview deployment. These deployments do not have any differences in runtime functionality. The only distinguishing factor is that a project's production deployment receives traffic from the project URL (e.g. `myproject.deno.dev`) and from custom domains, in addition to traffic to the deployment's preview URL.
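As a quick illustration of the preview URL format described above, here is a tiny helper (hypothetical, for illustration only; it is not part of any Deploy API) that derives a deployment's preview URL from its project name and deployment id:

```typescript
// Hypothetical helper: builds the preview URL for a deployment following the
// `{project_name}-{deployment_id}.deno.dev` format described above.
function previewUrl(projectName: string, deploymentId: string): string {
  return `https://${projectName}-${deploymentId}.deno.dev`;
}

// e.g. previewUrl("myproject", "1234567890ab")
//   → "https://myproject-1234567890ab.deno.dev"
```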
## Promoting preview deployments to production deployments via Deno Deploy UI

Preview deployments can be "promoted" to production via the Deno Deploy UI:

1. Navigate to the project page.
2. Click on the **Deployments** tab.
3. Click on the three dots next to the deployment you want to promote to production and select **Promote to Production**.

![promote_to_production](../docs-images/promote_to_production.png)

Promoting deployments to production is restricted to deployments that already use the production KV database. This is particularly relevant for GitHub deployments, which use a different database for preview and production deployments. Deployments (even those that use the preview KV database) can always be redeployed to production using [the `deployctl deployments redeploy` command](./deployctl.md#production-domains).

## Creating production deployments via `deployctl`

If you are deploying your Deno code with `deployctl`, you can deploy directly to production with the `--prod` flag:

```sh
deployctl deploy --prod --project=helloworld main.ts
```

---

# Connect to DynamoDB

URL: https://docs.deno.com/deploy/manual/dynamodb

Amazon DynamoDB is a fully managed NoSQL database. To persist data to DynamoDB, follow the steps below. This tutorial assumes that you have an AWS and a Deno Deploy account. You can find a more comprehensive tutorial that builds a sample application on top of DynamoDB [here](../tutorials/tutorial-dynamodb).

## Gather credentials from DynamoDB

The first step in the process is to generate AWS credentials to programmatically access DynamoDB.

Generate credentials:

1. Go to https://console.aws.amazon.com/iam/ and go to the "Users" section.
2. Click on the **Add user** button, fill the **User name** field (maybe use `denamo`), and select the **Programmatic access** type.
3. Click on **Next: Permissions**, then on **Attach existing policies directly**, search for `AmazonDynamoDBFullAccess` and select it.
4.
Click on **Next: Tags**, then on **Next: Review** and finally **Create user**.
5. Click on the **Download .csv** button to download the credentials.

## Create a project in Deno Deploy

Next, let's create a project in Deno Deploy and set it up with the requisite environment variables:

1. Go to [https://dash.deno.com/new](https://dash.deno.com/new) (Sign in with GitHub if you didn't already) and click on **+ Empty Project** under **Deploy from the command line**.
2. Now click on the **Settings** button available on the project page.
3. Navigate to the **Environment Variables** section and add the following secrets.

- `AWS_ACCESS_KEY_ID` - Use the value that's available under the **Access key ID** column in the downloaded CSV.
- `AWS_SECRET_ACCESS_KEY` - Use the value that's available under the **Secret access key** column in the downloaded CSV.

## Write code that connects to DynamoDB

AWS has an [official SDK](https://www.npmjs.com/package/@aws-sdk/client-dynamodb) that works with browsers. As most of Deno Deploy's APIs are similar to the browsers', the same SDK works with Deno Deploy. To use the SDK in Deno, import it from a CDN and create a client:

```js
import {
  DynamoDBClient,
  GetItemCommand,
  PutItemCommand,
} from "https://esm.sh/@aws-sdk/client-dynamodb?dts";

// Create a client instance by providing your region information.
// The credentials are automatically obtained from the environment variables we
// set during our project creation step on Deno Deploy, so we don't have to
// pass them manually here.
const client = new DynamoDBClient({ region: "us-east-1" });

Deno.serve(async (request) => {
  const url = new URL(request.url);
  if (url.pathname !== "/songs") {
    return new Response("Not Found", { status: 404 });
  }
  try {
    // Reads and writes are issued by sending commands (e.g. GetItemCommand or
    // PutItemCommand) with the client. The "songs" table and its "title" key
    // are examples; substitute your own table schema.
    const data = await client.send(
      new GetItemCommand({
        TableName: "songs",
        Key: { title: { S: "Old Town Road" } },
      }),
    );
    return Response.json(data.Item ?? {});
  } catch (error) {
    // error handling.
    return new Response(String(error), { status: 500 });
  }
});
```

## Deploy application to Deno Deploy

Once you have finished writing your application, you can deploy it on Deno Deploy. To do this, go back to your project page at `https://dash.deno.com/projects/`.
You should see a couple of options to deploy:

- [GitHub integration](ci_github)
- [`deployctl`](./deployctl.md)

```sh
deployctl deploy --project=
```

Unless you want to add a build step, we recommend that you select the GitHub integration.

For more details on the different ways to deploy on Deno Deploy and the different configuration options, read [here](how-to-deploy).

---

# Edge Cache

URL: https://docs.deno.com/deploy/manual/edge-cache

The [Web Cache API](https://developer.mozilla.org/en-US/docs/Web/API/Cache) is supported on Deno Deploy. The cache is designed to provide microsecond-level read latency, multi-GB/s write throughput, and unbounded storage, with the tradeoff of best-effort consistency and durability.

```ts
const cache = await caches.open("my-cache");

Deno.serve(async (req) => {
  const cached = await cache.match(req);
  if (cached) {
    return cached;
  }

  const res = new Response("cached at " + new Date().toISOString());
  await cache.put(req, res.clone());
  return res;
});
```

Cached data is stored in the same Deno Deploy region that runs your code. Usually your isolate observes read-after-write (RAW) and write-after-write (WAW) consistency within the same region; however, in rare cases recent writes can be lost, reordered, or temporarily invisible.

## Expiration

By default, cached data is persisted for an indefinite period of time. While we periodically scan and delete inactive objects, an object is usually kept in cache for at least 30 days.

Edge Cache understands the standard HTTP response headers `Expires` and `Cache-Control`. You can use them to specify an expiration time for every cached object, for example:

```
Expires: Thu, 22 Aug 2024 01:22:31 GMT
```

or:

```
Cache-Control: max-age=86400
```

## Limitations

- If a response is not constructed from a `Uint8Array` or `string` body, the `Content-Length` header needs to be manually set.
- Deletion is not yet supported.
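To make the expiration headers concrete, here is a small sketch. The `cacheableResponse` helper and the one-hour TTL are illustrative (not part of the Deploy API); it simply attaches a `Cache-Control: max-age` header so that Edge Cache expires the stored object:

```typescript
// Illustrative helper: build a response that Edge Cache will expire after
// `maxAge` seconds, per the Cache-Control support described above.
function cacheableResponse(body: string, maxAge: number): Response {
  return new Response(body, {
    headers: {
      "Content-Type": "text/plain",
      "Cache-Control": `max-age=${maxAge}`,
    },
  });
}

// On Deno Deploy, this would plug into the caching handler shown earlier:
//
//   const cache = await caches.open("my-cache");
//   Deno.serve(async (req) => {
//     const cached = await cache.match(req);
//     if (cached) return cached;
//     const res = cacheableResponse("cached at " + new Date().toISOString(), 3600);
//     await cache.put(req, res.clone());
//     return res;
//   });
```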
---

# Environment variables

URL: https://docs.deno.com/deploy/manual/environment-variables

Environment variables are useful to store values like access tokens of web services. Each deployment has a set of environment variables defined at the moment of creation, accessible from the code via the `Deno.env` API. There are two ways to define the environment variables of a deployment:

## Project environment variables

You can define environment variables at the project level. When you create a deployment, it will get the set of environment variables the project has defined _at that particular moment_. For convenience, when you change the environment variables of a project, the current production deployment is _redeployed_, creating a new production deployment with the new set of environment variables.

:::note
Deployments are immutable, including their environment variables. Changing the environment variables of a project does not change the environment variables of existing deployments.
:::

To add an environment variable to your project, click on the **Settings** button on the project page and then on **Environment Variables** in the sidebar. Fill in the key/value fields and click on "Add" to add an environment variable to your project.

![environment_variable](../docs-images/fauna2.png)

Updating an existing environment variable works the same way. Click on the "Add Variable" button, enter the same name as the environment variable you wish to update, and enter the new value. Click on the "Save" button to complete the update.

## Deployment environment variables

When deploying using `deployctl`, you can specify environment variables [using the `--env` or `--env-file` flags](./deployctl.md#environment-variables), complementing the environment variables already defined for the project. You can also pass multiple `--env-file` arguments (e.g., `--env-file=.env.one --env-file=.env.two`) to include variables from multiple files.
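For example, a deployment could combine shared defaults with local overrides like this (the file names and project name are placeholders):

```shell
deployctl deploy --project=my-project \
  --env-file=.env.defaults --env-file=.env.local main.ts
```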
:::note
When multiple declarations for the same environment variable exist within a single `.env` file, the first occurrence is applied. However, if the same variable is defined across multiple `.env` files (using multiple `--env-file` arguments), the value from the last file specified takes precedence. This means that the first occurrence found in the last `.env` file listed will be applied.
:::

These environment variables will be specific to the deployment being created.

### Default environment variables

Every deployment has the following environment variables preset, which you can access from your code.

1. `DENO_REGION`

   It holds the region code of the region in which the deployment is running. You can use this variable to serve region-specific content. You can refer to the region code from the [regions page](regions).

2. `DENO_DEPLOYMENT_ID`

   It holds the ID of the deployment.

---

# Connect to FaunaDB

URL: https://docs.deno.com/deploy/manual/faunadb

FaunaDB calls itself "the data API for modern applications." It's a database with a GraphQL interface that enables you to use GraphQL to interact with it. Since you communicate with it using HTTP requests, you don't need to manage connections, which works well for serverless applications.

This tutorial covers how to connect to a Fauna database from an application deployed on Deno Deploy. You can find a more comprehensive tutorial that builds a sample application on top of Fauna [here](../tutorials/tutorial-faunadb).

## Get credentials from Fauna

We assume that you've already created a Fauna instance at https://dashboard.fauna.com. To access your Fauna database programmatically, you'll need to generate a credential:

1. Click on the **Security** section inside your particular database and click on **New Key**. ![fauna1](../docs-images/fauna1.png)
2. Select the **Server** role and click on **Save**. Copy the secret. You'll need it for the next step.
## Create a project in Deno Deploy

Next, let's create a project on Deno Deploy and set it up with the requisite environment variables:

1. Go to [https://dash.deno.com/new](https://dash.deno.com/new) (Sign in with GitHub if you didn't already) and click on **+ Empty Project** under **Deploy from the command line**.
2. Now click on the **Settings** button available on the project page.
3. Navigate to the **Environment Variables** section and add the following secrets.

- `FAUNA_SECRET` - The value should be the secret we created in the previous step.

![fauna2](../docs-images/fauna2.png)

## Write code that connects to Fauna

While with Node there is a Fauna JavaScript driver, with Deno, you should use the GraphQL endpoint. Fauna has a GraphQL endpoint for its database, and it generates essential mutations like `create`, `update`, and `delete` for a data type defined in the schema. For example, Fauna will generate a mutation named `createQuote` to create a new quote in the database for the data type `Quote`.

To interact with Fauna, we need to make a POST request to its GraphQL endpoint with an appropriate query and parameters to get the data in return. So let's construct a generic function that will handle those things.

```javascript
async function queryFauna(query, variables) {
  // Grab the secret from the environment.
  const token = Deno.env.get("FAUNA_SECRET");
  if (!token) {
    throw new Error("environment variable FAUNA_SECRET not set");
  }

  try {
    // Make a POST request to Fauna's GraphQL endpoint with the query and its
    // variables as the body. Adjust the endpoint if you are using Region
    // Groups.
    const res = await fetch("https://graphql.fauna.com/graphql", {
      method: "POST",
      headers: {
        authorization: `Bearer ${token}`,
        "content-type": "application/json",
      },
      body: JSON.stringify({ query, variables }),
    });

    const { data, errors } = await res.json();
    if (errors) {
      // Return the first error if there are any.
      return { data, error: errors[0] };
    }

    return { data };
  } catch (error) {
    return { error };
  }
}
```

## Deploy application to Deno Deploy

Once you have finished writing your application, you can deploy it on Deno Deploy. To do this, go back to your project page at `https://dash.deno.com/projects/`. You should see a couple of options to deploy:

- [GitHub integration](ci_github)
- [`deployctl`](./deployctl.md)

```sh
deployctl deploy --project=
```

Unless you want to add a build step, we recommend that you select the GitHub integration.

For more details on the different ways to deploy on Deno Deploy and the different configuration options, read [here](how-to-deploy).

---

# Connect to Firebase

URL: https://docs.deno.com/deploy/manual/firebase

Firebase is a platform developed by Google for creating mobile and web applications. Its features include authentication primitives for log in and a NoSQL datastore, Firestore, that you can persist data to.

This tutorial covers how to connect to Firebase from an application deployed on Deno Deploy. You can find a more comprehensive tutorial that builds a sample application on top of Firebase [here](../tutorials/tutorial-firebase).

## Get credentials from Firebase

> This tutorial assumes that you've already created a project in Firebase and
> added a web application to your project.

1. Navigate to your project in Firebase and click on **Project Settings**.
2. Scroll down until you see a card with your app name, and a code sample that includes a `firebaseConfig` object. It should look something like the below. Keep this handy.
We will use it later: ```js var firebaseConfig = { apiKey: "APIKEY", authDomain: "example-12345.firebaseapp.com", projectId: "example-12345", storageBucket: "example-12345.appspot.com", messagingSenderId: "1234567890", appId: "APPID", }; ``` ## Create a Project in Deno Deploy 1. Go to [https://dash.deno.com/new](https://dash.deno.com/new) (Sign in with GitHub if you didn't already) and click on **+ Empty Project** under **Deploy from the command line**. 2. Now click on the **Settings** button available on the project page. 3. Navigate to the **Environment Variables** section and add the following:
- `FIREBASE_USERNAME` - The Firebase user (email address) that was added above.
- `FIREBASE_PASSWORD` - The Firebase user password that was added above.
- `FIREBASE_CONFIG` - The configuration of the Firebase application as a JSON string.
The configuration needs to be a valid JSON string to be readable by the application. If the code snippet given when setting up looked like this:

```js
var firebaseConfig = {
  apiKey: "APIKEY",
  authDomain: "example-12345.firebaseapp.com",
  projectId: "example-12345",
  storageBucket: "example-12345.appspot.com",
  messagingSenderId: "1234567890",
  appId: "APPID",
};
```

You would need to set the value of the string to this (noting that spacing and new lines are not required):

```json
{
  "apiKey": "APIKEY",
  "authDomain": "example-12345.firebaseapp.com",
  "projectId": "example-12345",
  "storageBucket": "example-12345.appspot.com",
  "messagingSenderId": "1234567890",
  "appId": "APPID"
}
```

## Write code that connects to Firebase

The first thing we will do is import the `XMLHttpRequest` polyfill that Firebase needs to work under Deploy, as well as a polyfill for `localStorage` to allow the Firebase auth to persist logged in users:

```js
import "https://deno.land/x/xhr@0.1.1/mod.ts";
import { installGlobals } from "https://deno.land/x/virtualstorage@0.1.0/mod.ts";
installGlobals();
```

> ℹ️ we are using the current version of packages at the time of the writing of
> this tutorial. They may not be up-to-date and you may want to double check
> current versions.

Because Deploy has a lot of the web standard APIs, it is best to use the web libraries for Firebase under Deploy. We will use the `compat` builds, which preserve the familiar v8-style namespaced API:

```js
import firebase from "https://esm.sh/firebase@9.17.0/compat/app";
import "https://esm.sh/firebase@9.17.0/compat/auth";
import "https://esm.sh/firebase@9.17.0/compat/firestore";
```

Now we need to set up our Firebase application.
We will be getting the configuration from the environment variables we set up previously and get references to the parts of Firebase we are going to use:

```js
const firebaseConfig = JSON.parse(Deno.env.get("FIREBASE_CONFIG"));
const firebaseApp = firebase.initializeApp(firebaseConfig, "example");
const auth = firebase.auth(firebaseApp);
const db = firebase.firestore(firebaseApp);
```

Ok, we are almost done. We just need to create our middleware application and add the `localStorage` middleware we imported. We will use [oak](https://deno.land/x/oak) as the middleware framework, and keep a map of signed-in users:

```js
import { Application } from "https://deno.land/x/oak@v7.7.0/mod.ts";
import { virtualStorage } from "https://deno.land/x/virtualstorage@0.1.0/middleware.ts";

const users = new Map();

const app = new Application();
app.use(virtualStorage());
```

And then we need to add middleware to authenticate the user. In this tutorial we are simply grabbing the username and password from the environment variables we set up, but this could easily be adapted to redirect a user to a sign-in page if they are not logged in:

```js
app.use(async (ctx, next) => {
  const signedInUid = ctx.cookies.get("LOGGED_IN_UID");
  const signedInUser = signedInUid != null ? users.get(signedInUid) : undefined;
  if (!signedInUid || !signedInUser || !auth.currentUser) {
    const creds = await auth.signInWithEmailAndPassword(
      Deno.env.get("FIREBASE_USERNAME"),
      Deno.env.get("FIREBASE_PASSWORD"),
    );
    const { user } = creds;
    if (user) {
      users.set(user.uid, user);
      ctx.cookies.set("LOGGED_IN_UID", user.uid);
    } else if (signedInUser && signedInUid !== auth.currentUser?.uid) {
      await auth.updateCurrentUser(signedInUser);
    }
  }
  return next();
});
```

## Deploy the application to Deno Deploy

Once you have finished writing your application, you can deploy it on Deno Deploy. To do this, go back to your project page at `https://dash.deno.com/projects/`. You should see a couple of options to deploy:

- [GitHub integration](ci_github)
- [`deployctl`](./deployctl.md)

```sh
deployctl deploy --project=
```

Unless you want to add a build step, we recommend that you select the GitHub integration.
For more details on the different ways to deploy on Deno Deploy and the different configuration options, read [here](how-to-deploy).

---

# Fulfillment Policy

URL: https://docs.deno.com/deploy/manual/fulfillment-policy

## Refund Policy

At Deno Deploy, we strive to provide exceptional service. If you are not satisfied with our service, you may request a refund under the following conditions:

- A refund must be requested within 14 days of the initial purchase or upgrade of any subscription plan.
- Refunds may be considered if the service fails to function correctly and the issue cannot be resolved by our support team within a reasonable time frame.
- No refunds will be issued for services used in violation of our terms of service or for problems clearly attributable to user error or external platform changes.
- Recurring subscriptions may be canceled but are only eligible for a refund for the initial billing cycle if requested within the 14-day period.

## Cancellation Policy

You can cancel your Deno Deploy subscription at any time under the following terms:

- Subscription cancellations are effective immediately, and the service will continue to run until the end of the current billing period.
- To cancel your subscription, please navigate to your account settings on the Deno Deploy dashboard and select 'Cancel Subscription'.
- Once the subscription is canceled, no further charges will be incurred, but you are responsible for any charges accrued before the effective date of cancellation.

## Contact Us

For more information about our fulfillment policies, or if you require assistance, please contact our support team at [deploy@deno.com](mailto:deploy@deno.com).

---

# Deploy with GitHub integration

URL: https://docs.deno.com/deploy/manual/how-to-deploy

The simplest way to deploy more complex projects is via our GitHub integration. This allows you to link a Deno Deploy project to a GitHub repository. Every time you push to the repository, your changes will be automatically deployed.
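For projects that need a build step before deployment, a GitHub Actions workflow using the official `denoland/deployctl` action can handle it. The sketch below assumes a hypothetical `deno task build` command and placeholder project name; adapt both to your project:

```yml
name: Deploy
on: [push]

jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      id-token: write # needed to authenticate with Deno Deploy
      contents: read
    steps:
      - uses: actions/checkout@v4
      - uses: denoland/setup-deno@v1

      # Hypothetical build step; replace with your project's build command.
      - run: deno task build

      - name: Deploy to Deno Deploy
        uses: denoland/deployctl@v1
        with:
          project: my-project # placeholder project name
          entrypoint: main.ts
```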
Via the GitHub integration, you can add a GitHub Action that defines a build step in your deployment process. See [the GitHub integration page](ci_github) for more details.

### Deploy from command line with [`deployctl`](./deployctl.md)

`deployctl` is a command line tool for deploying your code to Deno Deploy. You can control more details of your deployment than with the automatic GitHub integration above by using `deployctl`. See [the `deployctl` page](./deployctl.md) for more details.

### Deploy with playground

The easiest way to deploy some code is via a Deno Deploy playground. See the [playground page](playgrounds) for more details.

---

# Deploy Quick Start

URL: https://docs.deno.com/deploy/manual/

Deno Deploy is a globally distributed platform for serverless JavaScript applications. Your JavaScript, TypeScript, and WebAssembly code runs on managed servers geographically close to your users, enabling low latency and faster response times.

Deploy applications run on fast, light-weight [V8 isolates](https://deno.com/blog/anatomy-isolate-cloud) rather than virtual machines, powered by the [Deno runtime](/runtime/manual).

Let's deploy your first application - it should only take a few minutes.

## Install Deno and `deployctl`

If you haven't already, you can [install the Deno runtime](/runtime/getting_started/installation) using one of the commands below.

On macOS and Linux:

```sh
curl -fsSL https://deno.land/install.sh | sh
```

On Windows (PowerShell):

```powershell
irm https://deno.land/install.ps1 | iex
```

After Deno is installed, install the [`deployctl`](./deployctl.md) utility:

```
deno install -A jsr:@deno/deployctl --global
```

You can confirm `deployctl` has been installed correctly by running:

```console
deployctl --help
```

Now, you're ready to deploy a Deno script from the command line!
## Write and test a Deno program First, create a directory for the project and create a file called `main.ts` in it, with the following "Hello World" web server: ```ts title="main.ts" Deno.serve(() => new Response("Hello, world!")); ``` You can test that it works by running it with the command below: ``` deno run --allow-net main.ts ``` Your server should be viewable at [localhost:8000](http://localhost:8000). Now let's run this code on the edge with Deno Deploy! ## Deploy your project From the directory of the `main.ts` file you just created, run this command: ```sh deployctl deploy ``` You will be asked to authorize Deno Deploy in GitHub to sign up to Deno Deploy and/or to provision an access token for `deployctl`. A few moments after that, your Hello World server will be deployed in Deno Deploy infrastructure all around the world, ready to handle all the traffic you expect. ## Next Steps Now that you've created your first deployment, you can [learn what kinds of apps](./use-cases.md) you can run on Deno Deploy, check out [what else you can do with deployctl](./deployctl.md), or keep reading to find out what other options you have to deploy your code to Deno Deploy. We're so excited to see what you'll ship with Deno Deploy! ### Deploy your existing project Import a project and run it on the edge with Deno Deploy. 1. [From the Deno Deploy dashboard](https://dash.deno.com) click the "New Project" button. 2. Connect to your GitHub account and select the repository you would like to deploy. 3. Follow the on-screen instructions to deploy your existing application. If your project requires a build step, use the Project Configuration form to create a GitHub action to deploy your project. Give your project a name and select from the optional framework presets. If you are not using a framework, you can set up your build settings using the form. 4. 
Confirm that your build options are correct and click the "Deploy Project" button to kick off your new GitHub Action and deploy your project. In a few moments, your project will be deployed across ~12 data centers around the world, ready to handle large volumes of traffic. Once your deployment is successful, you can visit your newly deployed project at the URL provided on the success page or manage it in your dashboard.

### Start with a playground

A [playground](./playgrounds.md) is a browser-based editor that enables you to write and run JavaScript or TypeScript code right away. This is a great choice for just kicking the tires on Deno and Deno Deploy!

From the [Deno Deploy dashboard](https://dash.deno.com), click the "New Playground" button to create a playground.

We also have a variety of ready-built tutorials for you to try out Deno Deploy. Try them out by clicking on "Learning Playground" or visiting:\
[Simple HTTP server playground](https://dash.deno.com/tutorial/tutorial-http)\
[Using the Deno KV database playground](https://dash.deno.com/tutorial/tutorial-http-kv)\
[RESTful API server playground](https://dash.deno.com/tutorial/tutorial-restful)\
[Realtime app with WebSockets playground](https://dash.deno.com/tutorial/tutorial-websocket)\
[Recurring tasks with Deno.cron playground](https://dash.deno.com/tutorial/tutorial-cron)

---

# Application logging

URL: https://docs.deno.com/deploy/manual/logs

Applications can generate logs at runtime using the console API, with methods such as `console.log`, `console.error`, etc. These logs can be viewed in real time by either:

- Navigating to the `Logs` panel of a project or deployment.
- Using the `logs` subcommand in [deployctl](https://docs.deno.com/deploy/manual/deployctl).

Logs will be streamed directly from the application to the log panel or displayed in `deployctl logs`. In addition to real-time logs, logs are also retained for a certain duration, which depends on the subscription plan you are on.
To view persisted logs, you can:

- If you are using the log panel in your browser, switch from `Live` to either `Recent` or `Custom` in the dropdown menu next to the search box.
- If you prefer the command line, add `--since=` and/or `--until=` to your `deployctl logs` command. For more details, consult `deployctl logs --help`.

Logs older than the retention period are automatically deleted from the system.

## Limits

There are limits on both the size of a log message and the volume of logs produced in a certain amount of time. Log messages have a maximum size of 2KB. Messages larger than this limit are trimmed to 2KB. A deployment is allowed to produce up to 1000 log entries per second. If this rate is exceeded, we may terminate the deployment.

---

# Reverse proxy middleware

URL: https://docs.deno.com/deploy/manual/middleware

This quickstart will cover how to deploy a small piece of middleware that reverse proxies another server (in this case example.com). For additional examples of common middleware functions, see the [example gallery](../tutorials/index.md).

## **Step 1:** Create a new playground project on Deno Deploy

Navigate to https://dash.deno.com/projects and click on the "New Playground" button.

## **Step 2:** Deploy middleware code via playground

On the next page, copy and paste the code below into the editor. It is an HTTP server that proxies all requests to https://example.com.

```ts
async function reqHandler(req: Request) {
  const reqPath = new URL(req.url).pathname;
  return await fetch("https://example.com" + reqPath, { headers: req.headers });
}

Deno.serve(reqHandler);
```

Click **Save and Deploy**. You should see something like this:

![image](../docs-images/proxy_to_example.png)

---

# Connect to Neon Postgres

URL: https://docs.deno.com/deploy/manual/neon-postgres

This tutorial covers how to connect to a Neon Postgres database from an application deployed on Deno Deploy.
You can find a more comprehensive tutorial that builds a sample application on top of Postgres [here](../tutorials/tutorial-postgres).

## Setup Postgres

To get started, we need to create a new Postgres instance for us to connect to. For this tutorial, we will be using [Neon Postgres](https://neon.tech/), as they provide free, managed Postgres instances. If you'd like to host your database somewhere else, you can do that too.

1. Visit https://neon.tech/ and click **Sign up** to sign up with an email, GitHub, Google, or partner account. After signing up, you are directed to the Neon Console to create your first project.
2. Enter a name for your project, select a Postgres version, provide a database name, and select a region. Generally, you'll want to select the region closest to your application. When you're finished, click **Create project**.
3. You are presented with the connection string for your new project, which you can use to connect to your database. Save the connection string, which looks something like this:

   ```sh
   postgres://alex:AbC123dEf@ep-cool-darkness-123456.us-east-2.aws.neon.tech/dbname?sslmode=require
   ```

   You will need the connection string in the next step.

## Create a project in Deno Deploy

Next, let's create a project in Deno Deploy and set it up with the requisite environment variables:

1. Go to [https://dash.deno.com/new](https://dash.deno.com/new) (Sign in with GitHub if you didn't already) and click on **Create an empty project** under **Deploy your own code**.
2. Now click on the **Settings** button available on the project page.
3. Navigate to the **Environment Variables** section and add the following secret.

- `DATABASE_URL` - The value should be set to the connection string you saved in the last step.
![postgres_env_variable](../docs-images/neon_postgres_env_variable.png)

## Write code that connects to Postgres

To read/write to Postgres using the [Neon serverless driver](https://deno.com/blog/neon-on-jsr), first install it using the `deno add` command:

```sh
deno add jsr:@neon/serverless
```

This will create or update your `deno.json` file with the dependency:

```json
{
  "imports": {
    "@neon/serverless": "jsr:@neon/serverless@^0.10.1"
  }
}
```

Now you can use the driver in your code:

```ts
import { neon } from "@neon/serverless";

// Get the connection string from the environment variable "DATABASE_URL"
const databaseUrl = Deno.env.get("DATABASE_URL")!;

// Create a SQL query executor
const sql = neon(databaseUrl);

try {
  // Create the table
  await sql`
    CREATE TABLE IF NOT EXISTS todos (
      id SERIAL PRIMARY KEY,
      title TEXT NOT NULL
    )
  `;
} catch (error) {
  console.error(error);
}
```

## Deploy application to Deno Deploy

Once you have finished writing your application, you can deploy it on Deno Deploy. To do this, go back to your project page at `https://dash.deno.com/projects/`. You should see a couple of options to deploy:

- [GitHub integration](ci_github)
- [`deployctl`](./deployctl.md)

  ```sh
  deployctl deploy --project=
  ```

Unless you want to add a build step, we recommend that you select the GitHub integration. For more details on the different ways to deploy on Deno Deploy and the different configuration options, read [here](how-to-deploy).

---

# Organizations

URL: https://docs.deno.com/deploy/manual/organizations

**Organizations** allow you to collaborate with other users. A project created in an organization is accessible to all members of the organization. Users should first sign up for Deno Deploy before they can be added to an organization. Currently, all organization members have full access to the organization. They can add/remove members, and create/delete/modify all projects in the organization.

### Create an organization

1.
On your Deploy dashboard, click on the organization dropdown in the top left of the screen, in the navigation bar. ![organizations](../docs-images/organizations.png)
2. Select **Organization +**.
3. Enter a name for your organization and click on **Create**.

### Add members

1. Select the desired organization in the organization dropdown in the top left of the screen, in the navigation bar.
2. Click on the **Members** icon button.
3. Under the **Members** panel, click on **+ Invite member**.

   > **Note:** Users should first sign up for Deno Deploy using
   > [this link](https://dash.deno.com/signin) before you invite them.

4. Enter the GitHub username of the user and click on **Invite**.

Deploy will send the user an invite email. They can then either accept or decline your invite. Once they accept the invite, they're added to your organization and shown in the members panel. Pending invites are displayed in the **Invites** panel. You can revoke pending invites by clicking on the delete icon next to the pending invite.

### Remove members

1. Select the desired organization in the organization dropdown in the top left of the screen, in the navigation bar.
2. Click on the **Members** icon button.
3. In the **Members** panel, click on the delete button beside the user you want to remove.

---

# Playgrounds

URL: https://docs.deno.com/deploy/manual/playgrounds

**Playgrounds** are an easy way to play around with Deno Deploy, and to create small projects. Using playgrounds you can write code, run it, and see the output fully inside the browser. Playgrounds have the full power of Deno Deploy: they support all the same features as a normal project, including environment variables, custom domains, and logs. Playgrounds are also just as performant as all other projects on Deno Deploy: they make full use of our global network to run your code as close to users as possible.
- [Creating a playground](#creating-a-playground) - [Using the playground editor](#using-the-playground-editor) - [Making a playground public](#making-a-playground-public) - [Exporting a playground to GitHub](#exporting-a-playground-to-github) ## Creating a playground To create a new playground press the **New Playground** button in the top right corner of the [project overview page](https://dash.deno.com/projects). This will create a new playground with a randomly generated name. You can change this name in the project settings later. ## Using the playground editor The playground editor is opened automatically when you create a new playground. You can also open it by navigating to your project's overview page and clicking the **Edit** button. The editor consists of two main areas: the editor on the left, and the preview panel on the right. The editor is where you write your code, and the preview panel is where you can see the output of your code through a browser window. There is also a logs panel underneath the editor panel on the left side. This panel shows the console output of your code, and is useful for debugging your code. After editing your code, you need to save and deploy it so the preview on the right updates. You can do this by clicking the **Save & Deploy** button in the top right, by pressing Ctrl + S, or opening the command palette with F1 and selecting **Deploy: Save & Deploy**. In the tool bar in the top right of the editor you can see the current deployment status of your project while saving. The preview panel on the right will refresh automatically every time you save and deploy your code. The language dropdown in the top right of the editor allows you to switch between JavaScript, JSX, TypeScript, and TSX. The default selected language is TSX which will work for most cases. ## Making a playground public Playgrounds can be shared with other users by making them public. This means that anyone can view the playground and its preview. 
Public playgrounds cannot be edited by other users: they can still only be edited by you. Logs are also only shown to you. Users have the option to fork a public playground to make a private copy of it that they can edit.

To make a playground public, press the **Share** button in the top toolbar in the editor. The URL to your playground will be copied to your clipboard automatically. You can also change the playground visibility from the playground settings page in the Deno Deploy dashboard. This can be used to change the visibility of a playground from public to private again.

## Exporting a playground to GitHub

Playgrounds can be exported to GitHub. This is useful if your project is starting to outgrow the single file limit of the playground editor. Doing this will create a new GitHub repository containing the playground code. This project will be automatically turned into a Git project that is linked to this new GitHub repository. Environment variables and domains will be retained. The new GitHub repository will be created in your personal account, and will be set to private. You can change these settings later in the GitHub repository settings. After exporting a playground, you can no longer use the Deno Deploy playground editor for this project. This is a one-way operation.

To export the playground, visit the playground settings page in the Deno Deploy dashboard or select **Deploy: Export to GitHub** from the command palette (press F1 in the editor). Here you can enter a name for the new GitHub repository. This name will be used to create the repository on GitHub. The repository must not already exist. Press **Export** to export the playground to GitHub.

---

# Connect to Postgres

URL: https://docs.deno.com/deploy/manual/postgres

This tutorial covers how to connect to a Postgres database from an application deployed on Deno Deploy. You can find a more comprehensive tutorial that builds a sample application on top of Postgres [here](../tutorials/tutorial-postgres).
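The connection string you will gather below follows the standard URL shape, so its individual parts can be inspected with the built-in `URL` API. A quick sketch (the host and credentials are the placeholder values used later in this tutorial):

```typescript
// Inspect the parts of a Postgres connection string with the standard URL API.
// The credentials and host below are placeholders, not real ones.
const url = new URL("postgres://user:password@127.0.0.1:5432/deploy?sslmode=disable");

console.log(url.username); // "user"
console.log(url.hostname); // "127.0.0.1"
console.log(url.port); // "5432"
console.log(url.pathname.slice(1)); // database name: "deploy"
console.log(url.searchParams.get("sslmode")); // "disable"
```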
## Setup Postgres

> This tutorial will focus entirely on connecting to Postgres unencrypted. If
> you would like to use encryption with a custom CA certificate, use the
> documentation [here](https://deno-postgres.com/#/?id=ssltls-connection).

To get started, we need to create a new Postgres instance for us to connect to. For this tutorial, we will be using [Supabase](https://supabase.com) as they provide free, managed Postgres instances. If you'd like to host your database somewhere else, you can do that too.

1. Visit https://app.supabase.io/ and click **New project**.
2. Select a name, password, and region for your database. Make sure to save the password, as you will need it later.
3. Click **Create new project**. Creating the project can take a while, so be patient.

## Gather credentials from Postgres

Once you've set up your Postgres database, gather your connection information from your Postgres instance.

### Supabase

For the Supabase instance above, to get your connection information:

1. Navigate to the **Database** tab on the left.
2. Go to the **Project Settings** >> **Database** and copy the connection string from the **Connection String** >> **URI** field. This is the connection string you will use to connect to your database. Insert the password you saved earlier into this string, and then save the string somewhere - you will need it later.

### psql

If you are using psql, you should generally be able to find your connection information by running:

```psql
test=# \conninfo
```

Your Postgres connection string will take the form:

```sh
postgres://user:password@127.0.0.1:5432/deploy?sslmode=disable
```

## Create a project in Deno Deploy

Next, let's create a project in Deno Deploy and set it up with the requisite environment variables:

1. Go to [https://dash.deno.com/new](https://dash.deno.com/new) (sign in with GitHub if you haven't already) and click on **+ Empty Project** under **Deploy from the command line**.
2.
Now click on the **Settings** button available on the project page.
3. Navigate to the **Environment Variables** section and add the following secrets.
   - `DATABASE_URL` - The value should be your connection string that you retrieved in the last step.

![postgres_env_variable](../docs-images/postgres_env_variable.png)

## Write code that connects to Postgres

To read/write to Postgres, import a suitable Postgres module such as [this one from JSR](https://jsr.io/@bartlomieju/postgres), read the connection string from the environment variables, and create a connection pool.

```ts
import { Pool } from "jsr:@bartlomieju/postgres";

// Get the connection string from the environment variable "DATABASE_URL"
const databaseUrl = Deno.env.get("DATABASE_URL")!;

// Create a database pool with three connections that are lazily established
const pool = new Pool(databaseUrl, 3, true);

// Connect to the database
const connection = await pool.connect();

try {
  // Create the table
  await connection.queryObject`
    CREATE TABLE IF NOT EXISTS todos (
      id SERIAL PRIMARY KEY,
      title TEXT NOT NULL
    )
  `;
} finally {
  // Release the connection back into the pool
  connection.release();
}
```

## Deploy application to Deno Deploy

Once you have finished writing your application, you can deploy it on Deno Deploy. To do this, go back to your project page at `https://dash.deno.com/projects/`. You should see a couple of options to deploy:

- [GitHub integration](ci_github)
- [`deployctl`](./deployctl.md)

  ```sh
  deployctl deploy --project=
  ```

Unless you want to add a build step, we recommend that you select the GitHub integration. For more details on the different ways to deploy on Deno Deploy and the different configuration options, read [here](how-to-deploy).

---

# Pricing and limitations

URL: https://docs.deno.com/deploy/manual/pricing-and-limits

Please see [our pricing page](https://deno.com/deploy/pricing) for the overview of the available features in all plans.
If you have a use case that exceeds any of these limits, [please reach out](mailto:deploy@deno.com).

No uptime guarantees are provided during the initial public beta for Deno Deploy. Access to the service will be controlled by [our acceptable use policy](/deploy/manual/acceptable-use-policy). Any user we deem to be in violation of this policy runs the risk of having their account terminated.

## Maximum size for deployments

When uploading assets to a deployment, the total size of all files within the deployment (source files and static files) **should not exceed 1 gigabyte**.

## Memory allocation

Applications have a maximum memory allocation of 512 MB.

## Upload request limits

We do not set a limit for the number of upload requests your application may handle as long as your application is within [our acceptable use policy](/deploy/manual/acceptable-use-policy).

## TLS proxying

TLS termination is required for outgoing connections to port 443 (the port used for HTTPS). Using [Deno.connect](https://docs.deno.com/api/deno/~/Deno.connect) to connect to these ports is prohibited. If you need to establish a TLS connection to port 443, please use [Deno.connectTls](https://docs.deno.com/api/deno/~/Deno.connectTls) instead. `fetch` is not impacted by this restriction. This restriction is in place because connecting to port 443 without terminating TLS is frequently used in TLS-over-TLS proxies, which are prohibited on Deno Deploy as per [our acceptable use policy](/deploy/manual/acceptable-use-policy).

---

# Privacy Policy

URL: https://docs.deno.com/deploy/manual/privacy-policy

**DENO PRIVACY POLICY**

09 September 2024

Deno Land Inc. (“Deno,” “we,” “us,” or “our”) collects and uses personal information in order to provide its products and services to you. This Privacy Policy (the “Policy”) describes the personal information we collect, the purposes for which we use it, the parties with whom we may share it, and your choices with respect to such information.
For purposes of this Privacy Policy, “personal information” means any information that relates to you as an individual and could reasonably be used to identify you. This Privacy Policy applies to our collection and use of personal information through (i) our website at [https://deno.com](https://deno.com) (the “Site”); (ii) any websites, applications or other digital properties that link to this Privacy Policy; and (iii) the products and services (the “Deno Offerings”) we offer to you on our proprietary platform (the “Platform”) via the following websites:

- Deno Deploy ([https://deno.com/deploy](https://deno.com/deploy)); and
- Deno Subhosting ([https://deno.com/subhosting](https://deno.com/subhosting)).

By accessing or using the Site or any other digital property that links to this Privacy Policy, you may learn about Deno and our technology platform, and registered customers may also access the Deno Offerings (collectively, the “Services”). To the extent permitted by applicable law, your use of Deno’s products and services constitutes your acknowledgment and/or consent to the practices described in this Policy. This Privacy Policy incorporates [Deno’s Terms and Conditions](https://docs.deno.com/deploy/manual/terms-and-conditions/) (the “Terms”). Capitalized terms that are not defined in the Privacy Policy have the meaning given to them in the Terms.

**I. The Information We Collect, And How We Collect It**

We collect the following categories of information, which may include personal information (collectively, the “**Information**”).

**1\. Information You Provide To Us**

We collect information from and about you directly when you provide it to us. This information may be collected when you contact us, fill out a form, create an account, subscribe to our blog, access or participate on our Sites, respond to surveys, or otherwise interact with us. This information may include:

_Contact Information._
We collect your contact information when you voluntarily provide it to us. For example, you may disclose contact information to us via the “Contact” link on our Sites, submit information by mail, telephone, in person or electronically, when signing up for our newsletters and other marketing communications, or when you register to attend an event or program. Contact Information typically includes first name, last name, e-mail address, postal address, organization, telephone number and other information that identifies you or can be used to identify or contact you.

_Account Credentials_. When you register to create an account with us, we will collect certain additional personal information, including your name, email address, and potentially other information such as your GitHub username and public GitHub profile.

In addition to Contact Information and Account Credentials, we may collect other kinds of information, such as:

- Comments, questions, and requests you may make;
- Information about your preferences, such as your preferred methods of communication and the types of information in which you are interested;
- Event and service-related information (such as information required for registration, access to premises or online resources, dietary restrictions, and areas of interest);
- Audio and visual information, such as photographs, video and voice recordings (e.g., from events you attended with us), or security camera recordings if you visit our premises;
- Details of downloads from our Sites;
- Records and copies of your correspondence (including email addresses and phone numbers), if you contact us; and
- Any other information you voluntarily provide.

**2\. Information Obtained From Third Parties**

We may receive certain information about you from other sources, including publicly available sources (such as public records and social media platforms), as well as our service providers and marketing partners.
When we collect personal information from users and visitors of other sites on which you have interacted with us, we will do so in accordance with the terms of use and privacy policies of those sites and applicable law. We may also receive personal information when you comment on our social media advertisements, post comments about us, or tag us in a public-facing social media post. Personal information may also be collected by the third-party social media sites that host our social media pages. These sites may provide aggregate information and analysis to us about visitors’ use of our social media pages. This allows us to better understand and analyze our user growth, general demographic information about the users of these pages, and interaction with the content that we post. Overall, this information may be used to help us understand the types of visitors and users of our social media pages and use of the content. This Privacy Policy does not cover personal information collected by such third-party social media sites. For more information on their privacy and security practices please review the privacy policies and terms of use on their respective websites.

**3\. Information Collected Automatically**

We and our service providers may automatically obtain certain information about you, your electronic device, and your interactions with us, including the following:

- _Device data_. We may collect data such as the type of device and its operating system and settings, browser type, mobile device carrier, country, IP address, and unique identifiers.
- _Internet and other electronic activity data_. This includes information about your interaction with our Sites, emails, and other online content.
- _Tracking Data_.
We may collect tracking data using first and third-party cookies, pixels, web server logs, web beacons, and similar data collection and tracking technologies on the Sites, third party websites, apps and online services, and across your devices (such as IP address, browser type, ISP, platform type, device type). Third parties such as advertising networks and analytics providers may also collect information about your online activities over time and across different websites and devices when you access or use the Sites.

**II. How We Use And Share Your Information**

Deno uses the Information for the purpose for which it was collected and in a manner that is consistent with this Privacy Policy. These functions include operation, maintenance and improvements to the Sites, providing our products and services, solicitation of your feedback, gaining a better understanding of our customers and visitors of our Sites, responding to your requests and questions, hosting events, and informing you about our organization, products, services, events, and other areas of interest.

_Analytics Services_. We may use third-party web analytics services, such as Google Analytics, to help us understand and analyze how Site visitors use our services. For more information on how Google Analytics uses data collected through our Sites, visit [www.google.com/policies/privacy/partners](http://www.google.com/policies/privacy/partners).

_Aggregated Data_. We may analyze your personal information in aggregate form which does not identify you personally (“**Aggregated Data**”). The Aggregated Data may be used to operate, maintain, manage, and improve the Sites, shared with our affiliates, agents, and business partners, and otherwise used and disclosed for lawful business purposes. We do not re-identify de-identified or aggregated information.

_Service Providers/Vendors_. Like many businesses, we hire other companies to perform certain business-related services.
We may disclose personal information to certain types of third party companies but only to the extent needed to enable them to provide such services, for example web hosting, disaster recovery, client survey and marketing, and data storage. _Reorganization_. If, in the future, Deno undergoes a corporate, partnership, or business reorganization, we may transfer the Information, including personal information, to the new or surviving entity.  _Protection of Rights and Compliance_. We may use your Information to protect the rights, privacy or safety of you, us or others; to ensure our compliance with legal and contractual requirements; and to prevent and investigate illegal, unethical, or unauthorized activities (including cyberattacks and identity theft). If Deno intends on using or disclosing your personal information in any manner that is not consistent with this Privacy Policy, you will be informed of such anticipated use prior to or at the time at which the personal information is collected. **III. How We Protect Your Information** We take commercially reasonable steps to protect your personal information from loss, misuse, and unauthorized access, disclosure, alteration, or destruction. Please understand, however, that no security system is impenetrable. We cannot guarantee the security of our databases, nor can we guarantee that the personal information that you supply will not be intercepted while being transmitted to and from us over the Internet. **IV. Data Retention** Deno determines the retention period for all Information based on the purposes for which we collect and/or receive the Information and/or tax, legal and regulatory requirements. In addition to this, we may consider other factors, such as the nature and sensitivity of the data, and whether we can achieve the purpose for which we collected the data through other means. **V. Your Privacy Choices** **1\. 
Your Information** You may request access to, correction of, or deletion of the personal information we maintain about you, and we will endeavor to respond promptly to your request. In order to make such a request, please contact us as indicated below. **2\. Marketing Communications** You may opt-out of marketing-related emails by clicking on the “unsubscribe” link located on the bottom of any marketing email or emailing us at [support@deno.com](mailto:support@deno.com). We will use commercially reasonable efforts to process such requests in a timely manner. Please note that even if you opt-out of marketing-related emails, you will continue to receive service-related and other non-marketing emails. **3\. Tracking Technology** You can choose not to permit tracking technologies, such as cookies and web beacons, when you use our services, but blocking some types of these tracking technologies may interfere with your experience. _Browser-Based Opt-Outs_. You may be able to disable tracking technologies using your web browser settings. Please review your browser’s instructions or visit [All About Cookies](https://allaboutcookies.org/) for general information. Note that your web browser may have settings that allow you to transmit a “Do Not Track” signal when you use online services. Like many websites, our Sites are not currently designed to respond to “Do Not Track” signals received from browsers. _Self-Regulatory Program Opt-Outs_. Two self-regulatory programs are available to help you control the use of tracking technologies on your browsers — the [Digital Advertising Alliance](https://digitaladvertisingalliance.org/) and the [Network Advertising Initiative](https://thenai.org/). Both programs help to regulate vendors in the digital advertising space. One function of their self-regulatory programs is to give you the ability to opt out of targeted (or interest-based) advertising, including the use of tracking technologies, from their member companies. 
You can visit the Digital Advertising Alliance’s Your Ad Choices website to opt out of targeted advertising for participating vendors. The Network Advertising Initiative similarly assists with opt outs through their Opt Out of Interest-Based Advertising webpage.

_Google Analytics Opt-Out._ To opt out of Google Analytics cookies, visit Google’s [My Ad Center](https://myadcenter.google.com/personalizationoff) and/or download the Google Analytics Opt-Out Browser Add-On.

**VI. Children**

We do not knowingly collect personal information from children under the age of 18 through the Sites. If you are under 18, please do not give us any personal information. We encourage parents and legal guardians to monitor their children’s Internet usage and to help enforce our Privacy Policy by instructing their children never to provide personal information through the Sites without their permission. If you have reason to believe that a child under the age of 18 has provided personal information to us, please contact us at [support@deno.com](mailto:support@deno.com) and we will endeavor to delete that information from our databases.

**VII. External Websites**

The Sites may contain links to third-party websites. These third-party sites may collect information about you if you click on a link. We have no control over the privacy practices or the content of these websites. As such, we are not responsible for the content or the privacy policies of those third-party websites. You should check the applicable third-party privacy policy and terms of use when visiting any other websites.

**VIII. Important Notice To Non-U.S. Residents**

The Sites are hosted in and provided from the United States and other countries. If you are located outside of the United States, please be aware that any information you provide to us may be transferred to the United States or other countries where the privacy laws may not be as protective as those in your country of origin.
If you are located outside the United States and choose to use the Sites, you consent to any transfer and processing of your personal information in accordance with this Privacy Policy, and you do so at your own risk. **IX. Notice To California Residents** Pursuant to Section 1798.83 of the California Civil Code, residents of California have the right to obtain certain information about the types of personal information that companies with whom they have an established business relationship (and that are not otherwise exempt) have shared with third parties for direct marketing purposes during the preceding calendar year, including the names and addresses of those third parties, and examples of the types of services or products marketed by those third parties. In order to submit such a request, please contact us using the contact information provided at the end of this document. Please note, however, that we do not share, nor have we shared in the past, personal information with third parties for direct marketing purposes. **X. Notice To Nevada Residents** If you are a resident of Nevada, you have the right to opt-out of the sale of personal information to third parties. You can exercise this right by contacting us at [support@deno.com](mailto:support@deno.com) with the subject line “Nevada Do Not Sell Request” and providing us with your name and the email address. Please note, however, that we do not sell any personal information to third parties. **XI. Changes To This Privacy Policy** This Privacy Policy is effective as of the date stated at the top of this Privacy Policy. We may change this Privacy Policy from time to time. Any such changes will be posted on the Sites. By accessing the Sites after we make any such changes to this Privacy Policy, you are deemed to have accepted such changes. Please be aware that, to the extent permitted by applicable law, our use of the Information is governed by the Privacy Policy in effect at the time we collect the Information. 
Please refer back to this Privacy Policy on a regular basis.

**XII. How To Contact Us**

Please reach out to [support@deno.com](mailto:support@deno.com) for any questions, complaints, or requests regarding this Privacy Policy, and include in the subject line “Privacy Policy”, or contact us by mail at:

Deno Land Inc.\
1111 6th Ave Ste 550\
PMB 702973\
San Diego CA, 92101\
USA

**© 2024 Deno Land Inc. All rights reserved.**

---

# Regions

URL: https://docs.deno.com/deploy/manual/regions

Deno Deploy deploys your code throughout the world. Each new request is served from the closest region to your user. Deploy is presently located in the following regions:

- Singapore (`asia-southeast1`)
- London (`europe-west2`)
- Frankfurt (`europe-west3`)
- São Paulo (`southamerica-east1`)
- North Virginia (`us-east4`)
- California (`us-west2`)

This list will be maintained to reflect the latest summary of our regions. Code is deployed to all regions and is served from the region closest to the end user to minimize latency. It is not currently possible to restrict the regions in which your code is deployed.

---

# Local development

URL: https://docs.deno.com/deploy/manual/running-scripts-locally

For local development you can use the `deno` CLI. To install `deno`, follow the instructions in the [Deno manual](https://deno.land/manual/getting_started/installation).

After installation, you can run your scripts locally:

```shell
$ deno run --allow-net=:8000 https://deno.com/examples/hello.js
Listening on http://localhost:8000
```

To watch for file changes add the `--watch` flag:

```shell
$ deno run --allow-net=:8000 --watch ./main.js
Listening on http://localhost:8000
```

For more information about the Deno CLI, and how to configure your development environment and IDE, visit the Deno Manual's [Getting Started][manual-gs] section.
[manual-gs]: https://deno.land/manual/getting_started

---

# Security and responsible disclosure

URL: https://docs.deno.com/deploy/manual/security

We consider the security of our systems, and all data controlled by those systems, a top priority. No matter how much effort we put into system security, it is still possible that security vulnerabilities are present. We appreciate investigative work into system security carried out by well-intentioned, ethical security researchers. If you discover a vulnerability, however small, we would like to know about it so we can address it with appropriate measures, as quickly as possible. This page outlines the method we use to work with the security research community to address our system security.

## Reporting a vulnerability

Please email your findings to security@deno.com. We strive to resolve all problems as quickly as possible, and are more than happy to play an active role in publication of writeups after the problem is resolved.

## Please do the following:

- Do not take advantage of the vulnerability or problem you have discovered. For example, only download data that is necessary to demonstrate the vulnerability - do not download any more. Also do not delete, modify, or view other people's data.
- Do not publish or reveal the problem until it has been resolved.
- Do not use attacks on physical security, social engineering, distributed denial of service, spam or applications of third parties.
- Do provide sufficient information to reproduce the problem, so we will be able to resolve it as quickly as possible. Usually, the IP address or the URL of the affected system and a description of the vulnerability will be sufficient, but complex vulnerabilities may require further explanation.

## Our commitment

- If you act in accordance with this policy, we will not take legal action against you in regard to your report.
- We will handle your report with strict confidentiality, and not pass on your personal details to third parties without your permission. --- # Terms and Conditions URL: https://docs.deno.com/deploy/manual/terms-and-conditions **DENO TERMS AND CONDITIONS** 09 September 2024 These Terms and Conditions (these “Terms”) are a legal agreement between you and Deno Land Inc. (“Deno,” “we,” “us,” or “our”). They specify the terms under which you may access and use (i) our website at [https://deno.com](https://deno.com) (the “Site”); (ii) any websites, applications or other digital properties that link to these Terms; and (iii) the products and services (the “Deno Offerings”) we offer to you on our proprietary platform (the “Platform”) via the following websites: - Deno Deploy ([https://deno.com/deploy](https://deno.com/deploy)); and - Deno Subhosting ([https://deno.com/subhosting](https://deno.com/subhosting)). By accessing or using the Site or any other digital property that links to these Terms, you may learn about Deno and our technology platform, and registered customers may also access the Deno Offerings (collectively, the “Services”). PLEASE READ THESE TERMS CAREFULLY. BY ACCESSING AND/OR USING THE SERVICES, YOU ACKNOWLEDGE THAT YOU HAVE READ, UNDERSTOOD, AND AGREE TO BE LEGALLY BOUND BY THESE TERMS, THE DATA PROCESSING ADDENDUM (THE “DPA”), AND THE TERMS AND CONDITIONS OF OUR PRIVACY POLICY (THE “PRIVACY POLICY”), WHICH ARE HEREBY INCORPORATED INTO THESE TERMS AND MADE A PART HEREOF BY REFERENCE (COLLECTIVELY, THE “AGREEMENT”). IF YOU DO NOT AGREE TO ANY OF THE TERMS IN THIS AGREEMENT, THEN PLEASE DO NOT USE THE SERVICES. If you accept or agree to the Agreement on behalf of a company or other legal entity, you represent and warrant that you have the authority to bind that company or other legal entity to the Agreement and, in such event, “you” and “your” will refer and apply to that company or other legal entity. 
We reserve the right, at our sole discretion, to modify, discontinue, or terminate the availability of any Services, or modify this Agreement, at any time and without prior notice. We encourage you to check these Terms and the “Last Update” date above whenever you access or use the Services. By continuing to access or use the Services after we have posted a modification to these Terms, you are indicating that you agree to be bound by the modified Agreement. If the modified Agreement is not acceptable to you, your only recourse is to cease accessing or using the Services. Deno also offers fee-based products and services (including, from time to time, as free trials), which may offer access to certain data products and/or services (“Paid Products”). We provide access to and use of our Paid Products pursuant to commercial agreements, associated with the applicable Paid Products made available to you at the time of purchase (each, a “Commercial Agreement”). If there is a conflict between these Terms and terms and conditions of the applicable Commercial Agreement associated with the Paid Products you are purchasing, the terms and conditions of the Commercial Agreement will take precedence with respect to the use of or access to such Paid Products. Capitalized terms not defined in these Terms shall have the meaning set forth in our Privacy Policy. **THE SECTIONS BELOW TITLED “BINDING ARBITRATION” AND “CLASS ACTION WAIVER” CONTAIN A BINDING ARBITRATION AGREEMENT AND CLASS ACTION WAIVER. THEY AFFECT YOUR LEGAL RIGHTS. PLEASE READ THEM CAREFULLY.** 1. **DESCRIPTION OF THE SERVICES; RIGHT TO ACCESS AND USE THE SERVICES** **Deno Deploy** and **Deno Subhosting** are globally distributed platforms for serverless JavaScript applications. Your JavaScript, TypeScript, and WebAssembly code runs on managed servers geographically close to your users, enabling low latency and faster response times. 
Deploy and Subhosting applications run on fast, light-weight V8 isolates rather than virtual machines, powered by the Deno runtime. Subject to the terms and conditions of this Agreement, Deno hereby grants you during the term of this Agreement a limited, non-exclusive, non-transferable, non-sublicensable, revocable right, to access and use the Services solely for your internal business purposes. Deno reserves the right to, at any time, and without notice or liability to you: 1. Block and disable any deployments that, for any reason, make the Platform unstable; 2. Change the regions in which the Services run, 3. Change which features are supported by the Services; and 4. Modify or discontinue the availability of any other feature, function, or content relating to the Services. You agree that we will not be liable to you or to any third party for any modification, suspension, or discontinuance of the Services or any part thereof. You are free to stop using the Services at any time. 2. **ACCOUNT CREDENTIALS** In order to use the Deno Offerings, you must be an “Authorized User”. To become an Authorized User, you need to create an account on the Platform, and authenticate via GitHub (collectively, the “Account Credentials”). When creating the account, each Authorized User must provide true, accurate, current, and complete information. Each Account Credential can be used by only one Authorized User. Each Authorized User is responsible for the confidentiality and use of his/her Account Credentials, including all activities that are associated with his/her Account Credentials. Authorized Users must promptly inform us of any need to deactivate any Account Credentials. Deno is under no obligation to accept any individual as Authorized User, and may accept or reject any registration in its sole and complete discretion. 
We have the right to disable any Account Credentials at any time for any reason, including if, in our sole discretion, we believe that you have failed to comply with these Terms. 3. **USE OF PERSONAL INFORMATION** Your use of the Services may involve the transmission to us of certain personal information. Our policies with respect to the collection and use of such personal information are governed according to our Privacy Policy, which is hereby incorporated by reference in its entirety. 4. **INTELLECTUAL PROPERTY** The Services may contain material, such as software, text, graphics, images, sound recordings, audiovisual works, and other material provided by or on behalf of Deno (collectively referred to as the “Content”). The Content may be owned by us or by third parties. The Content is protected under both United States and foreign laws. Unauthorized use of the Content may violate copyright, trademark, and other laws. You have no rights in or to the Content, and you will not use the Content except as permitted under this Agreement. No other use is permitted without prior written consent from us. You must retain all copyright and other proprietary notices contained in the original Content on any copy you make of the Content. You may not sell, transfer, assign, license, sublicense, or modify the Content or reproduce, display, publicly perform, make a derivative version of, distribute, or otherwise use the Content in any way for any public or commercial purpose. The use or posting of the Content on any other website or in a networked computer environment for any purpose is expressly prohibited. If you violate any part of this Agreement, your permission to access and/or use the Content and the Services automatically terminates, and you must immediately destroy any copies you have made of the Content.
The trademarks, service marks, and logos of Deno (the “Deno Trademarks”) used and displayed on the Services are registered and unregistered trademarks or service marks of Deno. Other company, product, and service names located on the Services may be trademarks or service marks owned by others (the “Third-Party Trademarks,” and, collectively with Deno Trademarks, the “Trademarks”). Nothing on the Services should be construed as granting, by implication, estoppel, or otherwise, any license or right to use the Trademarks, without our prior written permission specific for each such use. Use of the Trademarks as part of a link to or from any website is prohibited unless establishment of such a link is approved in advance by us in writing. All goodwill generated from the use of Deno Trademarks inures to our benefit. Elements of the Services are protected by trade dress, trademark, unfair competition, and other state and federal laws and may not be copied or imitated in whole or in part, by any means, including, but not limited to, the use of framing or mirrors. None of the Content may be retransmitted without our express, written consent for each and every instance. 5. **USER DATA; USAGE DATA; AGGREGATE DATA** For purposes of this Agreement, “User Data” means (i) any data and information that we ingest by connecting to Authorized Users’ business systems, including but not limited to event logs; and (ii) any data and information that Authorized Users submit through the Services; and “Usage Data” means anonymous, analytical data that Deno collects concerning the performance and your use of the Services, including, without limitation, date and time that you access the Services, the portions of the Services visited, the frequency and number of times such pages are accessed, the number of times the Services is used in a given time period and other usage and performance data. 
As between the parties, Authorized Users own all right, title, and interest in and to User Data, including all modifications, improvements, adaptations, enhancements, or translations made thereto, and all intellectual property rights therein. Authorized Users hereby grant Deno a non-exclusive, worldwide, fully paid-up, royalty-free right and license, with the right to grant sublicenses, to reproduce, execute, use, store, archive, modify, perform, display and distribute User Data: (i) during the term of this Agreement, in furtherance of Deno’s obligations hereunder; and (ii) for Deno’s internal business purposes, including using such data to analyze, update, and improve the Services and Deno’s analytics capabilities and for benchmarking purposes. Notwithstanding anything to the contrary herein, we may use, and may permit our third-party service providers to access and use, User Data, as well as any Usage Data that we may collect, in an anonymous and aggregated form (“Aggregate Data”) for the purposes of operating, maintaining, managing, and improving our products and services including the Services. Aggregate Data does not identify Authorized Users or any individual. You hereby agree that we may collect, use, publish, disseminate, transfer, and otherwise exploit such Aggregate Data. 6. **FEES** Deno offers and Authorized Users can purchase a monthly or annual subscription for the Services (“Subscription”) for a fee set forth on our website (the “Subscription Fee”). Deno may add new fees and charges, or amend fees and charges, at any time in its sole discretion. Payment for a Subscription is due immediately upon making a purchase for a subscription. By making a purchase, you agree to pay Deno, through our third-party payment processor (“Third-Party Payment Processor”), all charges at the fees then in effect for Subscriptions.
Any information you provide to the Third-Party Payment Processor will be processed by such Third-Party Payment Processor in accordance with its privacy policy and terms of use. YOU MUST PROVIDE CURRENT, COMPLETE, AND ACCURATE INFORMATION FOR YOUR ACCOUNT, AND PROMPTLY UPDATE ALL INFORMATION TO KEEP SUCH ACCOUNT INFORMATION CURRENT, COMPLETE, AND ACCURATE (SUCH AS A CHANGE IN BILLING ADDRESS, CREDIT CARD NUMBER, OR CREDIT CARD EXPIRATION DATE). FURTHER, YOU MUST PROMPTLY NOTIFY US IF A PAYMENT METHOD IS CANCELED (E.G., FOR LOSS OR THEFT) OR IF YOU BECOME AWARE OF A POTENTIAL BREACH OF SECURITY, SUCH AS THE UNAUTHORIZED DISCLOSURE OR USE OF YOUR USERNAME OR PASSWORD. CHANGES TO SUCH INFORMATION CAN BE MADE THROUGH YOUR ACCOUNT. By purchasing a Subscription, you acknowledge that your Subscription has an initial and recurring payment charge at the then-current Subscription rate, and you agree that Deno may submit monthly charges, in advance to your chosen payment method without further authorization from you, until you provide notice that you wish to cancel your Subscription or to change your payment method. You further accept responsibility for all recurring charges prior to cancellation, including, where applicable, any charges processed by Deno after the expiration date of your payment card. You may change or terminate your Subscription by emailing us at [support@deno.com](mailto:support@deno.com). If you terminate your Subscription, you may use your Subscription until the end of the then-current billing cycle, and the Subscription will not be renewed after that period expires. Deno does not refund any pre-paid portion of the Subscription fee. Deno may immediately terminate or suspend your Subscription for any reason or no reason in accordance with these Terms, including for failure to pay the applicable fees when due. 
If we terminate or suspend your Subscription, your right to use any software or content provided in connection with the Subscription is also terminated or suspended (as applicable). From time to time, Deno may offer a free trial of the Services. Deno reserves the right in its sole discretion to stop offering free trials of the Services at any time without any liability to you. 7. **COMMUNITY GUIDELINES** By accessing and/or using the Services, you hereby agree to comply with the following guidelines: - You will not use the Services for any unlawful purpose; - You will not access or use the Services to collect any market research for a competing business; - You will not upload, post, e-mail, transmit, or otherwise make available any content that infringes any copyright, trademark, right of publicity, or other proprietary rights of any person or entity; - You will not impersonate any person or entity or falsely state or otherwise misrepresent your affiliation with a person or entity; - You will not decompile, reverse engineer, disassemble, or otherwise attempt to discern the source code or interface protocols of any software or other products or processes accessible through the Services; - You will not remove or modify any proprietary markings or restrictive legends placed on the Services; - You will not use the Services, or any portion or component thereof in violation of any applicable law, in order to build a competitive product or service, or for any purpose not specifically permitted in these Terms; - You will not cover, obscure, block, or in any way interfere with any advertisements and/or safety features on the Services; - You will not circumvent, remove, alter, deactivate, degrade, or thwart any of the protections in the Services; - You will not introduce, post, or upload to the Services any Harmful Code.
As used herein, “Harmful Code” means computer code, programs, or programming devices that are intentionally designed to disrupt, modify, access, delete, damage, deactivate, disable, harm, or otherwise impede in any manner, including aesthetic disruptions or distortions, the operation of the Services, or any other associated software, firmware, hardware, computer system, or network (including, without limitation, “Trojan horses,” “viruses,” “worms,” “time bombs,” “time locks,” “devices,” “traps,” “access codes,” or “drop dead” or “trap door” devices) or any other harmful, malicious, or hidden procedures, routines or mechanisms that would cause the Services to cease functioning or to damage or corrupt data, storage media, programs, equipment, or communications, or otherwise interfere with the operations of the Services; - You will not take any action that imposes or may impose (in our sole discretion) an unreasonable or disproportionately large load on our technical infrastructure; and - You will not interfere with or attempt to interrupt the proper operation of the Services through the use of any virus, device, information collection or transmission mechanism, software or routine, or access or attempt to gain access to any data, files, or passwords related to the Services through hacking, password or data mining, or any other means. Although we are not obligated to monitor access to or use of the Services, we have the right to do so for the purpose of operating them, to ensure compliance with these Terms, and to comply with applicable law or other legal requirements. We have the right to investigate violations of these Terms or conduct that affects the Services. We may also consult and cooperate with law enforcement authorities to prosecute Users who violate the law. If you find something that violates our User Guidelines, please let us know, and we will review it. 8. 
**LINKING AND CITATION OF CONTENT** Deno does not object to links on third-party Services to our homepage in an appropriate context. However, “framing” or “mirroring” the Services or the Content is prohibited without the prior express written consent of Deno. 9. **RESTRICTIONS** The Services are available only for individuals aged 18 years or older. If you are under 18 years of age, then please do not access and/or use the Services. By entering into this Agreement, you represent and warrant that you are 18 years or older. 10. **FEEDBACK** We welcome and encourage you to provide feedback, comments, and suggestions for improvements to the Services and our services (“Feedback”). Although we encourage you to e-mail us, we do not want you to, and you should not, e-mail us any content that contains confidential information. With respect to any Feedback you provide, we shall be free to use and disclose any ideas, concepts, know-how, techniques, or other materials contained in your Feedback for any purpose whatsoever, including, but not limited to, the development, production and marketing of products and services that incorporate such information, without compensation or attribution to you. 11. **NO WARRANTIES; LIMITATION OF LIABILITY** THE SERVICES AND THE CONTENT ARE PROVIDED ON AN “AS IS” AND “AS AVAILABLE” BASIS, AND NEITHER DENO NOR DENO’S SUPPLIERS MAKE ANY WARRANTIES WITH RESPECT TO THE SAME OR OTHERWISE IN CONNECTION WITH THIS AGREEMENT, AND DENO HEREBY DISCLAIMS ANY AND ALL EXPRESS, IMPLIED, OR STATUTORY WARRANTIES, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AVAILABILITY, ERROR-FREE OR UNINTERRUPTED OPERATION, AND ANY WARRANTIES ARISING FROM A COURSE OF DEALING, COURSE OF PERFORMANCE, OR USAGE OF TRADE. 
TO THE EXTENT THAT DENO AND DENO’S SUPPLIERS MAY NOT AS A MATTER OF APPLICABLE LAW DISCLAIM ANY IMPLIED WARRANTY, THE SCOPE AND DURATION OF SUCH WARRANTY WILL BE THE MINIMUM PERMITTED UNDER SUCH LAW. WITHOUT LIMITING THE FOREGOING, WE DO NOT WARRANT, GUARANTEE OR MAKE ANY REPRESENTATION, NOR SHALL WE BE RESPONSIBLE FOR (A) THE CORRECTNESS, ACCURACY, RELIABILITY, COMPLETENESS OR CURRENCY OF THE SERVICES; OR (B) ANY RESULTS ACHIEVED OR ACTION TAKEN BY YOU IN RELIANCE ON THE SERVICES OR THE CONTENT OR ALERTS PROVIDED THROUGH THE SERVICES. ANY DECISION, ACT OR OMISSION OF YOURS THAT IS BASED ON THE SERVICES OR THE CONTENT OR ALERTS PROVIDED THROUGH THE SERVICES IS AT YOUR OWN AND SOLE RISK. THE SERVICES AND THE CONTENT AND ALERTS PROVIDED THROUGH THE SERVICES IS PROVIDED AS A CONVENIENCE ONLY AND DOES NOT REPLACE THE NEED TO REVIEW ITS ACCURACY, COMPLETENESS AND CORRECTNESS. IN CONNECTION WITH ANY WARRANTY, CONTRACT, OR COMMON LAW TORT CLAIMS: (I) WE SHALL NOT BE LIABLE FOR ANY INCIDENTAL OR CONSEQUENTIAL DAMAGES, LOST PROFITS, OR DAMAGES RESULTING FROM LOST DATA OR BUSINESS INTERRUPTION RESULTING FROM THE USE OR INABILITY TO ACCESS AND USE THE SERVICES, EVEN IF WE HAVE BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES; AND (II) ANY DIRECT DAMAGES THAT YOU MAY SUFFER AS A RESULT OF YOUR USE OF THE SERVICES, SHALL BE LIMITED TO THE GREATER OF (I) MONIES YOU HAVE PAID US IN CONNECTION WITH YOUR USE OF THE SERVICES DURING THE TWELVE (12) MONTHS IMMEDIATELY PRECEDING THE DATE THAT GAVE RISE TO THE CLAIM OR (II) ONE HUNDRED DOLLARS ($100). 12. **EXTERNAL SITES** The Services may contain links to third-party websites (“External Sites”). These links are provided solely as a convenience to you and not as an endorsement by us of the content on such External Sites. The content of such External Sites is developed and provided by others. 
You should contact the website administrator or webmaster for those External Sites if you have any concerns regarding such links or any content located on such External Sites. We are not responsible for the content of any linked External Sites and do not make any representations regarding the content or accuracy of materials on such External Sites. You should take precautions when downloading files from all websites to protect your computer from viruses and other destructive programs. If you decide to access linked External Sites, you do so at your own risk. 13. **REPRESENTATIONS AND WARRANTIES** You represent and warrant that you have: (i) all rights and permissions necessary to provide us with or grant us access to and use of User Data, and (ii) obtained all necessary and appropriate consents, permissions, and authorizations in accordance with all applicable laws and regulations with respect to User Data provided hereunder. 14. **INDEMNIFICATION** You will indemnify, defend, and hold Deno, its affiliates, and our and their respective shareholders, members, officers, directors, employees, agents, and representatives (collectively, “Deno Indemnitees”) harmless from and against any and all damages, liabilities, losses, costs, and expenses, including reasonable attorney’s fees (collectively, “Losses”) incurred by any Deno Indemnitee in connection with a third-party claim, action, or proceeding (each, a “Claim”) arising from your (i) breach of this Agreement, including but not limited to, any breach of your representations and warranties; (ii) misuse of the Services, and/or the Content; (iii) negligence, gross negligence, willful misconduct, fraud, misrepresentation or violation of law; or (iv) violation of any third-party right, including without limitation any copyright, trademark, property, or privacy right; _provided_, _however_, that the foregoing obligations shall be subject to our: (i) promptly notifying you of the Claim; (ii) providing you, at your expense, 
with reasonable cooperation in the defense of the Claim; and (iii) providing you with sole control over the defense and negotiations for a settlement or compromise. 15. **COMPLIANCE WITH APPLICABLE LAWS** The Services are based in the United States. We make no claims concerning whether the Services may be viewed or be appropriate for use outside of the United States. If you access the Services from outside of the United States, you do so at your own risk. Whether inside or outside of the United States, you are solely responsible for ensuring compliance with the laws of your specific jurisdiction. 16. **TERM; TERMINATION** These Terms, and your right to access and use the Services, will commence upon your acceptance of these Terms and will continue for the period of your Subscription and/or use of the Services. We reserve the right, in our sole discretion, to restrict, suspend, or terminate these Terms and your access to all or any part of the Services, at any time and for any reason without prior notice or liability. We reserve the right to change, suspend, or discontinue all or any part of the Services at any time without prior notice or liability. The Sections “Description of the Services; Right to Use and Access the Service;” “Use of Personal Information,” “Intellectual Property,” “Feedback,” “No Warranties; Limitation of Liability,” “Indemnification,” “Compliance with Applicable Laws,” “Term; Termination,” “Binding Arbitration,” “Class Action Waiver,” “Equitable Relief,” “Controlling Law; Exclusive Forum,” and “Miscellaneous” shall survive the termination of these Terms. 17. **BINDING ARBITRATION** In the event of a dispute arising under or relating to this Agreement, and/or the Services (each, a “Dispute”), such dispute will be finally and exclusively resolved by binding arbitration governed by the Federal Arbitration Act (“FAA”). 
NEITHER PARTY SHALL HAVE THE RIGHT TO LITIGATE SUCH CLAIM IN COURT OR TO HAVE A JURY TRIAL, EXCEPT EITHER PARTY MAY BRING ITS CLAIM IN ITS LOCAL SMALL CLAIMS COURT, IF PERMITTED BY THAT SMALL CLAIMS COURT RULES AND IF WITHIN SUCH COURT’S JURISDICTION. ARBITRATION IS DIFFERENT FROM COURT, AND DISCOVERY AND APPEAL RIGHTS MAY ALSO BE LIMITED IN ARBITRATION. All disputes will be resolved before a neutral arbitrator selected jointly by the parties, whose decision will be final, except for a limited right of appeal under the FAA. The arbitration shall be commenced and conducted by JAMS pursuant to its then current Comprehensive Arbitration Rules and Procedures and in accordance with the Expedited Procedures in those rules, or, where appropriate, pursuant to JAMS’ Streamlined Arbitration Rules and Procedures. All applicable JAMS’ rules and procedures are available at the JAMS website [www.jamsadr.com](http://www.jamsadr.com). Each party will be responsible for paying any JAMS filing, administrative, and arbitrator fees in accordance with JAMS rules. Judgment on the arbitrator’s award may be entered in any court having jurisdiction. This clause shall not preclude parties from seeking provisional remedies in aid of arbitration from a court of appropriate jurisdiction. The arbitration may be conducted in person, through the submission of documents, by phone, or online. If conducted in person, the arbitration shall take place in the United States county where you reside. The parties may litigate in court to compel arbitration, to stay a proceeding pending arbitration, or to confirm, modify, vacate, or enter judgment on the award entered by the arbitrator. The parties shall cooperate in good faith in the voluntary and informal exchange of all non-privileged documents and other information (including electronically stored information) relevant to the Dispute immediately after commencement of the arbitration. 
As set forth in Section 18 below, nothing in this Agreement will prevent us from seeking injunctive relief in any court of competent jurisdiction as necessary to protect our proprietary interests. 18. **CLASS ACTION WAIVER** You agree that any arbitration or proceeding shall be limited to the Dispute between us and you individually. To the full extent permitted by law, (i) no arbitration or proceeding shall be joined with any other; (ii) there is no right or authority for any Dispute to be arbitrated or resolved on a class action-basis or to utilize class action procedures; and (iii) there is no right or authority for any Dispute to be brought in a purported representative capacity on behalf of the general public or any other persons. YOU AGREE THAT YOU MAY BRING CLAIMS AGAINST US ONLY IN YOUR INDIVIDUAL CAPACITY AND NOT AS A PLAINTIFF OR CLASS MEMBER IN ANY PURPORTED CLASS OR REPRESENTATIVE PROCEEDING. 19. **EQUITABLE RELIEF** You acknowledge and agree that in the event of a breach or threatened violation of our intellectual property rights and confidential and proprietary information by you, we will suffer irreparable harm and will therefore be entitled to injunctive relief to enforce this Agreement. We may, without waiving any other remedies under this Agreement, seek from any court having jurisdiction any interim, equitable, provisional, or injunctive relief that is necessary to protect our rights and property pending the outcome of the arbitration referenced above. You hereby irrevocably and unconditionally consent to the personal and subject matter jurisdiction of the federal and state courts in the State of New York for purposes of any such action by us. 20. **CONTROLLING LAW; EXCLUSIVE FORUM** The Agreement and any action related thereto will be governed by the laws of the State of New York without regard to its conflict of laws provisions. 
The parties hereby consent and agree to the exclusive jurisdiction of the state and federal courts located in the State of New York for all suits, actions, or proceedings directly or indirectly arising out of or relating to this Agreement, and waive any and all objections to such courts, including but not limited to, objections based on improper venue or inconvenient forum, and each party hereby irrevocably submits to the exclusive jurisdiction of such courts in any suits, actions, or proceedings arising out of or relating to this Agreement. 21. **MISCELLANEOUS** Notwithstanding anything to the contrary set forth in these Terms, each party may, during the term of this Agreement, use the other party’s name and/or logo for marketing and promotional purposes, including, without limitation, identifying Authorized Users as a customer of Deno on Deno’s website or elsewhere. You may not assign any of your rights, duties, or obligations under these Terms to any person or entity, in whole or in part, without written consent from Deno. Our failure to act on or enforce any provision of the Agreement shall not be construed as a waiver of that provision or any other provision in this Agreement. No waiver shall be effective against us unless made in writing, and no such waiver shall be construed as a waiver in any other or subsequent instance. Except as expressly agreed by us and you in writing, the Agreement constitutes the entire agreement between you and us with respect to the subject matter, and supersedes all previous or contemporaneous agreements, whether written or oral, between the parties with respect to the subject matter. The section headings are provided merely for convenience and shall not be given any legal import. This Agreement will inure to the benefit of our successors, assigns, licensees, and sublicensees. **Copyright 2024 Deno Land Inc.
All rights reserved.**

---

# Deno Deploy Use Cases

URL: https://docs.deno.com/deploy/manual/use-cases

Some popular use cases for Deno currently are:

- [Middleware](#middleware)
- [API servers](#api-servers)
- [Full websites](#full-websites)

## Middleware

Middleware refers to bits of code that execute before and after a request reaches the application server. You'll write middleware when you want some JavaScript or other code to run very quickly, early in the request lifecycle. By deploying your middleware code at the edge, Deno Deploy ensures the best performance for your app. Some examples include:

- setting a cookie
- serving different versions of a site depending on geolocation
- path rewriting
- redirecting requests
- dynamically changing the HTML on its way back from the server before it gets to the user

Deno Deploy is a good alternative to other platforms you might be using to host your middleware right now, for example:

- Cloudflare Workers
- AWS Lambda@Edge
- Traditional load balancers like nginx
- Custom rules

## API servers

Deno is also a great fit for API servers. By deploying these servers "at the edge", closer to the clients that use them, Deno Deploy is able to offer lower latency, improved performance, and reduced bandwidth costs compared to traditional hosting platforms like Heroku or even modern centralized hosting services like DigitalOcean.

## Full websites

We foresee a future where you can write your entire website on edge functions. Some examples of sites that are already doing this include:

- [blog](https://github.com/ry/tinyclouds)
- [chat](https://github.com/denoland/showcase_chat)
- [Calendly clone](https://github.com/denoland/meet-me)

---

# Discord Slash Command

URL: https://docs.deno.com/deploy/tutorials/discord-slash

Discord has a feature called **Slash Commands**. They allow you to type `/` followed by a command name to perform some action. For example, you can type `/giphy cats` (a built-in command) to get some cat gifs.
Discord Slash Commands work by making a request to a URL whenever someone issues a command. You don't need your app to be running all the time for Slash Commands to work, which makes Deno Deploy a perfect solution for building them. In this post, let's see how we can build a hello world Slash Command using Deno Deploy.

## **Step 1:** Create an application on the Discord Developer Portal

1. Go to [https://discord.com/developers/applications](https://discord.com/developers/applications) (log in with your Discord account if required).
2. Click the **New Application** button to the left of your profile picture.
3. Name your application and click **Create**.
4. Go to the **Bot** section, click **Add Bot**, and finally **Yes, do it!** to confirm.

That's it. A new application has been created that will hold our Slash Command. Don't close the tab, as we'll need information from this application page throughout development.

## **Step 2:** Register the Slash Command with the Discord app

Before we can write any code, we need to curl a Discord endpoint to register a Slash Command in our app.

Fill `BOT_TOKEN` with the token available in the **Bot** section and `CLIENT_ID` with the ID available in the **General Information** section of the page, then run the command in your terminal.

```sh
BOT_TOKEN='replace_me_with_bot_token'
CLIENT_ID='replace_me_with_client_id'
curl -X POST \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bot $BOT_TOKEN" \
  -d '{"name":"hello","description":"Greet a person","options":[{"name":"name","description":"The name of the person","type":3,"required":true}]}' \
  "https://discord.com/api/v8/applications/$CLIENT_ID/commands"
```

This will register a Slash Command named `hello` that accepts a parameter named `name` of type string.

## **Step 3:** Create and deploy the hello world Slash Command on Deno Deploy

Next, we need to create a server to respond to Discord when it makes a POST request with someone's slash command.

1. Navigate to https://dash.deno.com/new and click **Play** under the **Playground** card.
2. On the next page, in the editor, click the **Settings** icon on the top menu. In the modal that pops up, select **+ Add Variable**.
3. Input `DISCORD_PUBLIC_KEY` as KEY. The VALUE should be the public key available in the **General Information** section of the Discord application page.
4. Copy and paste the following code into the editor:

```ts
// Sift is a small routing library that abstracts away details like starting a
// listener on a port, and provides a simple function (serve) that has an API
// to invoke a function for a specific path.
import {
  json,
  serve,
  validateRequest,
} from "https://deno.land/x/sift@0.6.0/mod.ts";
// TweetNaCl is a cryptography library that we use to verify requests
// from Discord.
import nacl from "https://esm.sh/tweetnacl@v1.0.3?dts";

// For all requests to the "/" endpoint, we want to invoke the home() handler.
serve({
  "/": home,
});

// The main logic of the Discord Slash Command is defined in this function.
async function home(request: Request) {
  // validateRequest() ensures that a request is of POST method and
  // has the following headers.
  const { error } = await validateRequest(request, {
    POST: {
      headers: ["X-Signature-Ed25519", "X-Signature-Timestamp"],
    },
  });
  if (error) {
    return json({ error: error.message }, { status: error.status });
  }

  // verifySignature() verifies if the request is coming from Discord.
  // When the request's signature is not valid, we return a 401 and this is
  // important as Discord sends invalid requests to test our verification.
  const { valid, body } = await verifySignature(request);
  if (!valid) {
    return json(
      { error: "Invalid request" },
      {
        status: 401,
      },
    );
  }

  const { type = 0, data = { options: [] } } = JSON.parse(body);
  // Discord performs Ping interactions to test our application.
  // Type 1 in a request implies a Ping interaction.
  if (type === 1) {
    return json({
      type: 1, // Type 1 in a response is a Pong interaction response type.
    });
  }

  // Type 2 in a request is an ApplicationCommand interaction.
  // It implies that a user has issued a command.
  if (type === 2) {
    const { value } = data.options.find((option) => option.name === "name");
    return json({
      // Type 4 responds with the below message retaining the user's
      // input at the top.
      type: 4,
      data: {
        content: `Hello, ${value}!`,
      },
    });
  }

  // We will return a bad request error as a valid Discord request
  // shouldn't reach here.
  return json({ error: "bad request" }, { status: 400 });
}

/** Verify whether the request is coming from Discord. */
async function verifySignature(
  request: Request,
): Promise<{ valid: boolean; body: string }> {
  const PUBLIC_KEY = Deno.env.get("DISCORD_PUBLIC_KEY")!;
  // Discord sends these headers with every request.
  const signature = request.headers.get("X-Signature-Ed25519")!;
  const timestamp = request.headers.get("X-Signature-Timestamp")!;
  const body = await request.text();
  const valid = nacl.sign.detached.verify(
    new TextEncoder().encode(timestamp + body),
    hexToUint8Array(signature),
    hexToUint8Array(PUBLIC_KEY),
  );

  return { valid, body };
}

/** Converts a hexadecimal string to Uint8Array. */
function hexToUint8Array(hex: string) {
  return new Uint8Array(
    hex.match(/.{1,2}/g)!.map((val) => parseInt(val, 16)),
  );
}
```

5. Click **Save & Deploy** to deploy the server.
6. Note the project URL once the file has been deployed. It will be on the upper right-hand side of the editor and end in `.deno.dev`.

## **Step 4:** Configure the Discord application to use our URL as the interactions endpoint URL

1. Go back to your application (Greeter) page on the Discord Developer Portal.
2. Fill the **INTERACTIONS ENDPOINT URL** field with the Deno Deploy project URL from above and click **Save Changes**.

The application is now ready. Let's proceed to the next section to install it.
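As an aside, the interaction `type` dispatch in the playground code above (1 means Ping, answered with Pong; 2 means ApplicationCommand, answered with a type-4 message) is pure logic that can be unit-tested without a server. Below is a sketch with hypothetical names (`Interaction`, `routeInteraction`), not part of Discord's SDK or the tutorial code:

```ts
// Discord request types used above: 1 = Ping, 2 = ApplicationCommand.
// Response types: 1 = Pong, 4 = channel message retaining the user's input.
interface Interaction {
  type: number;
  data?: { options: { name: string; value: string }[] };
}

interface InteractionResponse {
  type: number;
  data?: { content: string };
}

function routeInteraction(
  interaction: Interaction,
): InteractionResponse | null {
  if (interaction.type === 1) {
    return { type: 1 }; // answer a Ping with a Pong
  }
  if (interaction.type === 2) {
    const option = interaction.data?.options.find((o) => o.name === "name");
    return { type: 4, data: { content: `Hello, ${option?.value}!` } };
  }
  return null; // the caller responds with 400 Bad Request
}
```

Keeping the dispatch pure like this makes it easy to exercise the Ping, command, and bad-request paths without signing real requests.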
## **Step 5:** Install the Slash Command on your Discord server

To use the `hello` Slash Command, we need to install our Greeter application on our Discord server. Here are the steps:

1. Go to the **OAuth2** section of the Discord application page on the Discord Developer Portal.
2. Select the `applications.commands` scope and click the **Copy** button below.
3. Now paste and visit the URL in your browser. Select your server and click **Authorize**.

Open Discord, type `/hello Deno Deploy` and press **Enter**. The output will look something like below.

![Hello, Deno Deploy!](../docs-images/discord-slash-command.png)

Congratulations on completing the tutorial! Go ahead and build some awesome Discord Slash Commands! And do share them with us on the **deploy** channel of [the Deno Discord server](https://discord.gg/deno).

---

# Basic Fresh site

URL: https://docs.deno.com/deploy/tutorials/fresh

This tutorial will cover how to deploy a Fresh application on Deno Deploy. Fresh is a web framework built for Deno, akin to Express for Node.

## **Step 1:** Create Fresh application

```sh
deno run -A -r https://fresh.deno.dev fresh-site
```

To run this application locally:

```sh
deno task start
```

You can edit `routes/index.tsx` to modify the application.

## **Step 2:** Create a new GitHub repo and link your local Fresh application

1. Create a new GitHub repo and record the git repo remote URL.
2. From your local `fresh-site`, initialize git and push to the new remote repo (substituting the remote URL you recorded):

```sh
git init
git add .
git commit -m "First commit"
git remote add origin <remote-url>
git push origin main
```

## **Step 3:** Deploy to Deno Deploy

1. Navigate to [https://dash.deno.com/new_project](https://dash.deno.com/new_project).
2. Connect to your GitHub account and select your repository.
3. Fill in the values on the form:
   - Give your project a name
   - Select `Fresh` from the "Framework Preset" options
   - Set the production branch to `main`
   - Select `main.ts` as the entrypoint file
4. Click "Deploy Project" to kick off Deno Deploy.
5. Once deployed, you can view your new project at the URL provided in your project dashboard.

---

# Tutorials

URL: https://docs.deno.com/deploy/tutorials/

Here you'll find a collection of tutorials and example applications for Deno Deploy. Check our ever-expanding list of tutorials in the nav, and explore [examples.deno.land](https://examples.deno.land) for even more.

## Code examples

- [Build a simple API server](./simple-api.md)
- [Serve static assets](./static-site.md)

## App building tutorials

- [Build a Fresh app](./fresh.md)
- [Build a Discord slash command](./discord-slash.md)
- [Build a site with Vite](./vite.md)

---

# Simple API server

URL: https://docs.deno.com/deploy/tutorials/simple-api

Deno is great for creating simple, lightweight API servers. Learn how to create and deploy one using Deno Deploy in this tutorial.

## Create a local API server

In your terminal, create a file named `server.ts`.

```shell
touch server.ts
```

We'll implement a simple link shortener service using a [Deno KV database](/deploy/kv/manual).

```ts title="server.ts"
const kv = await Deno.openKv();

Deno.serve(async (request: Request) => {
  // Create short links
  if (request.method == "POST") {
    const body = await request.text();
    const { slug, url } = JSON.parse(body);
    const result = await kv.set(["links", slug], url);
    return new Response(JSON.stringify(result));
  }

  // Redirect short links
  const slug = request.url.split("/").pop() || "";
  const url = (await kv.get(["links", slug])).value as string;
  if (url) {
    return Response.redirect(url, 301);
  } else {
    const m = !slug ? "Please provide a slug." : `Slug "${slug}" not found`;
    return new Response(m, { status: 404 });
  }
});
```

You can run this server on your machine with this command:

```shell
deno run -A --unstable-kv server.ts
```

This server will respond to HTTP `GET` and `POST` requests.
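Before testing, it may help to see the two code paths of `server.ts` in isolation. The sketch below uses an in-memory `Map` standing in for Deno KV, and the helper names (`createLink`, `resolveLink`) are ours, not part of the tutorial:

```ts
// In-memory stand-in for the Deno KV store, to illustrate the two code paths
// in server.ts without needing Deno.openKv().
const links = new Map<string, string>();

// POST path: store the slug-to-URL mapping, as kv.set(["links", slug], url) does.
function createLink(slug: string, url: string): { ok: boolean } {
  links.set(slug, url);
  return { ok: true };
}

// GET path: extract the slug the same way server.ts does
// (the last path segment of the request URL) and look it up.
function resolveLink(requestUrl: string): string | undefined {
  const slug = requestUrl.split("/").pop() || "";
  return links.get(slug);
}
```

A hit returns the stored URL (which `server.ts` turns into a 301 redirect); a miss returns `undefined` (a 404).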
The `POST` handler expects to receive a JSON document in the request body with `slug` and `url` properties. The `slug` is the short URL component, and the `url` is the full URL you want to redirect to.

Here's an example of using this API endpoint with cURL:

```shell
curl --header "Content-Type: application/json" \
  --request POST \
  --data '{"url":"https://docs.deno.com/runtime/manual","slug":"denodocs"}' \
  http://localhost:8000/
```

In response, the server should send you JSON with the KV data representing the result of the `set` operation:

```json
{ "ok": true, "versionstamp": "00000000000000060000" }
```

A `GET` request to our server will take a URL slug as a path parameter and redirect to the provided URL. You can visit this URL in the browser, or make another cURL request to see this in action!

```shell
curl -v http://localhost:8000/denodocs
```

Now that we have an API server, let's push it to a GitHub repository that we'll later link to Deno Deploy.

## Create a GitHub repository for your app

Sign in to [GitHub](https://github.com) and [create a new repository](https://docs.github.com/en/get-started/quickstart/create-a-repo). You can skip adding a README or any other files for now - a blank repo will do fine for our purposes.

In the folder where you created your API server, initialize a local git repo with these commands in sequence. Be sure to swap out `your_username` and `your_repo_name` with the appropriate values.

```sh
echo "# My Deno Link Shortener" >> README.md
git init
git add .
git commit -m "first commit"
git branch -M main
git remote add origin https://github.com/your_username/your_repo_name.git
git push -u origin main
```

You should now have a GitHub repository with your `server.ts` file in it, as in [this example repository](https://github.com/kwhinnery/simple_api_server). Now you're ready to import and run this application on Deno Deploy.
## Import and deploy your project

Next, sign up for an account on [Deno Deploy](https://dash.deno.com) and [create a new project](https://dash.deno.com/new_project). Connect your GitHub account and select the repository we created a moment ago.

![Deno Deploy project selection](./images/simple_api_deploy.png)

The configuration should look something like this:

![Deno Deploy config](./images/simple_api_deploy_settings.png)

Click on the "Deploy Project" button. Once deployed, your link shortener service will be live on Deno Deploy!

![Deno Deploy dashboard](./images/simple_api_dashboard.png)

## Test out your new link shortener

Without any additional configuration (Deno KV just works on Deploy), your app should run the same as it did on your local machine. You can add new links using the `POST` handler as you did before. Just replace the `localhost` URL with your live production URL on Deno Deploy:

```shell
curl --header "Content-Type: application/json" \
  --request POST \
  --data '{"url":"https://docs.deno.com/runtime/","slug":"denodocs"}' \
  https://your-deno-project-url-here.deno.dev/
```

Similarly, you can visit your shortened URLs in the browser, or view the redirect coming back with a cURL command:

```shell
curl -v https://your-deno-project-url-here.deno.dev/denodocs
```

If you enjoyed this project, next you could check out a higher-level web framework like [Fresh](https://fresh.deno.dev), or learn more about [Deno KV here](/deploy/kv/manual). Great work deploying your simple API server!

---

# Deploy a static site

URL: https://docs.deno.com/deploy/tutorials/static-site

This tutorial will cover how to deploy a static site (no JavaScript) on Deno Deploy.

## Step 1: Create the static site

```sh
mkdir static-site
cd static-site
touch index.html
```

Inside your `index.html`, paste the following HTML:

```html
<html>
  <body>
    <h1>Hello</h1>
    <img src="image.png" alt="image" />
  </body>
</html>
```

Make sure that there is an `image.png` file inside `static-site`. You now have an HTML page that says "Hello" and has a logo.

## Step 2: Deploy the static site using `deployctl`

To deploy this repo on Deno Deploy, from the `static-site` repository, run:

```console
deployctl deploy --project=<project-name> --entrypoint=jsr:@std/http/file-server
```

To give a little more explanation of these commands: because this is a static site, there is no JavaScript to execute. Instead of giving Deno Deploy a particular JavaScript or TypeScript file to run as the entrypoint file, you give it this external `file_server.ts` program, which simply uploads all the static files in the `static-site` repo, including the image and the HTML page, to Deno Deploy. These static assets are then served up.

## Step 3: Voila!

Your static site should now be live! Its URL will be output in the terminal, or you can manage your new static site project in your [Deno dashboard](https://dash.deno.com/projects/). If you click through to your new project, you will be able to view the site, configure its name, environment variables, custom domains, and more.

---

# Build a blog with Fresh

URL: https://docs.deno.com/deploy/tutorials/tutorial-blog-fresh

Tutorial [here](https://deno.com/blog/build-a-blog-with-fresh).

---

# API server with DynamoDB

URL: https://docs.deno.com/deploy/tutorials/tutorial-dynamodb

In this tutorial, let's take a look at how we can use DynamoDB to build a small API that has endpoints to insert and retrieve information. The tutorial assumes that you have an AWS account and a Deno Deploy account.
- [Overview](#overview)
- [Setup DynamoDB](#setup-dynamodb)
- [Create a Project in Deno Deploy](#create-a-project-in-deno-deploy)
- [Write the Application](#write-the-application)
- [Deploy the Application](#deploy-the-application)

## Overview

We're going to build an API with a single endpoint that accepts GET/POST requests and returns appropriate information.

```sh
# A GET request to the endpoint should return the details of the song based on its title.
GET /songs?title=Song%20Title # '%20' == space
# response
{
  title: "Song Title",
  artist: "Someone",
  album: "Something",
  released: "1970",
  genres: "country rap",
}

# A POST request to the endpoint should insert the song details.
POST /songs
# post request body
{
  title: "A New Title",
  artist: "Someone New",
  album: "Something New",
  released: "2020",
  genres: "country rap",
}
```

## Setup DynamoDB

Our first step in the process is to generate AWS credentials to programmatically access DynamoDB.

Generate Credentials:

1. Go to https://console.aws.amazon.com/iam/ and go to the "Users" section.
2. Click on the **Create user** button, fill the **User name** field (maybe use `denamo`), and select the **Programmatic access** type.
3. Click **Next**.
4. Select **Attach policies directly** and search for `AmazonDynamoDBFullAccess`. Check the box next to this policy in the results.
5. Click **Next** and **Create user**.
6. On the resulting **Users** page, click through to the user you just created.
7. Click on **Create access key**.
8. Select **Application running outside AWS**.
9. Click **Create**.
10. Click **Download .csv file** to download the credentials you just created.

Create database table:

1. Go to https://console.aws.amazon.com/dynamodb and click on the **Create table** button.
2. Fill the **Table name** field with `songs` and the **Partition key** with `title`.
3. Scroll down and click on **Create table**.
4. Once the table is created, click on the table name and find its **General information** section.
5.
Under **Amazon Resource Name (ARN)**, take note of the region of your new table (for example, us-east-1).

## Write the Application

Create a file called `index.js` and insert the following:

```js
import {
  json,
  serve,
  validateRequest,
} from "https://deno.land/x/sift@0.6.0/mod.ts";
// AWS has an official SDK that works with browsers. As most of Deno Deploy's
// APIs are similar to the browser's, the same SDK works with Deno Deploy.
// So we import the SDK along with some classes required to insert and
// retrieve data.
import {
  DynamoDBClient,
  GetItemCommand,
  PutItemCommand,
} from "https://esm.sh/@aws-sdk/client-dynamodb";

// Create a client instance by providing your region information.
// The credentials are obtained from environment variables which
// we set during our project creation step on Deno Deploy.
const client = new DynamoDBClient({
  region: Deno.env.get("AWS_TABLE_REGION"),
  credentials: {
    accessKeyId: Deno.env.get("AWS_ACCESS_KEY_ID"),
    secretAccessKey: Deno.env.get("AWS_SECRET_ACCESS_KEY"),
  },
});

serve({
  "/songs": handleRequest,
});

async function handleRequest(request) {
  // The endpoint allows GET and POST requests. A parameter named "title"
  // is required for a GET request to be processed, and a body with the
  // fields defined below is required to process a POST request.
  // validateRequest ensures that the provided terms are met by the request.
  const { error, body } = await validateRequest(request, {
    GET: {
      params: ["title"],
    },
    POST: {
      body: ["title", "artist", "album", "released", "genres"],
    },
  });
  if (error) {
    return json({ error: error.message }, { status: error.status });
  }

  // Handle POST request.
  if (request.method === "POST") {
    try {
      // When we want to interact with DynamoDB, we send a command using the
      // client instance. Here we are sending a PutItemCommand to insert the
      // data from the request.
      const {
        $metadata: { httpStatusCode },
      } = await client.send(
        new PutItemCommand({
          TableName: "songs",
          Item: {
            // Here 'S' implies that the value is of type string
            // and 'N' implies a number.
            title: { S: body.title },
            artist: { S: body.artist },
            album: { S: body.album },
            released: { N: body.released },
            genres: { S: body.genres },
          },
        }),
      );

      // On a successful put item request, DynamoDB returns a 200 status code.
      // So we test the status code to verify if the data has been inserted,
      // and respond with the data provided by the request as a confirmation.
      if (httpStatusCode === 200) {
        return json({ ...body }, { status: 201 });
      }
    } catch (error) {
      // If something goes wrong while making the request, we log
      // the error for our reference.
      console.log(error);
    }

    // If the execution reaches here, it implies that the insertion wasn't
    // successful.
    return json({ error: "couldn't insert data" }, { status: 500 });
  }

  // Handle GET request.
  try {
    // We grab the title from the request and send a GetItemCommand
    // to retrieve the information about the song.
    const { searchParams } = new URL(request.url);
    const { Item } = await client.send(
      new GetItemCommand({
        TableName: "songs",
        Key: {
          title: { S: searchParams.get("title") },
        },
      }),
    );

    // The Item property contains all the data, so if it's not undefined,
    // we proceed to return the information about the title.
    // Note that `released` was stored as a number ('N'), so it is read
    // back from the N tag rather than S.
    if (Item) {
      return json({
        title: Item.title.S,
        artist: Item.artist.S,
        album: Item.album.S,
        released: Item.released.N,
        genres: Item.genres.S,
      });
    }
  } catch (error) {
    console.log(error);
  }

  // We might reach here if an error is thrown during the request to the
  // database, or if the Item is not found in the database.
  // We reflect both conditions with a general message.
  return json(
    {
      message: "couldn't find the title",
    },
    { status: 404 },
  );
}
```

Initialize git in your new project and [push it to GitHub](https://docs.github.com/en/get-started/start-your-journey/hello-world#step-1-create-a-repository).
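One detail worth calling out from the code above: DynamoDB's wire format tags every attribute value with its type (`S` for string, `N` for number, and numbers travel as strings inside the `N` tag). That mapping can be sketched as a small helper. This is illustrative only, with a hypothetical name (`marshal`); the AWS SDK also ships its own marshalling utilities:

```ts
// Marshal a flat record into DynamoDB's tagged AttributeValue shape:
// keys listed in `numeric` become { N: ... }, everything else { S: ... }.
type AttributeValue = { S: string } | { N: string };

function marshal(
  item: Record<string, string>,
  numeric: string[] = [],
): Record<string, AttributeValue> {
  const out: Record<string, AttributeValue> = {};
  for (const [key, value] of Object.entries(item)) {
    out[key] = numeric.includes(key) ? { N: value } : { S: value };
  }
  return out;
}
```

Applied to the tutorial's data, `marshal({ title: "7", released: "2019" }, ["released"])` produces the same `Item` shape the `PutItemCommand` above sends.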
## Deploy the Application

Now that we have everything in place, let's deploy your new application!

1. In your browser, visit [Deno Deploy](https://dash.deno.com/new_project) and link your GitHub account.
2. Select the repository which contains your new application.
3. You can give your project a name or allow Deno to generate one for you.
4. Select `index.js` in the Entrypoint dropdown.
5. Click **Deploy Project**.

In order for your application to work, we will need to configure its environment variables. On your project's success page, or in your project dashboard, click on **Add environmental variables**. Under Environment Variables, click **+ Add Variable**. Create the following variables:

1. `AWS_ACCESS_KEY_ID` - with the value from the CSV you downloaded
2. `AWS_SECRET_ACCESS_KEY` - with the value from the CSV you downloaded
3. `AWS_TABLE_REGION` - with your table's region

Click to save the variables.

Let's test the API. POST some data:

```sh
curl --request POST --data \
'{"title": "Old Town Road", "artist": "Lil Nas X", "album": "7", "released": "2019", "genres": "Country rap, Pop"}' \
--dump-header - https://<project-name>.deno.dev/songs
```

GET information about the title:

```sh
curl https://<project-name>.deno.dev/songs?title=Old%20Town%20Road
```

Congratulations on learning how to use DynamoDB with Deno Deploy!

---

# API server with FaunaDB

URL: https://docs.deno.com/deploy/tutorials/tutorial-faunadb

FaunaDB calls itself "The data API for modern applications". It's a database with a GraphQL interface that enables you to use GraphQL to interact with it. Since we communicate with it using HTTP requests, we don't need to manage connections, which suits serverless applications very well.

The tutorial assumes that you have [FaunaDB](https://fauna.com) and Deno Deploy accounts, the Deno Deploy CLI installed, and some basic knowledge of GraphQL.
- [Overview](#overview)
- [Build the API Endpoints](#build-the-api-endpoints)
- [Use FaunaDB for Persistence](#use-faunadb-for-persistence)
- [Deploy the API](#deploy-the-api)

## Overview

In this tutorial, let's build a small quotes API with endpoints to insert and retrieve quotes, and later use FaunaDB to persist the quotes.

Let's start by defining the API endpoints.

```sh
# A POST request to the endpoint should insert the quote to the list.
POST /quotes/
# Body of the request.
{
  "quote": "Don't judge each day by the harvest you reap but by the seeds that you plant.",
  "author": "Robert Louis Stevenson"
}

# A GET request to the endpoint should return all the quotes from the database.
GET /quotes/
# Response of the request.
{
  "quotes": [
    {
      "quote": "Don't judge each day by the harvest you reap but by the seeds that you plant.",
      "author": "Robert Louis Stevenson"
    }
  ]
}
```

Now that we understand how the endpoint should behave, let's proceed to build it.

## Build the API Endpoints

First, create a file named `quotes.ts` and paste the following content. Read through the comments in the code to understand what's happening.

```ts
import {
  json,
  serve,
  validateRequest,
} from "https://deno.land/x/sift@0.6.0/mod.ts";

serve({
  "/quotes": handleQuotes,
});

// To get started, let's just use a global array of quotes.
const quotes = [
  {
    quote: "Those who can imagine anything, can create the impossible.",
    author: "Alan Turing",
  },
  {
    quote: "Any sufficiently advanced technology is equivalent to magic.",
    author: "Arthur C. Clarke",
  },
];

async function handleQuotes(request: Request) {
  // Make sure the request is a GET request.
  const { error } = await validateRequest(request, {
    GET: {},
  });
  // validateRequest populates the error if the request doesn't meet
  // the schema we defined.
  if (error) {
    return json({ error: error.message }, { status: error.status });
  }

  // Return all the quotes.
  return json({ quotes });
}
```

Run the above program using [the Deno CLI](https://deno.land).
```sh
deno run --allow-net=:8000 ./path/to/quotes.ts
# Listening on http://0.0.0.0:8000/
```

And curl the endpoint to see some quotes.

```sh
curl http://127.0.0.1:8000/quotes
# {"quotes":[
#   {"quote":"Those who can imagine anything, can create the impossible.","author":"Alan Turing"},
#   {"quote":"Any sufficiently advanced technology is equivalent to magic.","author":"Arthur C. Clarke"}
# ]}
```

Let's proceed to handle the POST request. Update the `validateRequest` call to make sure a POST request follows the provided body schema.

```diff
-  const { error } = await validateRequest(request, {
+  const { error, body } = await validateRequest(request, {
     GET: {},
+    POST: {
+      body: ["quote", "author"],
+    },
   });
```

Handle the POST request by updating the `handleQuotes` function with the following code.

```diff
 async function handleQuotes(request: Request) {
   const { error, body } = await validateRequest(request, {
     GET: {},
     POST: {
       body: ["quote", "author"],
     },
   });
   if (error) {
     return json({ error: error.message }, { status: error.status });
   }

+  // Handle POST requests.
+  if (request.method === "POST") {
+    const { quote, author } = body as { quote: string; author: string };
+    quotes.push({ quote, author });
+    return json({ quote, author }, { status: 201 });
+  }

   return json({ quotes });
 }
```

Let's test it by inserting some data.

```sh
curl --dump-header - --request POST --data '{"quote": "A program that has not been tested does not work.", "author": "Bjarne Stroustrup"}' http://127.0.0.1:8000/quotes
```

The output might look like something below.

```console
HTTP/1.1 201 Created
transfer-encoding: chunked
content-type: application/json; charset=utf-8

{"quote":"A program that has not been tested does not work.","author":"Bjarne Stroustrup"}
```

Awesome! We built our API endpoint, and it's working as expected. Since the data is stored in memory, it will be lost after a restart. Let's use FaunaDB to persist our quotes.
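The body schema passed to `validateRequest` declares which fields a POST must carry. The core of that check, which the Sift library handles for us, boils down to a pure function like the sketch below (the name `missingFields` is ours, not Sift's):

```ts
// Return the required fields absent from a parsed request body.
// An empty result means the body satisfies the schema.
function missingFields(
  body: Record<string, unknown>,
  required: string[],
): string[] {
  return required.filter((field) => !(field in body));
}
```

With the tutorial's schema, a body missing `author` would fail validation, which is exactly when `validateRequest` populates `error` and the handler returns early.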
## Use FaunaDB for Persistence

Let's define our database schema using GraphQL Schema.

```gql
# We're creating a new type named `Quote` to represent a quote and its author.
type Quote {
  quote: String!
  author: String!
}

type Query {
  # A new field in the Query operation to retrieve all quotes.
  allQuotes: [Quote!]
}
```

Fauna has a GraphQL endpoint for its database, and it generates essential mutations like create, update, and delete for a data type defined in the schema. For example, Fauna will generate a mutation named `createQuote` to create a new quote in the database for the data type `Quote`. And we're additionally defining a query field named `allQuotes` that returns all the quotes in the database.

Let's get to writing the code to interact with Fauna from Deno Deploy applications.

To interact with Fauna, we need to make a POST request to its GraphQL endpoint with an appropriate query and parameters to get the data in return. So let's construct a generic function that will handle those things.

```typescript
async function queryFauna(
  query: string,
  variables: { [key: string]: unknown },
): Promise<{
  data?: any;
  error?: any;
}> {
  // Grab the secret from the environment.
  const token = Deno.env.get("FAUNA_SECRET");
  if (!token) {
    throw new Error("environment variable FAUNA_SECRET not set");
  }

  try {
    // Make a POST request to Fauna's GraphQL endpoint with the body being
    // the query and its variables.
    const res = await fetch("https://graphql.fauna.com/graphql", {
      method: "POST",
      headers: {
        authorization: `Bearer ${token}`,
        "content-type": "application/json",
      },
      body: JSON.stringify({
        query,
        variables,
      }),
    });

    const { data, errors } = await res.json();
    if (errors) {
      // Return the first error if there are any.
      return { data, error: errors[0] };
    }

    return { data };
  } catch (error) {
    return { error };
  }
}
```

Add this code to the `quotes.ts` file. Now let's proceed to update the endpoint to use Fauna.
```diff
 async function handleQuotes(request: Request) {
   const { error, body } = await validateRequest(request, {
     GET: {},
     POST: {
       body: ["quote", "author"],
     },
   });
   if (error) {
     return json({ error: error.message }, { status: error.status });
   }

   if (request.method === "POST") {
+    const { quote, author, error } = await createQuote(
+      body as { quote: string; author: string },
+    );
+    if (error) {
+      return json({ error: "couldn't create the quote" }, { status: 500 });
+    }

     return json({ quote, author }, { status: 201 });
   }

   return json({ quotes });
 }

+async function createQuote({
+  quote,
+  author,
+}: {
+  quote: string;
+  author: string;
+}): Promise<{ quote?: string; author?: string; error?: string }> {
+  const query = `
+    mutation($quote: String!, $author: String!) {
+      createQuote(data: { quote: $quote, author: $author }) {
+        quote
+        author
+      }
+    }
+  `;
+
+  const { data, error } = await queryFauna(query, { quote, author });
+  if (error) {
+    return { error };
+  }
+
+  return data;
+}
```

Now that we've updated the code to insert new quotes, let's set up a Fauna database before proceeding to test the code.

Create a new database:

1. Go to https://dashboard.fauna.com (log in if required) and click on **New Database**.
2. Fill the **Database Name** field and click on **Save**.
3. Click on the **GraphQL** section visible on the left sidebar.
4. Create a file ending with the `.gql` extension with the content being the schema we defined above.

Generate a secret to access the database:

1. Click on the **Security** section and click on **New Key**.
2. Select the **Server** role and click on **Save**. Copy the secret.

Let's now run the application with the secret.
```sh
FAUNA_SECRET=<the-secret-you-copied> deno run --allow-net --allow-env --watch quotes.ts
# Listening on http://0.0.0.0:8000
```

```sh
curl --dump-header - --request POST --data '{"quote": "A program that has not been tested does not work.", "author": "Bjarne Stroustrup"}' http://127.0.0.1:8000/quotes
```

Notice how the quote was added to your collection in FaunaDB.

Let's write a new function to get all the quotes.

```ts
async function getAllQuotes() {
  const query = `
    query {
      allQuotes {
        data {
          quote
          author
        }
      }
    }
  `;

  const {
    data: {
      allQuotes: { data: quotes },
    },
    error,
  } = await queryFauna(query, {});
  if (error) {
    return { error };
  }

  return { quotes };
}
```

And update the `handleQuotes` function with the following code.

```diff
-// To get started, let's just use a global array of quotes.
-const quotes = [
-  {
-    quote: "Those who can imagine anything, can create the impossible.",
-    author: "Alan Turing",
-  },
-  {
-    quote: "Any sufficiently advanced technology is equivalent to magic.",
-    author: "Arthur C. Clarke",
-  },
-];

 async function handleQuotes(request: Request) {
   const { error, body } = await validateRequest(request, {
     GET: {},
     POST: {
       body: ["quote", "author"],
     },
   });
   if (error) {
     return json({ error: error.message }, { status: error.status });
   }

   if (request.method === "POST") {
     const { quote, author, error } = await createQuote(
       body as { quote: string; author: string },
     );
     if (error) {
       return json({ error: "couldn't create the quote" }, { status: 500 });
     }

     return json({ quote, author }, { status: 201 });
   }

+  // It's assumed that the request method is "GET".
+  {
+    const { quotes, error } = await getAllQuotes();
+    if (error) {
+      return json({ error: "couldn't fetch the quotes" }, { status: 500 });
+    }
+
+    return json({ quotes });
+  }
 }
```

```sh
curl http://127.0.0.1:8000/quotes
```

You should see all the quotes we've inserted into the database.

The final code of the API is available at https://deno.com/examples/fauna.ts.
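Every database call above funnels through `queryFauna`, which POSTs a JSON body with `query` and `variables` keys, the standard GraphQL-over-HTTP request shape. Building that body is plain data construction, sketched here with a hypothetical helper name (`graphqlBody`):

```ts
// Build the JSON body for a GraphQL-over-HTTP POST request,
// matching what queryFauna() passes to fetch().
function graphqlBody(
  query: string,
  variables: Record<string, unknown>,
): string {
  return JSON.stringify({ query, variables });
}
```

Any GraphQL server that follows this convention (Fauna's endpoint included) accepts such a body with a `content-type: application/json` header.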
## Deploy the API

Now that we have everything in place, let's deploy your new API!

1. In your browser, visit [Deno Deploy](https://dash.deno.com/new_project) and link your GitHub account.
2. Select the repository which contains your new API.
3. You can give your project a name or allow Deno to generate one for you.
4. Select `index.ts` in the Entrypoint dropdown.
5. Click **Deploy Project**.

In order for your application to work, we will need to configure its environment variables. On your project's success page, or in your project dashboard, click on **Add environment variables**. Under Environment Variables, click **+ Add Variable**. Create a new variable called `FAUNA_SECRET` whose value is the secret we created earlier. Click to save the variables.

On your project overview, click **View** to view the project in your browser, then add `/quotes` to the end of the URL to see the contents of your FaunaDB collection.

---

# API server with Firestore (Firebase)

URL: https://docs.deno.com/deploy/tutorials/tutorial-firebase

Firebase is a platform developed by Google for creating mobile and web applications. You can persist data on the platform using Firestore. In this tutorial, let's take a look at how we can use it to build a small API that has endpoints to insert and retrieve information.

- [Overview](#overview)
- [Concepts](#concepts)
- [Setup Firebase](#setup-firebase)
- [Write the application](#write-the-application)
- [Deploy the application](#deploy-the-application)

## Overview

We are going to build an API with a single endpoint that accepts `GET` and `POST` requests and returns a JSON payload of information:

```sh
# A GET request to the endpoint without any sub-path should return the details
# of all songs in the store:
GET /songs
# response
[
  {
    title: "Song Title",
    artist: "Someone",
    album: "Something",
    released: "1970",
    genres: "country rap",
  }
]

# A GET request to the endpoint with a sub-path to the title should return the
# details of the song based on its title.
GET /songs/Song%20Title # '%20' == space
# response
{
  title: "Song Title",
  artist: "Someone",
  album: "Something",
  released: "1970",
  genres: "country rap",
}

# A POST request to the endpoint should insert the song details.
POST /songs
# post request body
{
  title: "A New Title",
  artist: "Someone New",
  album: "Something New",
  released: "2020",
  genres: "country rap",
}
```

In this tutorial, we will be:

- Creating and setting up a [Firebase Project](https://console.firebase.google.com/).
- Using a text editor to create our application.
- Creating a [gist](https://gist.github.com/) to "host" our application.
- Deploying our application on [Deno Deploy](https://dash.deno.com/).
- Testing our application using [cURL](https://curl.se/).

## Concepts

There are a few concepts that help explain why we take a particular approach in the rest of the tutorial, and that can help in extending the application. You can skip ahead to [Setup Firebase](#setup-firebase) if you want.

### Deploy is browser-like

Even though Deploy runs in the cloud, in many respects the APIs it provides are based on web standards. So when using Firebase, the Firebase APIs designed for the web are more compatible with Deploy than those designed for server runtimes. That means we will be using the Firebase web libraries in this tutorial.

### Firebase uses XHR

Firebase uses a wrapper around Closure's [WebChannel](https://google.github.io/closure-library/api/goog.net.WebChannel.html), and WebChannel was originally built around [`XMLHttpRequest`](https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest). While WebChannel supports the more modern `fetch()` API, current versions of Firebase for the web do not uniformly instantiate WebChannel with `fetch()` support, and instead use `XMLHttpRequest`. While Deploy is browser-like, it does not support `XMLHttpRequest`.
`XMLHttpRequest` is a "legacy" browser API with several limitations and features that would be difficult to implement in Deploy, which means it is unlikely that Deploy will ever implement that API. So, in this tutorial we will be using a limited _polyfill_ that provides enough of the `XMLHttpRequest` feature set to allow Firebase/WebChannel to communicate with the server.

### Firebase auth

Firebase offers quite [a few options](https://firebase.google.com/docs/auth) around authentication. In this tutorial we are going to use email and password authentication.

When a user is logged in, Firebase can persist that authentication. Because we are using the web libraries for Firebase, persisting the authentication allows a user to navigate away from a page and not need to log in again when returning. Firebase allows authentication to be persisted in local storage, in session storage, or not at all.

In a Deploy context, things are a little different. A deployment will remain "active", meaning that in-memory state can be present from request to request, but under various conditions new deployments can be started up or shut down. Currently, Deploy doesn't offer any persistence outside of in-memory allocation. In addition, it doesn't currently offer the global `localStorage` or `sessionStorage`, which is what Firebase uses to store the authentication information.

In order to reduce the need to re-authenticate, but also to ensure that we can support multiple users with a single deployment, we are going to use a polyfill that provides a `localStorage` interface to Firebase but stores the information as a cookie in the client.

## Setup Firebase

[Firebase](https://firebase.google.com/) is a feature-rich platform. All the details of Firebase administration are beyond the scope of this tutorial. We will cover what is needed for this tutorial.

1. Create a new project under the [Firebase console](https://console.firebase.google.com/).
2.
Add a web application to your project. Make note of the `firebaseConfig` provided in the setup wizard. It should look something like the below; we will use this later:

```js title="firebase.js"
var firebaseConfig = {
  apiKey: "APIKEY",
  authDomain: "example-12345.firebaseapp.com",
  projectId: "example-12345",
  storageBucket: "example-12345.appspot.com",
  messagingSenderId: "1234567890",
  appId: "APPID",
};
```

3. Under `Authentication` in the administration console, you will want to enable the `Email/Password` sign-in method.
4. You will want to add a user and password under the `Authentication` and then `Users` section, making note of the values used for later.
5. Add `Firestore Database` to your project. The console will allow you to set up in _production mode_ or _test mode_. It is up to you how you configure this, but _production mode_ will require you to set up further security rules.
6. Add a collection to the database named `songs`. This will require you to add at least one document. Just set the document with an _Auto ID_.

_Note:_ depending on the status of your Google account, there may be other setup and administration steps that need to occur.

## Write the application

We want to create our application as a JavaScript file in our favorite editor.

The first thing we will do is import the `XMLHttpRequest` polyfill that Firebase needs to work under Deploy, as well as a polyfill for `localStorage` to allow the Firebase auth to persist logged-in users:

```js title="firebase.js"
import "https://deno.land/x/xhr@0.1.1/mod.ts";
import { installGlobals } from "https://deno.land/x/virtualstorage@0.1.0/mod.ts";
installGlobals();
```

> ℹ️ we are using the current version of packages at the time of the writing of
> this tutorial. They may not be up-to-date and you may want to double check
> current versions.

Because Deploy has a lot of the web standard APIs, it is best to use the web libraries for Firebase under Deploy.
Currently, v9 is still in beta for Firebase, so we will use v8 in this tutorial:

```js title="firebase.js"
import firebase from "https://esm.sh/firebase@8.7.0/app";
import "https://esm.sh/firebase@8.7.0/auth";
import "https://esm.sh/firebase@8.7.0/firestore";
```

We are also going to use [oak](https://deno.land/x/oak) as the middleware framework for creating the APIs, including middleware that will take the `localStorage` values and set them as client cookies:

```js title="firebase.js"
import {
  Application,
  Router,
  Status,
} from "https://deno.land/x/oak@v7.7.0/mod.ts";
import { virtualStorage } from "https://deno.land/x/virtualstorage@0.1.0/middleware.ts";
```

Now we need to set up our Firebase application. We will be getting the configuration from environment variables we will set up later under the key `FIREBASE_CONFIG`, and get references to the parts of Firebase we are going to use:

```js title="firebase.js"
const firebaseConfig = JSON.parse(Deno.env.get("FIREBASE_CONFIG"));
const firebaseApp = firebase.initializeApp(firebaseConfig, "example");
const auth = firebase.auth(firebaseApp);
const db = firebase.firestore(firebaseApp);
```

We are also going to set up the application to handle signed-in users per request. So we will create a map of users that have previously signed in during this deployment.
While in this tutorial we will only ever have one signed-in user, the code can easily be adapted to allow clients to sign in individually:

```js title="firebase.js"
const users = new Map();
```

Let's create our middleware router with three different middleware handlers to support `GET` and `POST` of `/songs`, and a `GET` of a specific song on `/songs/{title}`:

```js title="firebase.js"
const router = new Router();

// Returns any songs in the collection
router.get("/songs", async (ctx) => {
  const querySnapshot = await db.collection("songs").get();
  ctx.response.body = querySnapshot.docs.map((doc) => doc.data());
  ctx.response.type = "json";
});

// Returns the first document that matches the title
router.get("/songs/:title", async (ctx) => {
  const { title } = ctx.params;
  const querySnapshot = await db.collection("songs").where("title", "==", title)
    .get();
  const song = querySnapshot.docs.map((doc) => doc.data())[0];
  if (!song) {
    ctx.response.status = 404;
    ctx.response.body = `The song titled "${ctx.params.title}" was not found.`;
    ctx.response.type = "text";
  } else {
    ctx.response.body = song;
    ctx.response.type = "json";
  }
});

function isSong(value) {
  return typeof value === "object" && value !== null && "title" in value;
}

// Removes any songs with the same title and adds the new song
router.post("/songs", async (ctx) => {
  const body = ctx.request.body();
  if (body.type !== "json") {
    ctx.throw(Status.BadRequest, "Must be a JSON document");
  }
  const song = await body.value;
  if (!isSong(song)) {
    ctx.throw(Status.BadRequest, "Payload was not well formed");
  }
  const querySnapshot = await db
    .collection("songs")
    .where("title", "==", song.title)
    .get();
  await Promise.all(querySnapshot.docs.map((doc) => doc.ref.delete()));
  const songsRef = db.collection("songs");
  await songsRef.add(song);
  ctx.response.status = Status.NoContent;
});
```

OK, we are almost done.
We just need to create our middleware application and add the `localStorage` middleware we imported:

```js title="firebase.js"
const app = new Application();
app.use(virtualStorage());
```

And then we need to add middleware to authenticate the user. In this tutorial we are simply grabbing the username and password from the environment variables we will be setting up, but this could easily be adapted to redirect a user to a sign-in page if they are not logged in:

```js title="firebase.js"
app.use(async (ctx, next) => {
  const signedInUid = ctx.cookies.get("LOGGED_IN_UID");
  const signedInUser = signedInUid != null ? users.get(signedInUid) : undefined;
  if (!signedInUid || !signedInUser || !auth.currentUser) {
    const creds = await auth.signInWithEmailAndPassword(
      Deno.env.get("FIREBASE_USERNAME"),
      Deno.env.get("FIREBASE_PASSWORD"),
    );
    const { user } = creds;
    if (user) {
      users.set(user.uid, user);
      ctx.cookies.set("LOGGED_IN_UID", user.uid);
    } else if (signedInUser && signedInUid !== auth.currentUser?.uid) {
      await auth.updateCurrentUser(signedInUser);
    }
  }
  return next();
});
```

Now let's add our router to the middleware application and set the application to listen on port 8000:

```js title="firebase.js"
app.use(router.routes());
app.use(router.allowedMethods());

await app.listen({ port: 8000 });
```

Now we have an application that should serve up our APIs.

## Deploy the Application

Now that we have everything in place, let's deploy your new application!

1. In your browser, visit [Deno Deploy](https://dash.deno.com/new_project) and link your GitHub account.
2. Select the repository which contains your new application.
3. You can give your project a name or allow Deno to generate one for you.
4. Select `firebase.js` in the Entrypoint dropdown.
5. Click **Deploy Project**.

In order for your application to work, we will need to configure its environment variables. On your project's success page, or in your project dashboard, click on **Add environment variables**.
Under Environment Variables, click **+ Add Variable** and create the following variables:

1. `FIREBASE_USERNAME` - The Firebase user (email address) that was added above
2. `FIREBASE_PASSWORD` - The Firebase user password that was added above
3. `FIREBASE_CONFIG` - The configuration of the Firebase application as a string of JSON

The configuration needs to be a valid JSON string to be readable by the application. If the code snippet given when setting up looked like this:

```js
var firebaseConfig = {
  apiKey: "APIKEY",
  authDomain: "example-12345.firebaseapp.com",
  projectId: "example-12345",
  storageBucket: "example-12345.appspot.com",
  messagingSenderId: "1234567890",
  appId: "APPID",
};
```

You would need to set the value of the string to this (noting that spacing and new lines are not required):

```json
{
  "apiKey": "APIKEY",
  "authDomain": "example-12345.firebaseapp.com",
  "projectId": "example-12345",
  "storageBucket": "example-12345.appspot.com",
  "messagingSenderId": "1234567890",
  "appId": "APPID"
}
```

Click to save the variables.

Now let's take our API for a spin. We can create a new song:

```sh
curl --request POST \
  --header "Content-Type: application/json" \
  --data '{"title": "Old Town Road", "artist": "Lil Nas X", "album": "7", "released": "2019", "genres": "Country rap, Pop"}' \
  --dump-header - \
  https://.deno.dev/songs
```

And we can get all the songs in our collection:

```sh
curl https://.deno.dev/songs
```

And we can get specific information about a title we created:

```sh
curl https://.deno.dev/songs/Old%20Town%20Road
```

---

# Simple HTTP server

URL: https://docs.deno.com/deploy/tutorials/tutorial-http-server

In this tutorial, let's build an HTTP server that responds to all incoming HTTP requests with `Hello, world!` and a `200 OK` HTTP status. We will be using the Deno Deploy playground to deploy and edit this script.
## Step 1: Write the HTTP server script

A simple HTTP server can be written with a single line of code in Deno using [`Deno.serve`](https://docs.deno.com/api/deno/~/Deno.serve):

```js title="One-line HTTP server"
Deno.serve(() => new Response("Hello, world!"));
```

While this type of server is useful for getting started, `Deno.serve` is capable of supporting more advanced usage as well ([API reference docs](https://docs.deno.com/api/deno/~/Deno.serve)). Below is an example of a more complex server that takes advantage of other API features.

```ts title="More complex Hello World server"
Deno.serve({
  onListen: ({ port }) => {
    console.log("Deno server listening on *:", port);
  },
}, (req: Request, conn: Deno.ServeHandlerInfo) => {
  // Get information about the incoming request
  const method = req.method;
  const ip = conn.remoteAddr.hostname;
  console.log(`${ip} just made an HTTP ${method} request.`);

  // Return a web standard Response object
  return new Response("Hello, world!");
});
```

## Step 2: Deploy script to Deno Deploy

1. Create a new playground project by visiting [your Deno dashboard](https://dash.deno.com/account/overview), and clicking the **New Playground** button.
2. On the next screen, copy the code above (either the short or the longer example) into the editor on the left side of the screen.
3. Press the **Save & Deploy** button on the right side of the top toolbar (or press Ctrl+S).

You can preview the result on the right side of the playground editor, in the preview pane. You will see that if you change the script (for example `Hello, world!` -> `Hello, Galaxy!`) and then re-deploy, the preview will automatically update. The URL shown at the top of the preview pane can be used to visit the deployed page from anywhere.

Even in the playground editor, scripts are deployed worldwide across our entire global network. This guarantees fast and reliable performance, no matter the location of your users.
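A nice property of this model is that the handler passed to `Deno.serve` is just a function from `Request` to `Response` — both web standards — so you can exercise it without starting a server at all. A small sketch, using a hypothetical `/` route and 404 fallback of our own (not part of the tutorial's deployed code):

```typescript
// Hypothetical handler with the same shape Deno.serve accepts. Because
// Request and Response are web standards, it can be called directly in tests.
function handler(req: Request): Response {
  const url = new URL(req.url);
  if (url.pathname === "/") {
    return new Response("Hello, world!");
  }
  return new Response("Not Found", { status: 404 });
}
```

Passing this `handler` to `Deno.serve(handler)` deploys it unchanged; calling `handler(new Request("http://localhost:8000/"))` in a test returns the same `Response` a client would receive.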
---

# Build a blog with Hugo

URL: https://docs.deno.com/deploy/tutorials/tutorial-hugo-blog

Tutorial [here](https://deno.com/blog/hugo-blog-with-deno-deploy).

---

# API server with Postgres

URL: https://docs.deno.com/deploy/tutorials/tutorial-postgres

Postgres is a popular database for web applications because of its flexibility and ease of use. This guide will show you how to use Deno Deploy with Postgres.

- [API server with Postgres](#api-server-with-postgres)
  - [Overview](#overview)
  - [Setup Postgres](#setup-postgres)
    - [Neon Postgres](#neon-postgres)
    - [Supabase](#supabase)
  - [Write and deploy the application](#write-and-deploy-the-application)

## Overview

We are going to build the API for a simple todo list application. It will have two endpoints: `GET /todos` will return a list of all todos, and `POST /todos` will create a new todo.

```
GET /todos
# returns a list of all todos
[
  { "id": 1, "title": "Buy bread" },
  { "id": 2, "title": "Buy rice" },
  { "id": 3, "title": "Buy spices" }
]

POST /todos
# creates a new todo
"Buy milk"
# returns a 201 status code
```

In this tutorial, we will be:

- Creating and setting up a [Postgres](https://www.postgresql.org/) instance on [Neon Postgres](https://neon.tech/) or [Supabase](https://supabase.com).
- Using a [Deno Deploy](../manual/deployctl.md) playground to develop and deploy the application.
- Testing our application using [cURL](https://curl.se/).

## Setup Postgres

> This tutorial will focus entirely on connecting to Postgres unencrypted. If
> you would like to use encryption with a custom CA certificate, use the
> documentation [here](https://deno-postgres.com/#/?id=ssltls-connection).

To get started we need to create a new Postgres instance for us to connect to. For this tutorial, you can use either [Neon Postgres](https://neon.tech/) or [Supabase](https://supabase.com), as they both provide free, managed Postgres instances.
If you'd like to host your database somewhere else, you can do that too.

### Neon Postgres

1. Visit https://neon.tech/ and click **Sign up** to sign up with an email, GitHub, Google, or partner account. After signing up, you are directed to the Neon Console to create your first project.
2. Enter a name for your project, select a Postgres version, provide a database name, and select a region. Generally, you'll want to select the region closest to your application. When you're finished, click **Create project**.
3. You are presented with the connection string for your new project, which you can use to connect to your database. Save the connection string, which looks something like this:

```sh
postgres://alex:AbC123dEf@ep-cool-darkness-123456.us-east-2.aws.neon.tech/dbname?sslmode=require
```

### Supabase

1. Visit https://app.supabase.io/ and click "New project".
2. Select a name, password, and region for your database. Make sure to save the password, as you will need it later.
3. Click "Create new project". Creating the project can take a while, so be patient.
4. Once the project is created, navigate to the "Database" tab on the left.
5. Go to the "Connection Pooling" settings, and copy the connection string from the "Connection String" field. This is the connection string you will use to connect to your database. Insert the password you saved earlier into this string, and then save the string somewhere - you will need it later.

## Write and deploy the application

We can now start writing our application. To start, we will create a new Deno Deploy playground in the control panel: press the "New Playground" button on https://dash.deno.com/projects. This will open up the playground editor.

Before we can actually start writing code, we'll need to put our Postgres connection string into the environment variables. To do this, click on the project name in the top left corner of the editor. This will open up the project settings.
From here, you can navigate to the "Settings" -> "Environment Variables" tab via the left navigation menu. Enter "DATABASE_URL" into the "Key" field, paste your connection string into the "Value" field, and press "Add". Your environment variable is now set.

Let's return to the editor: to do this, go to the "Overview" tab via the left navigation menu and press "Open Playground". Let's start by serving HTTP requests using `Deno.serve()`:

```ts
Deno.serve(async (req) => {
  return new Response("Not Found", { status: 404 });
});
```

You can already save this code using Ctrl+S (or Cmd+S on Mac). You should see the preview page on the right refresh automatically: it now says "Not Found".

Next, let's import the Postgres module, read the connection string from the environment variables, and create a connection pool.

```ts
import * as postgres from "https://deno.land/x/postgres@v0.14.0/mod.ts";

// Get the connection string from the environment variable "DATABASE_URL"
const databaseUrl = Deno.env.get("DATABASE_URL")!;

// Create a database pool with three connections that are lazily established
const pool = new postgres.Pool(databaseUrl, 3, true);
```

Again, you can save this code now, but this time you should see no changes. We are creating a connection pool, but we are not actually running any queries against the database yet. Before we can do that, we need to set up our table schema.

We want to store a list of todos. Let's create a table called `todos` with an auto-increment `id` column and a `title` column:

```ts
const pool = new postgres.Pool(databaseUrl, 3, true);

// Connect to the database
const connection = await pool.connect();
try {
  // Create the table
  await connection.queryObject`
    CREATE TABLE IF NOT EXISTS todos (
      id SERIAL PRIMARY KEY,
      title TEXT NOT NULL
    )
  `;
} finally {
  // Release the connection back into the pool
  connection.release();
}
```

Now that we have a table, we can add the HTTP handlers for the GET and POST endpoints.
```ts
Deno.serve(async (req) => {
  // Parse the URL and check that the requested endpoint is /todos. If it is
  // not, return a 404 response.
  const url = new URL(req.url);
  if (url.pathname !== "/todos") {
    return new Response("Not Found", { status: 404 });
  }

  // Grab a connection from the database pool
  const connection = await pool.connect();

  try {
    switch (req.method) {
      case "GET": {
        // This is a GET request. Return a list of all todos.

        // Run the query
        const result = await connection.queryObject`
          SELECT * FROM todos
        `;

        // Encode the result as JSON
        const body = JSON.stringify(result.rows, null, 2);

        // Return the result as JSON
        return new Response(body, {
          headers: { "content-type": "application/json" },
        });
      }
      case "POST": {
        // This is a POST request. Create a new todo.

        // Parse the request body as JSON. If the request body fails to parse,
        // is not a string, or is longer than 256 chars, return a 400 response.
        const title = await req.json().catch(() => null);
        if (typeof title !== "string" || title.length > 256) {
          return new Response("Bad Request", { status: 400 });
        }

        // Insert the new todo into the database
        await connection.queryObject`
          INSERT INTO todos (title) VALUES (${title})
        `;

        // Return a 201 Created response
        return new Response("", { status: 201 });
      }
      default:
        // If this is neither a POST nor a GET, return a 405 response.
        return new Response("Method Not Allowed", { status: 405 });
    }
  } catch (err) {
    console.error(err);

    // If an error occurs, return a 500 response
    return new Response(`Internal Server Error\n\n${err.message}`, {
      status: 500,
    });
  } finally {
    // Release the connection back into the pool
    connection.release();
  }
});
```

And there we go: application done. Deploy this code by saving the editor.
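One detail worth highlighting in the POST branch: `req.json()` can resolve to any JSON value, so the handler only accepts a string title of at most 256 characters. Pulled out as a standalone predicate (our own refactoring, for illustration only), the check looks like this:

```typescript
// Accepts exactly the payloads the POST /todos branch allows:
// a JSON string no longer than 256 characters.
function isValidTitle(body: unknown): body is string {
  return typeof body === "string" && body.length <= 256;
}
```

With the predicate in place, the handler's guard becomes `if (!isValidTitle(title)) return new Response("Bad Request", { status: 400 });`, and the length limit is easy to unit-test in isolation.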
You can now POST to the `/todos` endpoint to create a new todo, and you can get a list of all todos by making a GET request to `/todos`:

```sh
$ curl -X GET https://tutorial-postgres.deno.dev/todos
[]

$ curl -X POST -d '"Buy milk"' https://tutorial-postgres.deno.dev/todos

$ curl -X GET https://tutorial-postgres.deno.dev/todos
[
  {
    "id": 1,
    "title": "Buy milk"
  }
]
```

It's all working 🎉

As an extra challenge, try adding a `DELETE /todos/:id` endpoint to delete a todo. The [URLPattern][urlpattern] API can help with this.

[urlpattern]: https://developer.mozilla.org/en-US/docs/Web/API/URL_Pattern_API

---

# Use WordPress as a headless CMS

URL: https://docs.deno.com/deploy/tutorials/tutorial-wordpress-frontend

WordPress is the most popular CMS in the world, but it is difficult to use in a "headless" form, i.e. with a custom frontend. In this tutorial, we show how to use Fresh, a modern web framework built on Deno, to create a frontend for headless WordPress.

## Step 1: Clone the Fresh WordPress theme

Fresh offers two ready-to-go themes, one for a blog and one for a shopfront.

### Blog

```bash
git clone https://github.com/denoland/fresh-wordpress-themes.git
cd fresh-wordpress-themes/blog
deno task docker
```

### Shop

```bash
git clone https://github.com/denoland/fresh-wordpress-themes.git
cd fresh-wordpress-themes/corporate
deno task docker
```

Note that the Blog and Shop themes use different setups for the WordPress server. Make sure you run the `deno task docker` command in the right directory.

## Step 2: Open another terminal in the same directory and run:

```sh
deno task start
```

## Step 3: Visit http://localhost:8000/

You can manage the contents of the site via the WordPress dashboard at http://localhost/wp-admin (username: `user`, password: `password`).

## WordPress hosting options

There are a lot of options for hosting WordPress on the internet.
Many cloud providers [have](https://aws.amazon.com/getting-started/hands-on/launch-a-wordpress-website/) [special](https://cloud.google.com/wordpress) [guides](https://learn.microsoft.com/en-us/azure/app-service/quickstart-wordpress) and [templates](https://console.cloud.google.com/marketplace/product/click-to-deploy-images/wordpress) dedicated to WordPress. There are also dedicated hosting services for WordPress, such as [Bluehost](https://www.bluehost.com/), [DreamHost](https://www.dreamhost.com/), [SiteGround](https://www.siteground.com/), etc. You can choose whichever is the best fit for your needs from these options.

There are also many resources on the internet about how to scale your WordPress instances.

---

# Deploy a React app with Vite

URL: https://docs.deno.com/deploy/tutorials/vite

This tutorial covers how to deploy a React app built with Vite and Deno on Deno Deploy.

## Step 1: Create a Vite app

Let's use [Vite](https://vitejs.dev/) to quickly scaffold a Deno and React app:

```sh
deno run -RWE npm:create-vite-extra@latest
```

We'll name our project `vite-project`. Be sure to select `deno-react` in the project configuration. Then, `cd` into the newly created project folder.

## Step 2: Run the repo locally

To see and edit your new project locally, you can run:

```sh
deno task dev
```

## Step 3: Deploy your project with Deno Deploy

Now that we have everything in place, let's deploy your new project!

1. In your browser, visit [Deno Deploy](https://dash.deno.com/new_project) and link your GitHub account.
2. Select the repository which contains your new Vite project.
3. You can give your project a name or allow Deno to generate one for you.
4. Select **Vite** from the **Framework Preset** dropdown. This will populate the **Entrypoint** form field.
5. Leave the **Install step** empty.
6. Set the **Build step** to `deno task build`.
7. Set the **Root directory** to `dist`.
8. Click **Deploy Project**.

> NB. The entrypoint that is set will be `jsr:@std/http/file-server`.
> Note that this is not a file that exists in the Vite repo itself. Instead, it
> is an external program. When run, this program uploads all the static asset
> files in your current repo (`vite-project/dist`) to Deno Deploy. Then when
> you navigate to the deployment URL, it serves up the local directory.

### `deployctl`

Alternatively, you can use `deployctl` directly to deploy `vite-project` to Deno Deploy.

```console
cd /dist
deployctl deploy --project= --entrypoint=jsr:@std/http/file-server
```

---

# How to use Apollo with Deno

> Step-by-step tutorial on integrating Apollo GraphQL with Deno. Learn how to
> set up an Apollo Server, define schemas, implement resolvers, and build a
> complete GraphQL API using TypeScript.

URL: https://docs.deno.com/examples/tutorials/apollo

[Apollo Server](https://www.apollographql.com/) is a GraphQL server that you can set up in minutes and use with your existing data source (or REST API). You can then connect any GraphQL client to it to receive the data and take advantage of GraphQL benefits, such as type-checking and efficient fetching.

We're going to get a simple Apollo server up and running that will allow us to query some local data. We're only going to need three files for this:

1. `schema.ts` to set up our data model
2. `resolvers.ts` to set up how we're going to populate the data fields in our schema
3. Our `main.ts` where the server is going to launch

We'll start by creating them:

```shell
touch schema.ts resolvers.ts main.ts
```

Let's go through setting up each. [View source here.](https://github.com/denoland/examples/tree/main/with-apollo)

## schema.ts

Our `schema.ts` file describes our data. In this case, our data is a list of dinosaurs. We want our users to be able to get the name and a short description of each dino. In GraphQL language, this means that `Dinosaur` is our **type**, and `name` and `description` are our **fields**. We can also define the data type for each field. In this case, both are strings.
This is also where we describe the queries we allow for our data, using the special **Query** type in GraphQL. We have two queries:

- `dinosaurs`, which gets a list of all dinosaurs
- `dinosaur`, which takes in the `name` of a dinosaur as an argument and returns information about that one type of dinosaur

We're going to export all of this within our `typeDefs` variable:

```tsx
export const typeDefs = `
  type Dinosaur {
    name: String
    description: String
  }

  type Query {
    dinosaurs: [Dinosaur]
    dinosaur(name: String): Dinosaur
  }
`;
```

If we wanted to write data, this is also where we would describe the **Mutation** to do so. Mutations are how you write data with GraphQL. Because we are using a static dataset here, we won't be writing anything.

## resolvers.ts

A resolver is responsible for populating the data for each query. Here we have our list of dinosaurs, and all the resolver is going to do is either a) pass that entire list to the client if the user requests the `dinosaurs` query, or b) pass just one if the user requests the `dinosaur` query.

```tsx
const dinosaurs = [
  {
    name: "Aardonyx",
    description: "An early stage in the evolution of sauropods.",
  },
  {
    name: "Abelisaurus",
    description: '"Abel\'s lizard" has been reconstructed from a single skull.',
  },
];

export const resolvers = {
  Query: {
    dinosaurs: () => dinosaurs,
    dinosaur: (_: any, args: any) => {
      return dinosaurs.find((dinosaur) => dinosaur.name === args.name);
    },
  },
};
```

With the latter, we pass the arguments from the client into a function to match the name to a name in our dataset.
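Because the matching in the `dinosaur` resolver is a plain `Array.prototype.find` over the dataset, the lookup logic can be exercised outside of Apollo entirely. With the same two entries as above:

```typescript
// The tutorial's dataset, reused standalone.
const dinosaurs = [
  {
    name: "Aardonyx",
    description: "An early stage in the evolution of sauropods.",
  },
  {
    name: "Abelisaurus",
    description: '"Abel\'s lizard" has been reconstructed from a single skull.',
  },
];

// Same lookup the resolver performs with the client-supplied `name` argument.
function findDinosaur(name: string) {
  return dinosaurs.find((dinosaur) => dinosaur.name === name);
}
```

An unknown name yields `undefined`, which Apollo serializes as `null` in the GraphQL response.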
## main.ts

In our `main.ts` we're going to import `ApolloServer` as well as `graphql` and our `typeDefs` from the schema and our resolvers:

```tsx
import { ApolloServer } from "npm:@apollo/server@^4.1";
import { startStandaloneServer } from "npm:@apollo/server@^4.1/standalone";
import { graphql } from "npm:graphql@16.6";
import { typeDefs } from "./schema.ts";
import { resolvers } from "./resolvers.ts";

const server = new ApolloServer({
  typeDefs,
  resolvers,
});

const { url } = await startStandaloneServer(server, {
  listen: { port: 8000 },
});

console.log(`Server running on: ${url}`);
```

We pass our `typeDefs` and `resolvers` to `ApolloServer` to spool up a new server. Finally, `startStandaloneServer` is a helper function to get the server up and running quickly.

## Running the server

All that is left to do now is run the server:

```shell
deno run --allow-net --allow-read --allow-env main.ts
```

You should see `Server running on: 127.0.0.1:8000` in your terminal. If you go to that address you will see the Apollo sandbox, where we can enter our `dinosaurs` query:

```graphql
query {
  dinosaurs {
    name
    description
  }
}
```

This will return our dataset:

```json
{
  "data": {
    "dinosaurs": [
      {
        "name": "Aardonyx",
        "description": "An early stage in the evolution of sauropods."
      },
      {
        "name": "Abelisaurus",
        "description": "\"Abel's lizard\" has been reconstructed from a single skull."
      }
    ]
  }
}
```

Or if we want just one `dinosaur`:

```graphql
query {
  dinosaur(name: "Aardonyx") {
    name
    description
  }
}
```

Which returns:

```json
{
  "data": {
    "dinosaur": {
      "name": "Aardonyx",
      "description": "An early stage in the evolution of sauropods."
    }
  }
}
```

Awesome! [Learn more about using Apollo and GraphQL in their tutorials](https://www.apollographql.com/tutorials/).

---

# Build Astro with Deno

> Step-by-step tutorial on building web applications with Astro and Deno.
Learn how to scaffold projects, create dynamic pages, implement SSR, and deploy your Astro sites using Deno's Node.js compatibility.

URL: https://docs.deno.com/examples/tutorials/astro

[Astro](https://astro.build/) is a modern web framework focused on content-centric websites, which leverages islands architecture and sends zero JavaScript to the client by default. And with the recent release of [Deno 2](https://deno.com/2), now [backwards compatible with Node and npm](https://deno.com/blog/v2.0#backwards-compatible-forward-thinking), the experience of using Astro and Deno has improved.

We'll go over how to build a simple Astro project using Deno:

- [Scaffold an Astro project](#scaffold-an-astro-project)
- [Update index page](#update-index-page-to-list-all-dinosaurs)
- [Add a dynamic SSR page](#add-a-dynamic-ssr-page)
- [What's next?](#whats-next)

Feel free to skip directly to [the source code](https://github.com/denoland/examples/tree/main/with-astro) or follow along below!

## Scaffold an Astro project

Astro provides a CLI tool to quickly scaffold a new Astro project. In your terminal, run the command `deno init --npm astro@latest` to create a new Astro project with Deno. For this tutorial, we'll select the "Empty" template so we can start from scratch, and skip installing dependencies so we can install them with Deno later:

```sh
deno init --npm astro@latest

 astro   Launch sequence initiated.

   dir   Where should we create your new project?
         ./dino-app

  tmpl   How would you like to start your new project?
         Empty

    ts   Do you plan to write TypeScript?
         Yes

   use   How strict should TypeScript be?
         Strict

  deps   Install dependencies?
         No
      ◼  No problem! Remember to install dependencies after setup.

   git   Initialize a new git repository?
         Yes

 ✔  Project initialized!
    ■ Template copied
    ■ TypeScript customized
    ■ Git initialized

  next   Liftoff confirmed. Explore your project!
         Enter your project directory using cd ./dino-app
         Run npm run dev to start the dev server. CTRL+C to stop.
Add frameworks like react or tailwind using astro add. Stuck? Join us at https://astro.build/chat ╭─────╮ Houston: │ ◠ ◡ ◠ Good luck out there, astronaut! 🚀 ╰──🍫─╯ ``` As of Deno 2, [Deno can also install packages with the new `deno install` command](https://deno.com/blog/v2.0#deno-is-now-a-package-manager-with-deno-install). So let’s run [`deno install`](https://docs.deno.com/runtime/reference/cli/install/) with the flag `--allow-scripts` to execute any npm lifecycle scripts: ```bash deno install --allow-scripts ``` To see what commands we have, let’s run `deno task`: ```bash deno task Available tasks: - dev (package.json) astro dev - start (package.json) astro dev - build (package.json) astro check && astro build - preview (package.json) astro preview - astro (package.json) astro ``` We can start the Astro server with `deno task dev`: ![Getting the Astro app to work](./images/how-to/astro/hello-astro.png) ## Configure the formatter `deno fmt` supports Astro files with the [`--unstable-component`](https://docs.deno.com/runtime/reference/cli/fmt/#formatting-options-unstable-component) flag. To use it, run this command: ```sh deno fmt --unstable-component ``` To configure `deno fmt` to always format your Astro files, add this at the top level of your `deno.json` file: ```json "unstable": ["fmt-component"] ``` ## Update index page to list all dinosaurs Our app will display facts about a variety of dinosaurs. The first page to create will be the index page that lists links to all dinosaurs in our “database”. First, let’s create the data that will be used in the app. In this example, we’ll hardcode the data in a json file, but you can use any data storage in practice. We’ll create a `data` folder in the root of the project, then a `dinosaurs.json` file with [this text](https://github.com/denoland/tutorial-with-react/blob/main/api/data.json) in it. > ⚠️️ In this tutorial we hard code the data. 
But you can connect to
> [a variety of databases](https://docs.deno.com/runtime/tutorials/connecting_to_databases/)
> and
> [even use ORMs like Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/)
> with Deno.

Once we have the data, let's create an index page that lists all of the dinosaurs. In the `./src/pages/index.astro` page, let's write the following:

```jsx
---
import data from "../../data/dinosaurs.json";
---

<html lang="en">
  <head>
    <title>Dinosaurs</title>
  </head>
  <body>
    <h1>Dinosaurs</h1>
    <ul>
      {
        data.map((dinosaur) => (
          <li>
            <a href={`/${dinosaur.name.toLowerCase()}`}>{dinosaur.name}</a>
          </li>
        ))
      }
    </ul>
  </body>
</html>
``` Let’s start the server with `deno task dev` and point our browser to `localhost:4321`: ![Index page that lists all dinosaurs](./images/how-to/astro/index-page.webp) Awesome! But when you click on a dinosaur, it 404’s. Let’s fix that. ## Add a dynamic SSR page Our app will display facts about a variety of dinosaurs. In order to do that, we’ll create a dynamic server-side rendered (”SSR”), which [offers better performance for end users while improving your pages SEO](https://deno.com/blog/the-future-and-past-is-server-side-rendering). Next, let’s create a new file under `/src/pages/` called `[dinosaur].astro`. At the top of the file, we'll add some logic to pull data from our hardcoded data source and filter that against the `dinosaur` parameter set from the URL path. At the bottom, we’ll render the data. Your file should look like this: ```jsx --- import data from "../../data/dinosaurs.json"; const { dinosaur } = Astro.params; const dinosaurObj = data.find((item) => item.name.toLowerCase() === dinosaur); if (!dinosaurObj) return Astro.redirect("/404"); const { name, description } = dinosaurObj; ---

<h1>{name}</h1>
<p>{description}</p>

``` > ⚠️️ The > [Deno language server](https://docs.deno.com/runtime/reference/lsp_integration/) > does not currently support `.astro` files, so you may experience false red > squigglies. We're working on improving this experience. Let’s run it with `deno task dev`, and point our browser to `localhost:4321/abrictosaurus`: ![Rendering a dynamic page for abrictosaurus](./images/how-to/astro/dynamic-page.webp) It works! ## What’s next We hope this tutorial gives you a good idea of how to get started building with Astro and Deno. You can learn more about Astro and [their progressive approach to building websites](https://docs.astro.build/en/getting-started/). If you’re interested in swapping out our hardcoded data store, here are some resources on [connecting to databases with Deno](https://docs.deno.com/runtime/tutorials/connecting_to_databases/), including [Planetscale](https://docs.deno.com/runtime/tutorials/how_to_with_npm/planetscale/), [Redis](https://docs.deno.com/runtime/tutorials/how_to_with_npm/redis/), and more. Or you can learn how to [deploy your Astro project to Deno Deploy](https://deno.com/blog/astro-on-deno), or follow these guides on how to self-host Deno to [AWS](https://docs.deno.com/runtime/tutorials/aws_lightsail/), [Digital Ocean](https://docs.deno.com/runtime/tutorials/digital_ocean/), and [Google Cloud Run](https://docs.deno.com/runtime/tutorials/google_cloud_run/). --- # How to Deploy Deno to AWS Lambda > Step-by-step tutorial on deploying Deno applications to AWS Lambda. Learn about Docker containerization, ECR repositories, function configuration, and how to set up serverless Deno apps on AWS. URL: https://docs.deno.com/examples/tutorials/aws_lambda AWS Lambda is a serverless computing service provided by Amazon Web Services. It allows you to run code without provisioning or managing servers. Here's a step by step guide to deploying a Deno app to AWS Lambda using Docker. 
The pre-requisites for this are: - [`docker` CLI](https://docs.docker.com/reference/cli/docker/) - an [AWS account](https://aws.amazon.com) - [`aws` CLI](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html) ## Step 1: Create a Deno App Create a new Deno app using the following code: ```ts title="main.ts" Deno.serve((req) => new Response("Hello World!")); ``` Save this code in a file named `main.ts`. ## Step 2: Create a Dockerfile Create a new file named `Dockerfile` with the following content: ```Dockerfile # Set up the base image FROM public.ecr.aws/awsguru/aws-lambda-adapter:0.9.0 AS aws-lambda-adapter FROM denoland/deno:bin-1.45.2 AS deno_bin FROM debian:bookworm-20230703-slim AS deno_runtime COPY --from=aws-lambda-adapter /lambda-adapter /opt/extensions/lambda-adapter COPY --from=deno_bin /deno /usr/local/bin/deno ENV PORT=8000 EXPOSE 8000 RUN mkdir /var/deno_dir ENV DENO_DIR=/var/deno_dir # Copy the function code WORKDIR "/var/task" COPY . /var/task # Warmup caches RUN timeout 10s deno run -A main.ts || [ $? -eq 124 ] || exit 1 CMD ["deno", "run", "-A", "main.ts"] ``` This Dockerfile uses the [`aws-lambda-adapter`](https://github.com/awslabs/aws-lambda-web-adapter) project to adapt regular HTTP servers, like Deno's `Deno.serve`, to the AWS Lambda runtime API. We also use the `denoland/deno:bin-1.45.2` image to get the Deno binary and `debian:bookworm-20230703-slim` as the base image. The `debian:bookworm-20230703-slim` image is used to keep the image size small. The `PORT` environment variable is set to `8000` to tell the AWS Lambda adapter that we are listening on port `8000`. We set the `DENO_DIR` environment variable to `/var/deno_dir` to store cached Deno source code and transpiled modules in the `/var/deno_dir` directory. The warmup caches step is used to warm up the Deno cache before the function is invoked. This is done to reduce the cold start time of the function. 
These caches contain the compiled code and dependencies of your function code. This step starts your server for 10 seconds and then exits. When using a package.json, remember to run `deno install` to install `node_modules` from your `package.json` file before warming up the caches or running the function. ## Step 3: Build the Docker Image Build the Docker image using the following command: ```bash docker build -t hello-world . ``` ## Step 4: Create an ECR Docker repository and push the image With the AWS CLI, create an ECR repository and push the Docker image to it: ```bash aws ecr create-repository --repository-name hello-world --region us-east-1 | grep repositoryUri ``` This should output a repository URI that looks like `.dkr.ecr.us-east-1.amazonaws.com/hello-world`. Authenticate Docker with ECR, using the repository URI from the previous step: ```bash aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin .dkr.ecr.us-east-1.amazonaws.com ``` Tag the Docker image with the repository URI, again using the repository URI from the previous steps: ```bash docker tag hello-world:latest .dkr.ecr.us-east-1.amazonaws.com/hello-world:latest ``` Finally, push the Docker image to the ECR repository, using the repository URI from the previous steps: ```bash docker push .dkr.ecr.us-east-1.amazonaws.com/hello-world:latest ``` ## Step 5: Create an AWS Lambda function Now you can create a new AWS Lambda function from the AWS Management Console. 1. Go to the AWS Management Console and [navigate to the Lambda service](https://us-east-1.console.aws.amazon.com/lambda/home?region=us-east-1). 2. Click on the "Create function" button. 3. Choose "Container image". 4. Enter a name for the function, like "hello-world". 5. Click on the "Browse images" button and select the image you pushed to ECR. 6. Click on the "Create function" button. 7. Wait for the function to be created. 8. 
In the "Configuration" tab, go to the "Function URL" section and click on "Create function URL".
9. Choose "NONE" for the auth type (this will make the lambda function publicly accessible).
10. Click on the "Save" button.

## Step 6: Test the Lambda function

You can now visit your Lambda function's URL to see the response from your Deno app.

🦕 You have successfully deployed a Deno app to AWS Lambda using Docker. You can now use this setup to deploy more complex Deno apps to AWS Lambda.

---

# How to Deploy Deno to AWS Lightsail

> Step-by-step tutorial on deploying Deno applications to AWS Lightsail. Learn about Docker containers, GitHub Actions automation, continuous deployment, and how to set up cost-effective cloud hosting for Deno apps.

URL: https://docs.deno.com/examples/tutorials/aws_lightsail

[Amazon Lightsail](https://aws.amazon.com/lightsail/) is the easiest and cheapest way to get started with Amazon Web Services. It allows you to host virtual machines and even entire container services. This How To guide will show you how to deploy a Deno app to Amazon Lightsail using Docker, Docker Hub, and GitHub Actions.

Before continuing, make sure you have:

- [`docker` CLI](https://docs.docker.com/engine/reference/commandline/cli/)
- a [Docker Hub account](https://hub.docker.com)
- a [GitHub account](https://github.com)
- an [AWS account](https://aws.amazon.com/)

## Create Dockerfile and docker-compose.yml

To focus on the deployment, our app will simply be a `main.ts` file that returns a string as an HTTP response:

```ts
import { Application } from "jsr:@oak/oak";

const app = new Application();

app.use((ctx) => {
  ctx.response.body = "Hello from Deno and AWS Lightsail!";
});

await app.listen({ port: 8000 });
```

Then, we'll create two files -- `Dockerfile` and `docker-compose.yml` -- to build the Docker image.

In our `Dockerfile`, let's add:

```Dockerfile
FROM denoland/deno

EXPOSE 8000

WORKDIR /app

ADD . /app

RUN deno install --entrypoint main.ts

CMD ["run", "--allow-net", "main.ts"]
```

Then, in our `docker-compose.yml`:

```yml
version: "3"

services:
  web:
    build: .
    container_name: deno-container
    image: deno-image
    ports:
      - "8000:8000"
```

Let's test this locally by running `docker compose -f docker-compose.yml build`, then `docker compose up`, and going to `localhost:8000`.

![hello world from localhost](./images/how-to/aws-lightsail/hello-world-from-localhost.png)

It works!

## Build, Tag, and Push to Docker Hub

First, let's sign into [Docker Hub](https://hub.docker.com/repositories) and create a repository. Let's name it `deno-on-aws-lightsail`.

Then we'll build, tag, and push our new image, replacing `username` with yours.

First, let's build the image locally. Note that our `docker-compose.yml` file will name the build `deno-image`:

```shell
docker compose -f docker-compose.yml build
```

Let's [tag](https://docs.docker.com/engine/reference/commandline/tag/) the local image with `{{ username }}/deno-on-aws-lightsail`:

```shell
docker tag deno-image {{ username }}/deno-on-aws-lightsail
```

We can now push the image to Docker Hub:

```shell
docker push {{ username }}/deno-on-aws-lightsail
```

After that succeeds, you should be able to see the new image on your Docker Hub repository:

![new image on docker hub](./images/how-to/aws-lightsail/new-image-on-docker-hub.png)

## Create and Deploy to a Lightsail Container

Let's head over to [the Amazon Lightsail console](https://lightsail.aws.amazon.com/ls/webapp/home/container-services).

Then click "Containers" and "Create container service". Halfway down the page, click "Setup your first Deployment" and select "Specify a custom deployment". You can write whatever container name you'd like.

In `Image`, be sure to use `{{ username }}/{{ image }}` that you have set in your Docker Hub. For this example, it is `lambtron/deno-on-aws-lightsail`.

Let's click `Add open ports` and add `8000`.
Finally, under `PUBLIC ENDPOINT`, select the container name that you just created. The full form should look like below:

![create container service interface](./images/how-to/aws-lightsail/create-container-service-on-aws.png)

When you're ready, click "Create container service". After a few moments, your new container should be deployed. Click on the public address and you should see your Deno app:

![Hello world from Deno and AWS Lightsail](./images/how-to/aws-lightsail/hello-world-from-deno-and-aws-lightsail.png)

## Automate using GitHub Actions

In order to automate that process, we'll use the `aws` CLI with the [`lightsail` subcommand](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/lightsail/push-container-image.html).

The steps in our GitHub Actions workflow will be:

1. Checkout the repo
2. Build our app as a Docker image locally
3. Install and authenticate AWS CLI
4. Push local Docker image to AWS Lightsail Container Service via CLI

Pre-requisites for this GitHub Action workflow to work:

- an AWS Lightsail Container Instance is created (see section above)
- IAM user and relevant permissions set. ([Learn more about managing access to Amazon Lightsail for an IAM user.](https://docs.aws.amazon.com/lightsail/latest/userguide/amazon-lightsail-managing-access-for-an-iam-user.html))
- `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` for your user with permissions. (Follow [this AWS guide](https://lightsail.aws.amazon.com/ls/docs/en_us/articles/lightsail-how-to-set-up-access-keys-to-use-sdk-api-cli) to generate an `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`.)

Let's create a new file `container.template.json`, which contains configuration for how to make the service container deployment. Note the similarities these option values have with the inputs we entered manually in the previous section.
```json { "containers": { "app": { "image": "", "environment": { "APP_ENV": "release" }, "ports": { "8000": "HTTP" } } }, "publicEndpoint": { "containerName": "app", "containerPort": 8000, "healthCheck": { "healthyThreshold": 2, "unhealthyThreshold": 2, "timeoutSeconds": 5, "intervalSeconds": 10, "path": "/", "successCodes": "200-499" } } } ``` Let's add the below to your `.github/workflows/deploy.yml` file: ```yml name: Build and Deploy to AWS Lightsail on: push: branches: - main env: AWS_REGION: us-west-2 AWS_LIGHTSAIL_SERVICE_NAME: container-service-2 jobs: build_and_deploy: name: Build and Deploy runs-on: ubuntu-latest steps: - name: Checkout main uses: actions/checkout@v4 - name: Install Utilities run: | sudo apt-get update sudo apt-get install -y jq unzip - name: Install AWS Client run: | curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" unzip awscliv2.zip sudo ./aws/install || true aws --version curl "https://s3.us-west-2.amazonaws.com/lightsailctl/latest/linux-amd64/lightsailctl" -o "lightsailctl" sudo mv "lightsailctl" "/usr/local/bin/lightsailctl" sudo chmod +x /usr/local/bin/lightsailctl - name: Configure AWS credentials uses: aws-actions/configure-aws-credentials@v1 with: aws-region: ${{ env.AWS_REGION }} aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }} aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }} - name: Build Docker Image run: docker build -t ${{ env.AWS_LIGHTSAIL_SERVICE_NAME }}:release . 
- name: Push and Deploy run: | service_name=${{ env.AWS_LIGHTSAIL_SERVICE_NAME }} aws lightsail push-container-image \ --region ${{ env.AWS_REGION }} \ --service-name ${service_name} \ --label ${service_name} \ --image ${service_name}:release aws lightsail get-container-images --service-name ${service_name} | jq --raw-output ".containerImages[0].image" > image.txt jq --arg image $(cat image.txt) '.containers.app.image = $image' container.template.json > container.json aws lightsail create-container-service-deployment --service-name ${service_name} --cli-input-json file://$(pwd)/container.json ``` Whoa there is a lot going on here! The last two steps are most important: `Build Docker Image` and `Push and Deploy`. ```shell docker build -t ${{ env.AWS_LIGHTSAIL_SERVICE_NAME }}:release . ``` This command builds our Docker image with the name `container-service-2` and tags it `release`. ```shell aws lightsail push-container-image ... ``` This command pushes the local image to our Lightsail container. ```shell aws lightsail get-container-images --service-name ${service_name} | jq --raw-output ".containerImages[0].image" > image.txt ``` This command retrieves the image information and, using [`jq`](https://stedolan.github.io/jq/), parses it and saves the image name in a local file `image.txt`. ```shell jq --arg image $(cat image.txt) '.containers.app.image = $image' container.template.json > container.json ``` This command uses the image name saved in `image.txt` and `container.template.json` and creates a new options file called `container.json`. This options file will be passed to `aws lightsail` for the final deployment in the next step. ```shell aws lightsail create-container-service-deployment --service-name ${service_name} --cli-input-json file://$(pwd)/container.json ``` Finally, this command creates a new deployment using the `service_name`, along with the config settings in `container.json`. 
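If `jq` is unfamiliar, the substitution step amounts to parsing the template, setting one field, and serializing it again. This TypeScript sketch mirrors that behavior (the image string is a hypothetical placeholder for whatever `image.txt` contains):

```typescript
// Rough equivalent of:
//   jq --arg image $(cat image.txt) '.containers.app.image = $image' \
//     container.template.json > container.json
const template = {
  containers: {
    app: {
      image: "",
      environment: { APP_ENV: "release" },
      ports: { "8000": "HTTP" },
    },
  },
  publicEndpoint: { containerName: "app", containerPort: 8000 },
};

// Hypothetical value read from image.txt after `aws lightsail push-container-image`
const image = "container-service-2.latest.1";
template.containers.app.image = image;

// container.json now carries the freshly pushed image reference
console.log(JSON.stringify(template, null, 2));
```

The only field that changes between deployments is `containers.app.image`; everything else comes straight from the template.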
When you push to GitHub and the Action succeeds, you'll be able to see your new Deno app on AWS: ![deno on aws](./images/how-to/aws-lightsail/hello-world-from-deno-and-aws-lightsail.png) 🦕 Now you can deploy a Deno app to Amazon Lightsail using Docker, Docker Hub, and GitHub Actions. --- # Getting Started with OpenTelemetry in Deno > Set up basic OpenTelemetry instrumentation in a Deno application. This tutorial covers creating a simple HTTP server with custom metrics and traces, and viewing the telemetry data. URL: https://docs.deno.com/examples/tutorials/basic_opentelemetry OpenTelemetry provides powerful observability tools for your applications. With Deno's built-in OpenTelemetry support, you can easily instrument your code to collect metrics, traces, and logs. This tutorial will walk you through setting up a simple Deno application with OpenTelemetry instrumentation. ## Prerequisites - Deno 2.3 or later ## Step 1: Create a Simple HTTP Server Let's start by creating a basic HTTP server that simulates a small web application: ```ts title="server.ts" import { metrics, trace } from "npm:@opentelemetry/api@1"; // Create a tracer and meter for our application const tracer = trace.getTracer("my-server", "1.0.0"); const meter = metrics.getMeter("my-server", "1.0.0"); // Create some metrics const requestCounter = meter.createCounter("http_requests_total", { description: "Total number of HTTP requests", }); const requestDuration = meter.createHistogram("http_request_duration_ms", { description: "HTTP request duration in milliseconds", unit: "ms", }); // Start the server Deno.serve({ port: 8000 }, (req) => { // Record the start time for measuring request duration const startTime = performance.now(); // Create a span for this request return tracer.startActiveSpan("handle_request", async (span) => { try { // Extract the path from the URL const url = new URL(req.url); const path = url.pathname; // Add attributes to the span span.setAttribute("http.route", path); 
span.setAttribute("http.method", req.method);
      span.updateName(`${req.method} ${path}`);

      // Add an event to the span
      span.addEvent("request_started", {
        timestamp: startTime,
        request_path: path,
      });

      // Simulate some processing time
      const waitTime = Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, waitTime));

      // Add another event to the span
      span.addEvent("processing_completed");

      // Create the response
      const response = new Response(`Hello from ${path}!`, {
        headers: { "Content-Type": "text/plain" },
      });

      // Record metrics
      requestCounter.add(1, {
        method: req.method,
        path,
        status: 200,
      });

      const duration = performance.now() - startTime;
      requestDuration.record(duration, {
        method: req.method,
        path,
      });

      span.setAttribute("request.duration_ms", duration);

      return response;
    } catch (error) {
      // Record error in span
      if (error instanceof Error) {
        span.recordException(error);
        span.setStatus({
          // SpanStatusCode.ERROR; `import { SpanStatusCode } from "npm:@opentelemetry/api@1"`
          code: 2,
          message: error.message,
        });
      }
      return new Response("Internal Server Error", { status: 500 });
    } finally {
      // Always end the span
      span.end();
    }
  });
});
```

This server:

1. Creates a tracer and meter for our application
2. Sets up metrics to count requests and measure their duration
3. Creates a span for each request with attributes and events
4. Simulates some processing time
5.
Records metrics for each request ## Step 2: Run the Server with OpenTelemetry Enabled To run the server with OpenTelemetry, use these flags: ```sh OTEL_DENO=true OTEL_SERVICE_NAME=my-server deno run --unstable-otel --allow-net server.ts ``` ## Step 3: Create a Test Client Let's create a simple client to send requests to our server: ```ts title="client.ts" // Send 10 requests to different paths for (let i = 0; i < 10; i++) { const path = ["", "about", "users", "products", "contact"][i % 5]; const url = `http://localhost:8000/${path}`; console.log(`Sending request to ${url}`); try { const response = await fetch(url); const text = await response.text(); console.log(`Response from ${url}: ${text}`); } catch (error) { console.error(`Error fetching ${url}:`, error); } } ``` ## Step 4: Run the Client In a separate terminal, run the client: ```sh deno run --allow-net client.ts ``` ## Step 5: View the Telemetry Data By default, Deno exports telemetry data to `http://localhost:4318` using the OTLP protocol. You'll need an OpenTelemetry collector to receive and visualize this data. ### Setting up a Local Collector The quickest way to get started is with a local LGTM stack (Loki, Grafana, Tempo, Mimir) in Docker: ```sh docker run --name lgtm -p 3000:3000 -p 4317:4317 -p 4318:4318 --rm -ti \ -v "$PWD"/lgtm/grafana:/data/grafana \ -v "$PWD"/lgtm/prometheus:/data/prometheus \ -v "$PWD"/lgtm/loki:/data/loki \ -e GF_PATHS_DATA=/data/grafana \ docker.io/grafana/otel-lgtm:0.8.1 ``` Then access Grafana at http://localhost:3000 (username: admin, password: admin). In Grafana, you can: 1. View **Traces** in Tempo to see the individual request spans 2. View **Metrics** in Mimir/Prometheus to see request counts and durations 3. 
View **Logs** in Loki to see any logs from your application ## Understanding What You're Seeing ### Traces In the Traces view, you'll see spans for: - Each HTTP request processed by your server - Each fetch request made by your client - The relationships between these spans Click on any span to see its details, including: - Duration - Attributes (http.route, http.method, etc.) - Events (request_started, processing_completed) ### Metrics In the Metrics view, you can query for: - `http_requests_total` - The counter tracking the number of HTTP requests - `http_request_duration_ms` - The histogram of request durations You can also see built-in Deno metrics like: - `http.server.request.duration` - `http.server.active_requests` ### Logs In the Logs view, you'll see all console logs from your application with correct trace context. ## Troubleshooting If you're not seeing data in your collector: 1. Check that you've set `OTEL_DENO=true` and used the `--unstable-otel` flag 2. Verify the collector is running and accessible at the default endpoint 3. Check if you need to set `OTEL_EXPORTER_OTLP_ENDPOINT` to a different URL 4. Look for errors in your Deno console output Remember that OpenTelemetry support in Deno is still marked as unstable and may change in future versions. 🦕 This tutorial provides a simple starting point for users who want to experiment with OpenTelemetry in Deno without diving into more complex concepts immediately. This basic example can be extended in many ways: - Add more custom metrics for business logic - Create additional spans for important operations - Use baggage to pass context attributes between services - Set up alerts based on metrics thresholds For more advanced usage, see our [Distributed Tracing with Context Propagation](/examples/otel_span_propagation_tutorial/) tutorial. --- # Behavior-Driven Development (BDD) > Implementing Behavior-Driven Development with Deno's Standard Library's BDD module. 
Create readable, well organised tests with effective assertions. URL: https://docs.deno.com/examples/tutorials/bdd Behavior-Driven Development (BDD) is an approach to software development that encourages collaboration between developers, QA, and non-technical stakeholders. BDD focuses on defining the behavior of an application through examples written in a natural, ubiquitous language that all stakeholders can understand. Deno's Standard Library provides a BDD-style testing module that allows you to structure tests in a way that's both readable for non-technical stakeholders and practical for implementation. In this tutorial, we'll explore how to use the BDD module to create descriptive test suites for your applications. ## Introduction to BDD BDD extends [Test-Driven Development](https://en.wikipedia.org/wiki/Test-driven_development) (TDD) by writing tests in a natural language that is easy to read. Rather than thinking about "tests," BDD encourages us to consider "specifications" or "specs" that describe how software should behave from the user's perspective. This approach helps to keep tests focused on what the code should do rather than how it is implemented. The basic elements of BDD include: - **Describe** blocks that group related specifications - **It** statements that express a single behavior - **Before/After** hooks for setup and teardown operations ## Using Deno's BDD module To get started with BDD testing in Deno, we'll use the `@std/testing/bdd` module from the [Deno Standard Library](https://jsr.io/@std/testing/doc/bdd). 
First, let's import the necessary functions: ```ts import { afterAll, afterEach, beforeAll, beforeEach, describe, it, } from "jsr:@std/testing/bdd"; import { assertEquals, assertThrows } from "jsr:@std/assert"; ``` These imports provide the core BDD functions: - `describe` creates a block that groups related tests - `it` declares a test case that verifies a specific behavior - `beforeEach`/`afterEach` run before or after each test case - `beforeAll`/`afterAll` run once before or after all tests in a describe block We'll also use assertion functions from [`@std/assert`](https://jsr.io/@std/assert) to verify our expectations. ### Writing your first BDD test Let's create a simple calculator module and test it using BDD: ```ts title="calculator.ts" export class Calculator { private value: number = 0; constructor(initialValue: number = 0) { this.value = initialValue; } add(number: number): Calculator { this.value += number; return this; } subtract(number: number): Calculator { this.value -= number; return this; } multiply(number: number): Calculator { this.value *= number; return this; } divide(number: number): Calculator { if (number === 0) { throw new Error("Cannot divide by zero"); } this.value /= number; return this; } get result(): number { return this.value; } } ``` Now, let's test this calculator using the BDD style: ```ts title="calculator_test.ts" import { afterEach, beforeEach, describe, it } from "jsr:@std/testing/bdd"; import { assertEquals, assertThrows } from "jsr:@std/assert"; import { Calculator } from "./calculator.ts"; describe("Calculator", () => { let calculator: Calculator; // Before each test, create a new Calculator instance beforeEach(() => { calculator = new Calculator(); }); it("should initialize with zero", () => { assertEquals(calculator.result, 0); }); it("should initialize with a provided value", () => { const initializedCalculator = new Calculator(10); assertEquals(initializedCalculator.result, 10); }); describe("add method", () => { 
it("should add a positive number correctly", () => { calculator.add(5); assertEquals(calculator.result, 5); }); it("should handle negative numbers", () => { calculator.add(-5); assertEquals(calculator.result, -5); }); it("should be chainable", () => { calculator.add(5).add(10); assertEquals(calculator.result, 15); }); }); describe("subtract method", () => { it("should subtract a number correctly", () => { calculator.subtract(5); assertEquals(calculator.result, -5); }); it("should be chainable", () => { calculator.subtract(5).subtract(10); assertEquals(calculator.result, -15); }); }); describe("multiply method", () => { beforeEach(() => { // For multiplication tests, start with value 10 calculator = new Calculator(10); }); it("should multiply by a number correctly", () => { calculator.multiply(5); assertEquals(calculator.result, 50); }); it("should be chainable", () => { calculator.multiply(2).multiply(3); assertEquals(calculator.result, 60); }); }); describe("divide method", () => { beforeEach(() => { // For division tests, start with value 10 calculator = new Calculator(10); }); it("should divide by a number correctly", () => { calculator.divide(2); assertEquals(calculator.result, 5); }); it("should throw when dividing by zero", () => { assertThrows( () => calculator.divide(0), Error, "Cannot divide by zero", ); }); }); }); ``` To run this test, use the `deno test` command: ```sh deno test calculator_test.ts ``` You'll see output similar to this: ```sh running 1 test from file:///path/to/calculator_test.ts Calculator ✓ should initialize with zero ✓ should initialize with a provided value add method ✓ should add a positive number correctly ✓ should handle negative numbers ✓ should be chainable subtract method ✓ should subtract a number correctly ✓ should be chainable multiply method ✓ should multiply by a number correctly ✓ should be chainable divide method ✓ should divide by a number correctly ✓ should throw when dividing by zero ok | 11 passed | 0 failed (234ms) 
``` ## Organizing tests with nested describe blocks One of the powerful features of BDD is the ability to nest `describe` blocks, which helps organize tests hierarchically. In the calculator example, we grouped tests for each method within their own `describe` blocks. This not only makes the tests more readable, but also makes it easier to locate issues when the test fails. You can nest `describe` blocks, but be cautious of nesting too deep as excessive nesting can make tests harder to follow. ## Hooks The BDD module provides four hooks: - `beforeEach` runs before each test in the current describe block - `afterEach` runs after each test in the current describe block - `beforeAll` runs once before all tests in the current describe block - `afterAll` runs once after all tests in the current describe block ### beforeEach/afterEach These hooks are ideal for: - Setting up a fresh test environment for each test - Cleaning up resources after each test - Ensuring test isolation In the calculator example, we used `beforeEach` to create a new calculator instance before each test, ensuring each test starts with a clean state. ### beforeAll/afterAll These hooks are useful for: - Expensive setup operations that can be shared across tests - Setting up and tearing down database connections - Creating and cleaning up shared resources Here's an example of how you might use `beforeAll` and `afterAll`: ```ts describe("Database operations", () => { let db: Database; beforeAll(async () => { // Connect to the database once before all tests db = await Database.connect(TEST_CONNECTION_STRING); await db.migrate(); }); afterAll(async () => { // Disconnect after all tests are complete await db.close(); }); it("should insert a record", async () => { const result = await db.insert({ name: "Test" }); assertEquals(result.success, true); }); it("should retrieve a record", async () => { const record = await db.findById(1); assertEquals(record.name, "Test"); }); }); ``` ## Gherkin vs. 
JavaScript-style BDD If you're familiar with Cucumber or other BDD frameworks, you might be expecting Gherkin syntax with "Given-When-Then" statements. Deno's BDD module uses a JavaScript-style syntax rather than Gherkin. This approach is similar to other JavaScript testing frameworks like Mocha or Jasmine. However, you can still follow BDD principles by: 1. Writing clear, behavior-focused test descriptions 2. Structuring your tests to reflect user stories 3. Following the "Arrange-Act-Assert" pattern in your test implementations For example, you can structure your `it` blocks to mirror the Given-When-Then format: ```ts describe("Calculator", () => { it("should add numbers correctly", () => { // Given const calculator = new Calculator(); // When calculator.add(5); // Then assertEquals(calculator.result, 5); }); }); ``` If you need full Gherkin support with natural language specifications, consider using a dedicated BDD framework that integrates with Deno, such as [cucumber-js](https://github.com/cucumber/cucumber-js). ## Best Practices for BDD with Deno ### Write your tests for humans to read BDD tests should read like documentation. Use clear, descriptive language in your `describe` and `it` statements: ```ts // Good describe("User authentication", () => { it("should reject login with incorrect password", () => { // Test code }); }); // Not good describe("auth", () => { it("bad pw fails", () => { // Test code }); }); ``` ### Keep tests focused Each test should verify a single behavior. 
Avoid testing multiple behaviors in a single `it` block: ```ts // Good it("should add an item to the cart", () => { // Test adding to cart }); it("should calculate the correct total", () => { // Test total calculation }); // Bad it("should add an item and calculate total", () => { // Test adding to cart // Test total calculation }); ``` ### Use context-specific setup When tests within a describe block need different setup, use nested describes with their own `beforeEach` hooks rather than conditional logic: ```ts // Good describe("User operations", () => { describe("when user is logged in", () => { beforeEach(() => { // Setup logged-in user }); it("should show the dashboard", () => { // Test }); }); describe("when user is logged out", () => { beforeEach(() => { // Setup logged-out state }); it("should redirect to login", () => { // Test }); }); }); // Avoid describe("User operations", () => { beforeEach(() => { // Setup base state if (isLoggedInTest) { // Setup logged-in state } else { // Setup logged-out state } }); it("should show dashboard when logged in", () => { isLoggedInTest = true; // Test }); it("should redirect to login when logged out", () => { isLoggedInTest = false; // Test }); }); ``` ### Handle asynchronous tests properly When testing asynchronous code, remember to: - Mark your test functions as `async` - Use `await` for promises - Handle errors properly ```ts it("should fetch user data asynchronously", async () => { const user = await fetchUser(1); assertEquals(user.name, "John Doe"); }); ``` 🦕 By following the BDD principles and practices outlined in this tutorial, you can build more reliable software and solidify your reasoning about the 'business logic' of your code. Remember that BDD is not just about the syntax or tools but about the collaborative approach to defining and verifying application behavior. 
The most successful BDD implementations combine these technical practices with regular conversations between developers, testers, product and business stakeholders. To continue learning about testing in Deno, explore other modules in the Standard Library's testing suite, such as [mocking](/examples/mocking_tutorial/) and [snapshot testing](/examples/snapshot_tutorial/).

---

# Chat application with WebSockets

> A tutorial on building a real-time chat app using Deno WebSockets. Learn how to create a WebSocket server with Oak, handle multiple client connections, manage state, and build an interactive chat interface with HTML, CSS, and JavaScript.

URL: https://docs.deno.com/examples/tutorials/chat_app

WebSockets are a powerful tool for building real-time applications. They allow for bidirectional communication between the client and server without the need for constant polling. A frequent use case for WebSockets is chat applications. In this tutorial we'll create a simple chat app using Deno and the built-in [WebSockets API](/api/web/websockets). The chat app will allow multiple chat clients to connect to the same backend and send group messages. After a client enters a username, they can then start sending messages to other online clients. Each client also displays the list of currently active users.

You can see the [finished chat app on GitHub](https://github.com/denoland/tutorial-with-websockets).

![Chat app UI](./images/websockets.gif)

## Initialize a new project

First, create a new directory for your project and navigate into it:

```sh
deno init chat-app
cd chat-app
```

## Build the backend

We'll start by building the backend server that will handle the WebSocket connections and broadcast messages to all connected clients. We'll use the [`oak`](https://jsr.io/@oak/oak) middleware framework to set up our server. Clients can connect to the server, send messages and receive updates about other connected users.
Additionally, the server will serve the static HTML, CSS and JavaScript files that make up the chat client.

### Import dependencies

First, we'll need to import the necessary dependencies. Use the `deno add` command to add Oak to your project:

```sh
deno add jsr:@oak/oak
```

### Set up the server

In your `main.ts` file, add the following code:

```ts title="main.ts"
import { Application, Context, Router } from "@oak/oak";
import ChatServer from "./ChatServer.ts";

const app = new Application();
const port = 8080;
const router = new Router();
const server = new ChatServer();

router.get("/start_web_socket", (ctx: Context) => server.handleConnection(ctx));

app.use(router.routes());
app.use(router.allowedMethods());
app.use(async (context) => {
  await context.send({
    root: Deno.cwd(),
    index: "public/index.html",
  });
});

console.log("Listening at http://localhost:" + port);
await app.listen({ port });
```

Next, create a new file called `ChatServer.ts` in the same directory as your `main.ts` file. In this file we'll put the logic for handling the WebSocket connections:

```ts title="ChatServer.ts"
import { Context } from "@oak/oak";

type WebSocketWithUsername = WebSocket & { username: string };
type AppEvent = { event: string; [key: string]: any };

export default class ChatServer {
  private connectedClients = new Map<string, WebSocketWithUsername>();

  public async handleConnection(ctx: Context) {
    const socket = await ctx.upgrade() as WebSocketWithUsername;
    const username = ctx.request.url.searchParams.get("username");

    // Reject connections that don't supply a username
    if (!username) {
      socket.close(1008, "A username is required");
      return;
    }

    if (this.connectedClients.has(username)) {
      socket.close(1008, `Username ${username} is already taken`);
      return;
    }

    socket.username = username;
    socket.onopen = this.broadcastUsernames.bind(this);
    socket.onclose = () => {
      this.clientDisconnected(socket.username);
    };
    socket.onmessage = (m) => {
      this.send(socket.username, m);
    };
    this.connectedClients.set(username, socket);

    console.log(`New client connected: ${username}`);
  }

  private send(username: string, message: any) {
    const data =
JSON.parse(message.data);
    if (data.event !== "send-message") {
      return;
    }

    this.broadcast({
      event: "send-message",
      username: username,
      message: data.message,
    });
  }

  private clientDisconnected(username: string) {
    this.connectedClients.delete(username);
    this.broadcastUsernames();

    console.log(`Client ${username} disconnected`);
  }

  private broadcastUsernames() {
    const usernames = [...this.connectedClients.keys()];
    this.broadcast({ event: "update-users", usernames });

    console.log("Sent username list:", JSON.stringify(usernames));
  }

  private broadcast(message: AppEvent) {
    const messageString = JSON.stringify(message);
    for (const client of this.connectedClients.values()) {
      client.send(messageString);
    }
  }
}
```

This code sets up a `handleConnection` method that is called when a new WebSocket connection is established. It receives a Context object from the Oak framework and upgrades it to a WebSocket connection. It extracts the username from the URL query parameters. If the username is already taken (i.e., exists in connectedClients), it closes the socket with an appropriate message. Otherwise, it sets the username property on the socket, assigns event handlers, and adds the socket to `connectedClients`. When the socket opens, it triggers the `broadcastUsernames` method, which sends the list of connected usernames to all clients. When the socket closes, it calls the `clientDisconnected` method to remove the client from the list of connected clients. When a message of type `send-message` is received, it broadcasts the message to all connected clients, including the sender’s username.

## Build the frontend

We'll build a simple UI that shows a text input and a send button and displays the sent messages, alongside a list of users in the chat.

### HTML

In your new project directory, create a `public` folder and add an `index.html` file with the following code:

```html title="index.html"
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>Deno Chat App</title>
    <link rel="stylesheet" href="style.css" />
  </head>
  <body>
    <header>
      <h1>🦕 Deno Chat App</h1>
    </header>
    <ul id="users"></ul>
    <div id="conversation"></div>
    <form id="form">
      <input type="text" id="data" />
      <button type="submit">Send</button>
    </form>
    <template id="message">
      <div>
        <span></span>
        <p></p>
      </div>
    </template>
    <script src="app.js" type="module"></script>
  </body>
</html>
```

### CSS

If you'd like to style your chat app, create a `style.css` file in the `public` folder and add this [pre-made CSS](https://raw.githubusercontent.com/denoland/tutorial-with-websockets/refs/heads/main/public/style.css).

### JavaScript

We'll set up the client-side JavaScript in an `app.js` file; you'll have seen it linked in the HTML we just wrote. In the `public` folder, add an `app.js` file with the following code:

```js title="app.js"
const myUsername = prompt("Please enter your name") || "Anonymous";
const url = new URL(`./start_web_socket?username=${myUsername}`, location.href);
url.protocol = url.protocol.replace("http", "ws");
const socket = new WebSocket(url);

socket.onmessage = (event) => {
  const data = JSON.parse(event.data);

  switch (data.event) {
    case "update-users":
      updateUserList(data.usernames);
      break;

    case "send-message":
      addMessage(data.username, data.message);
      break;
  }
};

function updateUserList(usernames) {
  const userList = document.getElementById("users");
  userList.replaceChildren();

  for (const username of usernames) {
    const listItem = document.createElement("li");
    listItem.textContent = username;
    userList.appendChild(listItem);
  }
}

function addMessage(username, message) {
  const template = document.getElementById("message");
  const clone = template.content.cloneNode(true);

  clone.querySelector("span").textContent = username;
  clone.querySelector("p").textContent = message;
  document.getElementById("conversation").prepend(clone);
}

const inputElement = document.getElementById("data");
inputElement.focus();

const form = document.getElementById("form");
form.onsubmit = (e) => {
  e.preventDefault();
  const message = inputElement.value;
  inputElement.value = "";
  socket.send(JSON.stringify({ event: "send-message", message }));
};
```

This code prompts the user for a username, then creates a WebSocket connection to the server with the username as a query parameter.
It listens for messages from the server and either updates the list of connected users or adds a new message to the chat window. It also sends messages to the server when the user submits the form, either by pressing enter or clicking the send button. We use an [HTML template](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/template) to scaffold out the new messages to show in the chat window.

## Run the server

To run the server we'll need to grant the necessary permissions to Deno. In your `deno.json` file, update the `dev` task to allow read and network access:

```diff title="deno.json"
-"dev": "deno run --watch main.ts"
+"dev": "deno run --allow-net --allow-read --watch main.ts"
```

Then start the server with `deno task dev`. Now if you visit [http://localhost:8080](http://localhost:8080/) you will be able to start a chat session. You can open two simultaneous tabs and try chatting with yourself.

![Chat app UI](./images/websockets.gif)

🦕 Now that you can use WebSockets with Deno, you're ready to build all kinds of realtime applications! WebSockets can be used to build realtime dashboards, games, collaborative editing tools and much more! If you're looking for ways to expand upon your chat app, perhaps you could consider adding data to the messages to allow you to style messages differently if they're sent from you or someone else. Whatever you're building, Deno will WebSocket to ya!

---

# Updating from CommonJS to ESM

> Step-by-step guide to migrating Node.js projects from CommonJS to ESM modules. Learn about import/export syntax changes, module resolution differences, and how to use modern JavaScript features in Deno.

URL: https://docs.deno.com/examples/tutorials/cjs_to_esm

If your Node.js project uses CommonJS modules (e.g. it uses `require`), you'll need to update your code to use [ECMAScript modules (ESM)](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules) to run it in Deno. This guide will help you update your code to use ESM syntax.
## Module imports and exports

Deno supports [ECMAScript modules](/runtime/fundamentals/modules/) exclusively. If your Node.js code uses [`require`](https://nodejs.org/api/modules.html#modules-commonjs-modules), you should update it to use `import` statements instead. If your internal code uses CommonJS-style exports, those will also need to be updated. A typical CommonJS-style project might look similar to this:

```js title="add_numbers.js"
module.exports = function addNumbers(num1, num2) {
  return num1 + num2;
};
```

```js title="index.js"
const addNumbers = require("./add_numbers");
console.log(addNumbers(2, 2));
```

To convert these to [ECMAScript modules](/runtime/fundamentals/modules/), we'll make a few minor changes:

```js title="add_numbers.js"
export function addNumbers(num1, num2) {
  return num1 + num2;
}
```

```js title="index.js"
import { addNumbers } from "./add_numbers.js";
console.log(addNumbers(2, 2));
```

Exports:

| CommonJS                             | ECMAScript modules                 |
| ------------------------------------ | ---------------------------------- |
| `module.exports = function add() {}` | `export default function add() {}` |
| `exports.add = function add() {}`    | `export function add() {}`         |

Imports:

| CommonJS                                   | ECMAScript modules                       |
| ------------------------------------------ | ---------------------------------------- |
| `const add = require("./add_numbers");`    | `import add from "./add_numbers.js";`    |
| `const { add } = require("./add_numbers")` | `import { add } from "./add_numbers.js"` |

### Quick fix with VS Code

If you are using VS Code, you can use its built-in feature to convert CommonJS to ES6 modules. Right-click on the `require` statement or the lightbulb icon, select `Quick Fix`, and then `Convert to ES module`.

![Quick Fix](./images/quick-fix.png)

### CommonJS vs ECMAScript resolution

An important distinction between the two module systems is that ECMAScript resolution requires the full specifier **including the file extension**.
Omitting the file extension, and special handling of `index.js`, are features unique to CommonJS. The benefit of the ECMAScript resolution is that it works the same across the browser, Deno, and other runtimes.

| CommonJS             | ECMAScript modules            |
| -------------------- | ----------------------------- |
| `"./add_numbers"`    | `"./add_numbers.js"`          |
| `"./some/directory"` | `"./some/directory/index.js"` |

:::tip

Deno can add all the missing file extensions for you by running `deno lint --fix`. Deno's linter comes with a `no-sloppy-imports` rule that will show a linting error when an import path doesn't contain the file extension.

:::

🦕 Now that you know how to port from CJS to ESM, you can take advantage of the modern features that ESM offers, such as async module loading, interop with browsers, better readability, standardization and future proofing.

---

# Deploying Deno to Cloudflare Workers

> Step-by-step tutorial on deploying Deno functions to Cloudflare Workers. Learn how to configure denoflare, create worker modules, test locally, and deploy your code to Cloudflare's global edge network.

URL: https://docs.deno.com/examples/tutorials/cloudflare_workers

Cloudflare Workers allows you to run JavaScript on Cloudflare's edge network. This is a short How To guide on deploying a Deno function to Cloudflare Workers. Note: you can only deploy [Module Workers](https://developers.cloudflare.com/workers/learning/migrating-to-module-workers/), not full web servers or apps.

## Setup `denoflare`

In order to deploy Deno to Cloudflare, we'll use the community-created CLI [`denoflare`](https://denoflare.dev/).
[Install it](https://denoflare.dev/cli/#installation): ```shell deno install --unstable-worker-options --allow-read --allow-net --allow-env --allow-run --name denoflare --force \ https://raw.githubusercontent.com/skymethod/denoflare/v0.6.0/cli/cli.ts ``` ## Create your function In a new directory, let's create a `main.ts` file, which will contain our Module Worker function: ```ts export default { fetch(request: Request): Response { return new Response("Hello, world!"); }, }; ``` At the very minimum, a Module Worker function must `export default` an object that exposes a `fetch` function, which returns a `Response` object. You can test this locally by running: ```shell denoflare serve main.ts ``` If you go to `localhost:8080` in your browser, you'll see the response will say: ```console Hello, world! ``` ## Configure `.denoflare` The next step is to create a `.denoflare` config file. In it, let's add: ```json { "$schema": "https://raw.githubusercontent.com/skymethod/denoflare/v0.5.11/common/config.schema.json", "scripts": { "main": { "path": "/absolute/path/to/main.ts", "localPort": 8000 } }, "profiles": { "myprofile": { "accountId": "abcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "apiToken": "abcxxxxxxxxx_-yyyyyyyyyyyy-11-dddddddd" } } } ``` You can find your `accountId` by going to your [Cloudflare dashboard](https://dash.cloudflare.com/), clicking "Workers", and finding "Account ID" on the right side. You can generate an `apiToken` from your [Cloudflare API Tokens settings](https://dash.cloudflare.com/profile/api-tokens). When you create an API token, be sure to use the template "Edit Cloudflare Workers". After you add both to your `.denoflare` config, let's try pushing it to Cloudflare: ```console denoflare push main ``` Next, you can view your new function in your Cloudflare account: ![New function on Cloudflare Workers](./images/how-to/cloudflare-workers/main-on-cloudflare.png) Boom! --- # Connecting to databases > A guide to database connectivity in Deno. 
Learn how to use MySQL, PostgreSQL, MongoDB, SQLite, Firebase, Supabase, and popular ORMs to build data-driven applications with TypeScript.

URL: https://docs.deno.com/examples/tutorials/connecting_to_databases

It is common for applications to store and retrieve data from databases. Deno supports connecting to many database management systems. The Deno community has published a number of third-party modules that make it easy to connect to popular databases like MySQL, Postgres, and MongoDB. They are hosted at Deno's third-party module site [deno.land/x](https://deno.land/x).

## MySQL

[deno_mysql](https://deno.land/x/mysql) is a MySQL and MariaDB database driver for Deno.

### Connect to MySQL with deno_mysql

First, import the `mysql` module and create a new client instance. Then connect to the database, passing an object with the connection details:

```ts title="main.ts"
import { Client } from "https://deno.land/x/mysql/mod.ts";

const client = await new Client().connect({
  hostname: "127.0.0.1",
  username: "root",
  db: "dbname",
  password: "password",
});
```

Once connected, you can execute queries, insert data and retrieve information.

## Postgres

[deno-postgres](https://deno.land/x/postgres) is a lightweight PostgreSQL driver for Deno focused on developer experience.

### Connect to Postgres with deno-postgres

First, import the `Client` class from the `deno-postgres` module and create a new client instance. Then connect to the database, passing an object with the connection details:

```ts
import { Client } from "https://deno.land/x/postgres/mod.ts";

const client = new Client({
  user: "user",
  database: "dbname",
  hostname: "127.0.0.1",
  port: 5432,
  password: "password",
});
await client.connect();
```

### Connect to Postgres with postgresjs

[postgresjs](https://deno.land/x/postgresjs) is a full-featured Postgres client for Node.js and Deno. Import the `postgres` module and create a new client instance.
Then connect to the database, passing a connection string as an argument:

```js
import postgres from "https://deno.land/x/postgresjs/mod.js";

const sql = postgres("postgres://username:password@host:port/database");
```

## MongoDB

We suggest using [npm specifiers](/runtime/fundamentals/node/#using-npm-packages) to work with the official [MongoDB driver on npm](https://www.npmjs.com/package/mongodb). You can learn more about how to work with the driver [in the official docs](https://www.mongodb.com/docs/drivers/node/current/). The only difference using this module in the context of Deno will be how you import the module, using an `npm:` specifier.

Import the MongoDB driver, set up the connection configuration, then connect to a MongoDB instance. You can then perform operations like inserting documents into a collection before closing the connection:

```ts title="main.ts"
import { MongoClient } from "npm:mongodb@6";

const url = "mongodb://localhost:27017";
const client = new MongoClient(url);
const dbName = "myProject";

await client.connect();
console.log("Connected successfully to server");

// Get a reference to a collection
const db = client.db(dbName);
const collection = db.collection("documents");

// Execute an insert operation
const insertResult = await collection.insertMany([{ a: 1 }, { a: 2 }]);
console.log("Inserted documents =>", insertResult);

client.close();
```

## SQLite

There are multiple solutions to connect to SQLite in Deno:

### Connect to SQLite using the `node:sqlite` module

_The `node:sqlite` module was added in Deno v2.2._

```ts
import { DatabaseSync } from "node:sqlite";

const database = new DatabaseSync("test.db");

const result = database.prepare("select sqlite_version()").get();
console.log(result);

database.close();
```

### Connect to SQLite with the FFI Module

[@db/sqlite](https://jsr.io/@db/sqlite) provides JavaScript bindings to the SQLite3 C API, using [Deno FFI](/runtime/reference/deno_namespace_apis/#ffi).
```ts
import { Database } from "jsr:@db/sqlite@0.12";

const db = new Database("test.db");

const [version] = db.prepare("select sqlite_version()").value<[string]>()!;
console.log(version);

db.close();
```

### Connect to SQLite with the Wasm-Optimized Module

[sqlite](https://deno.land/x/sqlite) is a SQLite module for JavaScript and TypeScript. The wrapper is made specifically for Deno and uses a version of SQLite3 compiled to WebAssembly (Wasm).

```ts
import { DB } from "https://deno.land/x/sqlite/mod.ts";

const db = new DB("test.db");

db.close();
```

## Firebase

To connect to Firebase with Deno, import the [firestore npm module](https://firebase.google.com/docs/firestore/quickstart) with the [ESM CDN](https://esm.sh/). To learn more about using npm modules in Deno with a CDN, see [Using npm packages with CDNs](/runtime/fundamentals/modules/#https-imports).

### Connect to Firebase with the firestore npm module

```js
import { initializeApp } from "https://www.gstatic.com/firebasejs/9.8.1/firebase-app.js";
import {
  addDoc,
  collection,
  connectFirestoreEmulator,
  deleteDoc,
  doc,
  Firestore,
  getDoc,
  getDocs,
  getFirestore,
  query,
  QuerySnapshot,
  setDoc,
  where,
} from "https://www.gstatic.com/firebasejs/9.8.1/firebase-firestore.js";
import { getAuth } from "https://www.gstatic.com/firebasejs/9.8.1/firebase-auth.js";

const app = initializeApp({
  apiKey: Deno.env.get("FIREBASE_API_KEY"),
  authDomain: Deno.env.get("FIREBASE_AUTH_DOMAIN"),
  projectId: Deno.env.get("FIREBASE_PROJECT_ID"),
  storageBucket: Deno.env.get("FIREBASE_STORAGE_BUCKET"),
  messagingSenderId: Deno.env.get("FIREBASE_MESSAGING_SENDER_ID"),
  appId: Deno.env.get("FIREBASE_APP_ID"),
  measurementId: Deno.env.get("FIREBASE_MEASUREMENT_ID"),
});
const db = getFirestore(app);
const auth = getAuth(app);
```

## Supabase

To connect to Supabase with Deno, import the [supabase-js npm module](https://supabase.com/docs/reference/javascript) with the [esm.sh CDN](https://esm.sh/).
To learn more about using npm modules in Deno with a CDN, see [Using npm packages with CDNs](/runtime/fundamentals/modules/#https-imports).

### Connect to Supabase with the supabase-js npm module

```js
import { createClient } from "https://esm.sh/@supabase/supabase-js";

const options = {
  schema: "public",
  headers: { "x-my-custom-header": "my-app-name" },
  autoRefreshToken: true,
  persistSession: true,
  detectSessionInUrl: true,
};

const supabase = createClient(
  "https://xyzcompany.supabase.co",
  "public-anon-key",
  options,
);
```

## ORMs

Object-relational mappers (ORMs) let you define your data models as classes that you can persist to a database. You can read and write data in your database through instances of these classes. Deno supports multiple ORMs, including Prisma and DenoDB.

### DenoDB

[DenoDB](https://deno.land/x/denodb) is a Deno-specific ORM.

#### Connect to DenoDB

```ts
import {
  Database,
  DataTypes,
  Model,
  PostgresConnector,
} from "https://deno.land/x/denodb/mod.ts";

const connection = new PostgresConnector({
  host: "...",
  username: "user",
  password: "password",
  database: "airlines",
});

const db = new Database(connection);
```

## GraphQL

GraphQL is an API query language often used to compose disparate data sources into client-centric APIs. To set up a GraphQL API, you should first set up a GraphQL server. This server exposes your data as a GraphQL API that your client applications can query for data.

### Server

You can use [gql](https://deno.land/x/gql), a universal GraphQL HTTP middleware, to run a GraphQL API server in Deno.
#### Run a GraphQL API server with gql

```ts
import { GraphQLHTTP } from "https://deno.land/x/gql/mod.ts";
import { makeExecutableSchema } from "https://deno.land/x/graphql_tools@0.0.2/mod.ts";
import { gql } from "https://deno.land/x/graphql_tag@0.0.1/mod.ts";

const typeDefs = gql`
  type Query {
    hello: String
  }
`;

const resolvers = {
  Query: {
    hello: () => `Hello World!`,
  },
};

const schema = makeExecutableSchema({ resolvers, typeDefs });

Deno.serve({ port: 3000 }, async (req) => {
  const { pathname } = new URL(req.url);

  return pathname === "/graphql"
    ? await GraphQLHTTP({
      schema,
      graphiql: true,
    })(req)
    : new Response("Not Found", { status: 404 });
});
```

### Client

To make GraphQL client calls in Deno, import the [graphql npm module](https://www.npmjs.com/package/graphql) with the [esm CDN](https://esm.sh/). To learn more about using npm modules in Deno via a CDN, read [here](/runtime/fundamentals/modules/#https-imports).

#### Make GraphQL client calls with the graphql npm module

```js
import { buildSchema, graphql } from "https://esm.sh/graphql";

const schema = buildSchema(`
  type Query {
    hello: String
  }
`);

const rootValue = {
  hello: () => {
    return "Hello world!";
  },
};

const response = await graphql({
  schema,
  source: "{ hello }",
  rootValue,
});

console.log(response);
```

🦕 Now that you can connect your Deno project to a database, you'll be able to work with persistent data, perform CRUD operations and start building more complex applications.

---

# Build a React app with create-vite

> A tutorial on building React applications with Deno and Vite. Learn how to set up a project, configure TypeScript, add API endpoints, implement routing, and deploy your React app using modern development tools.

URL: https://docs.deno.com/examples/tutorials/create_react

[React](https://reactjs.org) is the most widely used JavaScript frontend library. In this tutorial we'll build a simple React app with Deno. The app will display a list of dinosaurs.
When you click on one, it'll take you to a dinosaur page with more details. You can see the [finished app repo on GitHub](https://github.com/denoland/tutorial-with-react) ![demo of the app](./images/how-to/react/react-dinosaur-app-demo.gif) ## Create a React app with Vite and Deno This tutorial will use [create-vite](https://vitejs.dev/) to quickly scaffold a Deno and React app. Vite is a build tool and development server for modern web projects. It pairs well with React and Deno, leveraging ES modules and allowing you to import React components directly. In your terminal run the following command to create a new React app with Vite using the typescript template: ```sh deno run -A npm:create-vite@latest --template react-ts ``` When prompted, give your app a name, and `cd` into the newly created project directory. Then run the following command to install the dependencies: ```sh deno install ``` Now you can serve your new react app by running: ```sh deno task dev ``` This will start the Vite server, click the output link to localhost to see your app in the browser. If you have the [Deno extension for VSCode](/runtime/getting_started/setup_your_environment/#visual-studio-code) installed, you may notice that the editor highlights some errors in the code. This is because the app created by Vite is designed with Node in mind and so uses conventions that Deno does not (such as 'sloppy imports' - importing modules without the file extension). Disable the Deno extension for this project to avoid these errors or try out the [tutorial to build a React app with a deno.json file](/runtime/tutorials/how_to_with_npm/react/). ## Add a backend The next step is to add a backend API. We'll create a very simple API that returns information about dinosaurs. In the root of your new project, create an `api` folder. In that folder, create a `main.ts` file, which will run the server, and a `data.json`, which will contain the hard coded dinosaur data. 
Copy and paste [this json file](https://github.com/denoland/tutorial-with-react/blob/main/api/data.json) into the `api/data.json` file.

We're going to build out a simple API server with routes that return dinosaur information. We'll use the [`oak` middleware framework](https://jsr.io/@oak/oak) and the [`cors` middleware](https://jsr.io/@tajpouria/cors) to enable [CORS](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS). Use the `deno add` command to add the required dependencies to your project:

```shell
deno add jsr:@oak/oak jsr:@tajpouria/cors
```

Next, update `api/main.ts` to import the required modules and create a new `Router` instance to define some routes:

```ts title="main.ts"
import { Application, Router } from "@oak/oak";
import { oakCors } from "@tajpouria/cors";
import data from "./data.json" with { type: "json" };

const router = new Router();
```

After this, in the same file, we'll define two routes: one at `/api/dinosaurs` to return all the dinosaurs, and one at `/api/dinosaurs/:dinosaur` to return a specific dinosaur based on the name in the URL:

```ts title="main.ts"
router.get("/api/dinosaurs", (context) => {
  context.response.body = data;
});

router.get("/api/dinosaurs/:dinosaur", (context) => {
  if (!context?.params?.dinosaur) {
    context.response.body = "No dinosaur name provided.";
    return;
  }

  const dinosaur = data.find((item) =>
    item.name.toLowerCase() === context.params.dinosaur.toLowerCase()
  );
  context.response.body = dinosaur ?? "No dinosaur found.";
});
```

Finally, at the bottom of the same file, create a new `Application` instance, attach the routes we just defined to the application using `app.use(router.routes())` and start the server listening on port 8000:

```ts title="main.ts"
const app = new Application();
app.use(oakCors());
app.use(router.routes());
app.use(router.allowedMethods());

await app.listen({ port: 8000 });
```

You can run the API server with `deno run --allow-env --allow-net api/main.ts`.
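The name matching in the second route is case-insensitive. Isolated from the server, the lookup behaves like this (a minimal sketch; the sample records and the `findDinosaur` helper are illustrative stand-ins, not the real contents of `data.json`):

```ts
// Standalone sketch of the route's case-insensitive lookup.
// The two sample records below are assumptions for illustration only.
type Dinosaur = { name: string; description: string };

const data: Dinosaur[] = [
  { name: "Aardonyx", description: "An early stage in the evolution of sauropods." },
  { name: "Abelisaurus", description: "A theropod known from a partial skull." },
];

function findDinosaur(name: string): Dinosaur | string {
  const dinosaur = data.find(
    (item) => item.name.toLowerCase() === name.toLowerCase(),
  );
  return dinosaur ?? "No dinosaur found.";
}

console.log(findDinosaur("ABELISAURUS")); // matches despite the different casing
console.log(findDinosaur("Raptorex")); // "No dinosaur found."
```

Lower-casing both sides before comparing means URL parameters like `/api/dinosaurs/abelisaurus` and `/api/dinosaurs/Abelisaurus` resolve to the same record.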
We'll create a task to run this command in the background and update the dev task to run both the React app and the API server. In your `package.json` file, update the `scripts` field to include the following:

```jsonc title="package.json"
{
  "scripts": {
    "dev": "deno task dev:api & deno task dev:vite",
    "dev:api": "deno run --allow-env --allow-net api/main.ts",
    "dev:vite": "deno run -A npm:vite",
    // ...
  }
}
```

If you run `deno task dev` now and visit `localhost:8000/api/dinosaurs` in your browser, you should see a JSON response of all of the dinosaurs.

## React support in Deno

At this point, your IDE or editor may be showing you warnings about missing types in your project. Deno has built-in TypeScript support for React applications. To enable this, you'll need to configure your project with the appropriate type definitions and DOM libraries. Create or update your `deno.json` file with the following TypeScript compiler options:

```jsonc title="deno.json"
{
  "compilerOptions": {
    "types": [
      "react",
      "react-dom",
      "@types/react"
    ],
    "lib": [
      "dom",
      "dom.iterable",
      "deno.ns"
    ],
    "jsx": "react-jsx",
    "jsxImportSource": "react"
  }
}
```

## Update the entrypoint

The entrypoint for the React app is in the `src/main.tsx` file. Ours is going to be very basic:

```tsx title="main.tsx"
import { StrictMode } from "react";
import { createRoot } from "react-dom/client";
import "./index.css";
import App from "./App.tsx";

createRoot(document.getElementById("root")!).render(
  <StrictMode>
    <App />
  </StrictMode>,
);
```

## Add a router

The app will have two routes: `/` and `/:selectedDinosaur`. We'll use [`react-router-dom`](https://reactrouter.com/en/main) to build out some routing logic, so we'll need to add the `react-router-dom` dependency to your project.
In the project root run:

```shell
deno add npm:react-router-dom
```

Update the `/src/App.tsx` file to import and use the [`BrowserRouter`](https://reactrouter.com/en/main/router-components/browser-router) component from `react-router-dom` and define the two routes:

```tsx title="App.tsx"
import { BrowserRouter, Route, Routes } from "react-router-dom";
import Index from "./pages/index.tsx";
import Dinosaur from "./pages/Dinosaur.tsx";
import "./App.css";

function App() {
  return (
    <BrowserRouter>
      <Routes>
        <Route path="/" element={<Index />} />
        <Route path="/:selectedDinosaur" element={<Dinosaur />} />
      </Routes>
    </BrowserRouter>
  );
}

export default App;
```

### Proxy to forward the api requests

Vite will be serving the application on port `5173` while our api is running on port `8000`. Therefore, we'll need to set up a proxy so that requests to the `/api` paths are forwarded to the API server. Overwrite `vite.config.ts` with the following to configure a proxy:

```ts title="vite.config.ts"
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      "/api": {
        target: "http://localhost:8000",
        changeOrigin: true,
      },
    },
  },
});
```

## Create the pages

We'll create two pages: `Index` and `Dinosaur`. The `Index` page will list all the dinosaurs and the `Dinosaur` page will show details of a specific dinosaur.

Create a `pages` folder in the `src` directory and inside that create two files: `index.tsx` and `Dinosaur.tsx`.
### Types

Both pages will use the `Dino` type to describe the shape of data they're expecting from the API, so let's create a `types.ts` file in the `src` directory:

```ts title="types.ts"
export type Dino = { name: string; description: string };
```

### index.tsx

This page will fetch the list of dinosaurs from the API and render them as links:

```tsx title="index.tsx"
import { useEffect, useState } from "react";
import { Link } from "react-router-dom";
import { Dino } from "../types.ts";

export default function Index() {
  const [dinosaurs, setDinosaurs] = useState<Dino[]>([]);

  useEffect(() => {
    (async () => {
      const response = await fetch(`/api/dinosaurs/`);
      const allDinosaurs = await response.json() as Dino[];
      setDinosaurs(allDinosaurs);
    })();
  }, []);

  return (
    <main>
      <h1>Welcome to the Dinosaur app</h1>
      <p>Click on a dinosaur below to learn more.</p>
      {dinosaurs.map((dinosaur: Dino) => {
        return (
          <Link
            to={`/${dinosaur.name.toLowerCase()}`}
            key={dinosaur.name}
            className="dinosaur"
          >
            {dinosaur.name}
          </Link>
        );
      })}
    </main>
  );
}
```

### Dinosaur.tsx

This page will fetch the details of a specific dinosaur from the API and render it in a paragraph:

```tsx title="Dinosaur.tsx"
import { useEffect, useState } from "react";
import { Link, useParams } from "react-router-dom";
import { Dino } from "../types.ts";

export default function Dinosaur() {
  const { selectedDinosaur } = useParams();
  const [dinosaur, setDino] = useState<Dino>({ name: "", description: "" });

  useEffect(() => {
    (async () => {
      const resp = await fetch(`/api/dinosaurs/${selectedDinosaur}`);
      const dino = await resp.json() as Dino;
      setDino(dino);
    })();
  }, [selectedDinosaur]);

  return (
    <div>
      <h1>{dinosaur.name}</h1>
      <p>{dinosaur.description}</p>
      <Link to="/">🠠 Back to all dinosaurs</Link>
    </div>
  );
}
```

### Styling the list of dinosaurs

Since we are displaying the list of dinosaurs on the main page, let's do some basic formatting. Add the following to the bottom of `src/App.css` to display our list of dinosaurs in an orderly fashion:

```css title="src/App.css"
.dinosaur {
  display: block;
}
```

## Run the app

To run the app, use the task you set up earlier:

```sh
deno task dev
```

Navigate to the local Vite server in your browser (`localhost:5173`) and you should see the list of dinosaurs displayed, which you can click through to find out about each one.

![demo of the app](./images/how-to/react/react-dinosaur-app-demo.gif)

## Build and deploy

At this point the app is being served by the Vite development server. To serve the app in production, you can build the app with Vite and then serve the built files with Deno. To do so, we'll need to update the API server to serve the built files. We'll write some middleware to do this. In your `api` directory create a new folder `util` and a new file called `routeStaticFilesFrom.ts` and add the following code:

```ts title="routeStaticFilesFrom.ts"
import { Next } from "jsr:@oak/oak/middleware";
import { Context } from "jsr:@oak/oak/context";

// Configure static site routes so that we can serve
// the Vite build output and the public folder
export default function routeStaticFilesFrom(staticPaths: string[]) {
  return async (context: Context<Record<string, object>>, next: Next) => {
    for (const path of staticPaths) {
      try {
        await context.send({ root: path, index: "index.html" });
        return;
      } catch {
        continue;
      }
    }

    await next();
  };
}
```

This middleware will attempt to serve the static files from the paths provided in the `staticPaths` array. If the file is not found, it will call the next middleware in the chain.
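The try-each-root, fall-through behavior of this middleware can be sketched without Oak. Everything below is hypothetical scaffolding: the `filesByRoot` table stands in for what `context.send` finds on disk, and `next` stands in for the rest of the middleware chain.

```typescript
// Hypothetical table of which files exist under each static root.
const filesByRoot: Record<string, string[]> = {
  dist: ["index.html", "assets/app.js"],
  public: ["favicon.ico"],
};

function serveStatic(
  roots: string[],
  requested: string,
  next: () => string,
): string {
  for (const root of roots) {
    // Like context.send: succeeds only if the file exists under this root.
    if (filesByRoot[root]?.includes(requested)) {
      return `${root}/${requested}`; // served; stop the chain
    }
  }
  return next(); // nothing matched; defer to the next middleware
}

console.log(serveStatic(["dist", "public"], "favicon.ico", () => "404"));
// -> "public/favicon.ico"
```

Roots are checked in order, so a file present in both `dist` and `public` would be served from `dist`, matching the order the paths are passed to `routeStaticFilesFrom` below.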
We can now update the `api/main.ts` file to use this middleware:

```ts title="main.ts"
import { Application, Router } from "@oak/oak";
import { oakCors } from "@tajpouria/cors";
import data from "./data.json" with { type: "json" };
import routeStaticFilesFrom from "./util/routeStaticFilesFrom.ts";

const router = new Router();

router.get("/api/dinosaurs", (context) => {
  context.response.body = data;
});

router.get("/api/dinosaurs/:dinosaur", (context) => {
  if (!context?.params?.dinosaur) {
    context.response.body = "No dinosaur name provided.";
    return;
  }

  const dinosaur = data.find((item) =>
    item.name.toLowerCase() === context.params.dinosaur.toLowerCase()
  );

  context.response.body = dinosaur ?? "No dinosaur found.";
});

const app = new Application();
app.use(oakCors());
app.use(router.routes());
app.use(router.allowedMethods());
app.use(routeStaticFilesFrom([
  `${Deno.cwd()}/dist`,
  `${Deno.cwd()}/public`,
]));

await app.listen({ port: 8000 });
```

Add a `serve` script to your `package.json` file to build the app with Vite and then run the API server:

```jsonc title="package.json"
{
  "scripts": {
    // ...
    "serve": "deno task build && deno task dev:api"
  }
}
```

Now you can serve the built app with Deno by running:

```sh
deno task serve
```

If you visit `localhost:8000` in your browser you should see the app running! 🦕

Now you can scaffold and develop a React app with Vite and Deno! You're ready to build blazing-fast web applications. We hope you enjoy exploring these cutting-edge tools; we can't wait to see what you make!

---

# Better debugging with the console API

> An in-depth guide to advanced console debugging in Deno. Learn about console.table, timers, counters, tracers, and how to leverage the full console API beyond basic logging for better debugging workflows.

URL: https://docs.deno.com/examples/tutorials/debugging_with_console

Some of the console API is probably muscle memory for web developers, but there is so much more than just `console.log()` for you to use.
Deno has great support for this API, so whether you're writing JavaScript for the browser or for the server, it's worth learning about these helpful utilities. Let's take a look at some of this API's most useful methods. Your debugging is going to get so much easier!

## `console.log()`

Hello, old friend! You'll most likely be using this to output logging messages to the console to help you debug.

```js
console.log("Hello, world!"); // "Hello, world!"
```

You can output multiple items separated by commas like so:

```jsx
const person = { "name": "Jane", "city": "New York" };

console.log("Hello,", person.name, "from", person.city); // "Hello, Jane from New York"
```

Or you can use template literals:

```jsx
const person = { "name": "Jane", "city": "New York" };

console.log(`Hello ${person.name} from ${person.city}`); // "Hello Jane from New York"
```

You can also [apply some styling using CSS](/examples/color_logging/) using the `%c` directive:

```jsx
console.log("Wild %cblue", "color: blue", "yonder");
// Applies a blue text color to the word "blue"
```

…but there is much more you can do with the console API.

## `console.table()`

The `table` method is helpful for outputting structured data like objects for easier inspection.

```jsx
const people = {
  "john": {
    "age": 30,
    "city": "New York",
  },
  "jane": {
    "age": 25,
    "city": "Los Angeles",
  },
};

console.table(people);

/*
┌───────┬─────┬───────────────┐
│ (idx) │ age │ city          │
├───────┼─────┼───────────────┤
│ john  │ 30  │ "New York"    │
│ jane  │ 25  │ "Los Angeles" │
└───────┴─────┴───────────────┘
*/
```

You can also specify the properties of your object that you'd like to include in the table. Great for inspecting a summary of those detailed objects to see just the part you are concerned with.
```jsx
console.table(people, ["city"]);

/* outputs
┌───────┬───────────────┐
│ (idx) │ city          │
├───────┼───────────────┤
│ john  │ "New York"    │
│ jane  │ "Los Angeles" │
└───────┴───────────────┘
*/
```

## Timer methods

Understanding how long specific parts of your application take is key to removing performance bottlenecks and expensive operations. If you've ever reached for JavaScript's date method to make yourself a timer, you'll wish you'd known about this one long ago. It's more convenient and more accurate. Try using [`console.time()`](https://developer.mozilla.org/en-US/docs/Web/API/console/time_static), [`console.timeLog()`](https://developer.mozilla.org/en-US/docs/Web/API/console/timeLog_static), and [`console.timeEnd()`](https://developer.mozilla.org/en-US/docs/Web/API/console/timeEnd_static) instead.

```jsx
console.time("My timer"); // starts a timer with label "My timer"

// do some work...

console.timeLog("My timer"); // outputs the current timer value, e.g. "My timer: 9000ms"

// do more work...

console.timeEnd("My timer"); // stops "My timer" and reports its value, e.g. "My timer: 97338ms"
```

You can create multiple timers, each with their own label. Very handy!

## Counting things with `console.count()`

It can be helpful to keep a count of how many times specific operations in your code have been executed. Rather than doing this manually, you can use [`console.count()`](https://developer.mozilla.org/en-US/docs/Web/API/console/count_static), which can maintain multiple counters for you based on the label you provide.
```jsx
// increment the default counter
console.count();
console.count();
console.count();

/*
"default: 1"
"default: 2"
"default: 3"
*/
```

This can be very handy inside a function, passing in a label like so:

```jsx
function pat(animal) {
  console.count(animal);
  return `Patting the ${animal}`;
}

pat("cat");
pat("cat");
pat("dog");
pat("cat");

/*
"cat: 1"
"cat: 2"
"dog: 1"
"cat: 3"
*/
```

## Going deeper with `console.trace()`

For a detailed view of what is happening in your application, you can output a stack trace to the console with [`console.trace()`](https://developer.mozilla.org/en-US/docs/Web/API/console/trace_static):

```jsx
// main.js

function foo() {
  function bar() {
    console.trace();
  }
  bar();
}

foo();

/*
Trace
    at bar (file:///PATH_TO/main.js:3:13)
    at foo (file:///PATH_TO/main.js:5:3)
    at file:///PATH_TO/main.js:8:1
*/
```

There's more to explore, but these handy methods can give your JavaScript debugging a boost, and they are ready and waiting for you to use right in your browser or in your Deno application. Take a look at [console support](/api/web/~/Console) in the API Reference docs for more.

---

# Deploy an app with Deno Deploy

> A step-by-step tutorial for deploying your first Deno application to Deno Deploy Early Access.

URL: https://docs.deno.com/examples/tutorials/deno_deploy

Deno Deploy allows you to host your Deno applications on a global edge network, with built-in telemetry and CI/CD tooling. This tutorial guides you through creating and deploying a simple Deno application using Deno DeployEA.

## Prerequisites

1. A [GitHub](https://github.com) account
2. [Deno installed](https://docs.deno.com/runtime/manual/getting_started/installation) on your local machine
3.
Access to the [Deno Deploy Early Access program](https://dash.deno.com/account#early-access)

## Create a simple Deno application with Vite

First, let's create a basic application with Vite. Initialize a new [Vite](https://vite.dev/guide/) project:

```sh
deno init --npm vite
```

Give your project a name and select your framework and variant. For this tutorial, we'll create a vanilla TypeScript app.

Change directory to your newly created project with `cd my-project-name`, then run:

```sh
deno install
deno run dev
```

You should see a basic app running at [http://127.0.0.1:5173/](http://127.0.0.1:5173/). You can edit the `main.ts` file to see changes in the browser.

## Create a GitHub repository

1. Go to [GitHub](https://github.com) and create a new repository.
2. Initialize your local directory as a Git repository:

```sh
git init
git add .
git commit -m "Initial commit"
```

3. Add your GitHub repository as a remote and push your code:

```sh
git remote add origin https://github.com/your-username/my-first-deno-app.git
git branch -M main
git push -u origin main
```

## Sign up for Deno Deploy Early Access

1. Visit the [Deno Deploy account settings](https://dash.deno.com/account#early-access)
2. Click "Join the Early Access program"
3. Once approved, you'll receive an email with access instructions

![Early access joining screenshot](./images/join.png)

## Create a Deno Deploy organization

1. Navigate to [app.deno.com](https://app.deno.com)
2. Click "+ New Organization"
3. Select the 'Standard Deploy' organization type
4. Enter an organization name and slug (this cannot be changed later)
5. Click "Create Standard Deploy organization"

## Create and deploy your application

1. From your organization's dashboard, click "Try new Deno Deploy Early Access"
2. Then click "+ New App"
3. Select the GitHub repository you created earlier
4.
The app configuration should be automatically detected, but you can verify these settings by clicking the "Edit build config" button:
   - Framework preset: No preset
   - Runtime configuration: Static Site
   - Install command: `deno install`
   - Build command: `deno task build`
   - Static Directory: `dist`
5. Click "Create App" to start the deployment process

## Monitor your deployment

1. Watch the build logs as your application is deployed
2. Once deployment completes, you'll see a preview URL (typically `https://your-app-name.your-org-name.deno.net`)
3. Click the URL to view your deployed application!

## Make changes and redeploy

Let's update the application and see how changes are deployed:

1. Update your `main.ts` file locally:

```ts title="main.ts"
import './style.css'
import typescriptLogo from './typescript.svg'
import viteLogo from '/vite.svg'
import { setupCounter } from './counter.ts'

document.querySelector<HTMLDivElement>('#app')!.innerHTML = `
  <div>
    <a href="https://vite.dev" target="_blank">
      <img src="${viteLogo}" class="logo" alt="Vite logo" />
    </a>
    <a href="https://www.typescriptlang.org/" target="_blank">
      <img src="${typescriptLogo}" class="logo vanilla" alt="TypeScript logo" />
    </a>
    <h1>Hello from Deno Deploy!</h1>
    <div class="card">
      <button id="counter" type="button"></button>
    </div>
    <p class="read-the-docs">
      Click on the Vite and TypeScript logos to learn more
    </p>
  </div>
`
setupCounter(document.querySelector<HTMLButtonElement>('#counter')!)
```

2. Commit and push your changes:

```sh
git add .
git commit -m "Update application"
git push
```

Return to your Deno Deploy dashboard to see a new build automatically start. Once the build completes, visit your application URL to see the update.

## Explore observability features

Deno DeployEA provides comprehensive observability tools:

1. From your application dashboard, click "Logs" in the sidebar
   - You'll see console output from your application
   - Use the search bar to filter logs (e.g., `context:production`)
2. Click "Traces" to view request traces
   - Select a trace to see detailed timing information
   - Examine spans to understand request processing
3. Click "Metrics" to view application performance metrics
   - Monitor request counts, error rates, and response times

🦕 Now that you've deployed your first application, you might want to:

1. [Add a custom domain](/deploy/early-access/reference/domains/) to your application
2. Explore [framework support](/deploy/early-access/reference/frameworks/) for Next.js, Astro, and other frameworks
3. Learn about [caching strategies](/deploy/early-access/reference/caching/) to improve performance
4. Set up different [environments](/deploy/early-access/reference/env-vars-and-contexts/) for development and production

For more information, check out the [Deno DeployEA Reference documentation](/deploy/early-access/reference/).

---

# Monitor your app with OpenTelemetry and Deno Deploy

> A step-by-step tutorial for adding custom OpenTelemetry instrumentation to your Deno Deploy application.

URL: https://docs.deno.com/examples/tutorials/deploy_otel

Deno DeployEA includes built-in OpenTelemetry support that automatically captures traces for HTTP requests, database queries, and other operations. This tutorial shows how to add custom OpenTelemetry instrumentation to your applications for more detailed observability.

## Prerequisites

1. A [GitHub](https://github.com) account
2.
[Deno installed](https://docs.deno.com/runtime/manual/getting_started/installation) on your local machine
3. Access to the [Deno Deploy Early Access program](https://dash.deno.com/account#early-access)
4. Basic familiarity with [OpenTelemetry concepts](https://opentelemetry.io/docs/concepts/)

## Create a basic API application

First, let's create a simple API server that we'll instrument with OpenTelemetry:

```ts title="main.ts"
const dataStore: Record<string, unknown> = {};

async function handler(req: Request): Promise<Response> {
  const url = new URL(req.url);

  // Simulate random latency
  await new Promise((resolve) => setTimeout(resolve, Math.random() * 200));

  try {
    // Handle product listing
    if (url.pathname === "/products" && req.method === "GET") {
      return new Response(JSON.stringify(Object.values(dataStore)), {
        headers: { "Content-Type": "application/json" },
      });
    }

    // Handle product creation
    if (url.pathname === "/products" && req.method === "POST") {
      const data = await req.json();
      const id = crypto.randomUUID();
      dataStore[id] = data;
      return new Response(JSON.stringify({ id, ...data }), {
        status: 201,
        headers: { "Content-Type": "application/json" },
      });
    }

    // Handle product retrieval by ID
    if (url.pathname.startsWith("/products/") && req.method === "GET") {
      const id = url.pathname.split("/")[2];
      const product = dataStore[id];

      if (!product) {
        return new Response("Product not found", { status: 404 });
      }

      return new Response(JSON.stringify(product), {
        headers: { "Content-Type": "application/json" },
      });
    }

    // Handle root route
    if (url.pathname === "/") {
      return new Response("Product API - Try /products endpoint");
    }

    return new Response("Not Found", { status: 404 });
  } catch (error) {
    console.error("Error handling request:", error);
    return new Response("Internal Server Error", { status: 500 });
  }
}

console.log("Server running on http://localhost:8000");
Deno.serve({ port: 8000 }, handler);
```

Save this file and run it locally:

```sh
deno run --allow-net main.ts
```

Test the API with curl or a
browser to ensure it works:

```sh
# List products (empty at first)
curl http://localhost:8000/products

# Add a product
curl -X POST http://localhost:8000/products \
  -H "Content-Type: application/json" \
  -d '{"name": "Test Product", "price": 19.99}'
```

## Add OpenTelemetry instrumentation

Now, let's add custom OpenTelemetry instrumentation to our application. Create a new file called `instrumented-main.ts`:

```ts title="instrumented-main.ts"
import { SpanStatusCode, trace } from "npm:@opentelemetry/api@1";

// Get the OpenTelemetry tracer
const tracer = trace.getTracer("product-api");

const dataStore: Record<string, unknown> = {};

// Simulate a database operation with custom span
async function queryDatabase(
  operation: string,
  data?: unknown,
): Promise<unknown> {
  return await tracer.startActiveSpan(`database.${operation}`, async (span) => {
    try {
      // Add attributes to the span for better context
      span.setAttributes({
        "db.system": "memory-store",
        "db.operation": operation,
      });

      // Simulate database latency
      const delay = Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delay));

      // Add latency information to the span
      span.setAttributes({ "db.latency_ms": delay });

      if (operation === "list") {
        return Object.values(dataStore);
      } else if (operation === "get") {
        return dataStore[data as string];
      } else if (operation === "insert") {
        const id = crypto.randomUUID();
        dataStore[id] = data;
        return { id, data };
      }

      return null;
    } catch (error) {
      // Record any errors to the span
      span.recordException(error as Error);
      span.setStatus({ code: SpanStatusCode.ERROR });
      throw error;
    } finally {
      // End the span when we're done
      span.end();
    }
  });
}

async function handler(req: Request): Promise<Response> {
  // Create a parent span for the entire request
  return await tracer.startActiveSpan(
    `${req.method} ${new URL(req.url).pathname}`,
    async (parentSpan) => {
      const url = new URL(req.url);

      // Add request details as span attributes
      parentSpan.setAttributes({
        "http.method": req.method,
        "http.url": req.url,
        "http.route":
 url.pathname,
      });

      try {
        // Handle product listing
        if (url.pathname === "/products" && req.method === "GET") {
          const products = await queryDatabase("list");
          return new Response(JSON.stringify(products), {
            headers: { "Content-Type": "application/json" },
          });
        }

        // Handle product creation
        if (url.pathname === "/products" && req.method === "POST") {
          // Create a span for parsing request JSON
          const data = await tracer.startActiveSpan(
            "parse.request.body",
            async (span) => {
              try {
                const result = await req.json();
                return result;
              } catch (error) {
                span.recordException(error as Error);
                span.setStatus({ code: SpanStatusCode.ERROR });
                throw error;
              } finally {
                span.end();
              }
            },
          );

          const result = await queryDatabase("insert", data);
          return new Response(JSON.stringify(result), {
            status: 201,
            headers: { "Content-Type": "application/json" },
          });
        }

        // Handle product retrieval by ID
        if (url.pathname.startsWith("/products/") && req.method === "GET") {
          const id = url.pathname.split("/")[2];
          parentSpan.setAttributes({ "product.id": id });

          const product = await queryDatabase("get", id);

          if (!product) {
            parentSpan.setAttributes({
              "error": true,
              "error.type": "not_found",
            });
            return new Response("Product not found", { status: 404 });
          }

          return new Response(JSON.stringify(product), {
            headers: { "Content-Type": "application/json" },
          });
        }

        // Handle root route
        if (url.pathname === "/") {
          return new Response("Product API - Try /products endpoint");
        }

        parentSpan.setAttributes({ "error": true, "error.type": "not_found" });
        return new Response("Not Found", { status: 404 });
      } catch (error) {
        console.error("Error handling request:", error);

        // Record the error in the span
        parentSpan.recordException(error as Error);
        parentSpan.setAttributes({
          "error": true,
          "error.type": (error as Error).name,
          "error.message": (error as Error).message,
        });
        parentSpan.setStatus({ code: SpanStatusCode.ERROR });

        return new Response("Internal Server Error", { status: 500 });
      } finally {
        // End the parent span when we're done
        parentSpan.end();
      }
    },
  );
}

console.log(
  "Server running with OpenTelemetry instrumentation on http://localhost:8000",
);
Deno.serve({ port: 8000 }, handler);
```

Run the instrumented version locally:

```sh
deno run --allow-net instrumented-main.ts
```

Test the API again with curl to generate some traces.

## Create a GitHub repository

1. Go to [GitHub](https://github.com) and create a new repository.
2. Initialize your local directory as a Git repository:

```sh
git init
git add .
git commit -m "Add OpenTelemetry instrumented API"
```

3. Add your GitHub repository as a remote and push your code:

```sh
git remote add origin https://github.com/your-username/otel-demo-app.git
git branch -M main
git push -u origin main
```

## Deploy to Deno Deploy Early Access

1. Navigate to [app.deno.com](https://app.deno.com)
2. Select your organization or create a new one if needed
3. Click "+ New App"
4. Select the GitHub repository you created earlier
5. Configure the build settings:
   - Framework preset: No preset
   - Runtime configuration: Dynamic
   - Entrypoint: `instrumented-main.ts`
6. Click "Create App" to start the deployment process

## Generate sample traffic

To generate sample traces and metrics, let's send some traffic to your deployed application:

1. Copy your deployment URL from the Deno Deploy dashboard
2.
Send several requests to different endpoints: ```sh # Store your app URL in a variable APP_URL=https://your-app-name.your-org-name.deno.net # Get the root route curl $APP_URL/ # List products (empty at first) curl $APP_URL/products # Add some products curl -X POST $APP_URL/products -H "Content-Type: application/json" -d '{"name": "Laptop", "price": 999.99}' curl -X POST $APP_URL/products -H "Content-Type: application/json" -d '{"name": "Headphones", "price": 129.99}' curl -X POST $APP_URL/products -H "Content-Type: application/json" -d '{"name": "Mouse", "price": 59.99}' # List products again curl $APP_URL/products # Try to access a non-existent product (will generate an error span) curl $APP_URL/products/nonexistent-id ``` ## Explore OpenTelemetry traces and metrics Now let's explore the observability data collected by Deno Deploy: 1. From your application dashboard, click "Traces" in the sidebar - You'll see a list of traces for each request to your application - You can filter traces by HTTP method or status code using the search bar 2. Select one of your `/products` POST traces to see detailed information: - The parent span for the entire request - Child spans for database operations - The span for parsing the request body ![Trace waterfall view](./images/early-access/otel_trace.png) 3. Click on individual spans to see their details: - Duration and timing information - Attributes you set like `db.operation` and `db.latency_ms` - Any recorded exceptions 4. Click "Logs" in the sidebar to see console output with trace context: - Notice how logs emitted during a traced operation are automatically linked to the trace - Click "View trace" on a log line to see the associated trace 5. 
Click "Metrics" to view application performance metrics:
   - HTTP request counts by endpoint
   - Error rates
   - Response time distributions

🦕 The automatic instrumentation in Deno DeployEA combined with your custom instrumentation provides comprehensive visibility into your application's performance and behavior.

For more information about OpenTelemetry in Deno, check out these resources:

- [OpenTelemetry in Deno documentation](/runtime/fundamentals/open_telemetry/)
- [Deno DeployEA Observability reference](/deploy/early-access/reference/observability/)
- [OpenTelemetry official documentation](https://opentelemetry.io/docs/)

---

# How to deploy Deno to Digital Ocean

> A step-by-step guide to deploying Deno applications on Digital Ocean. Learn about Docker containerization, GitHub Actions automation, container registries, and how to set up continuous deployment workflows.

URL: https://docs.deno.com/examples/tutorials/digital_ocean

Digital Ocean is a popular cloud infrastructure provider offering a variety of hosting services ranging from networking, to compute, to storage. Here's a step-by-step guide to deploying a Deno app to Digital Ocean using Docker and GitHub Actions.

The prerequisites for this are:

- [`docker` CLI](https://docs.docker.com/engine/reference/commandline/cli/)
- a [GitHub account](https://github.com)
- a [Digital Ocean account](https://digitalocean.com)
- [`doctl` CLI](https://docs.digitalocean.com/reference/doctl/how-to/install/)

## Create Dockerfile and docker-compose.yml

To focus on the deployment, our app will simply be a `main.ts` file that returns a string as an HTTP response:

```ts title="main.ts"
import { Application } from "jsr:@oak/oak";

const app = new Application();

app.use((ctx) => {
  ctx.response.body = "Hello from Deno and Digital Ocean!";
});

await app.listen({ port: 8000 });
```

Then, we'll create two files -- `Dockerfile` and `docker-compose.yml` -- to build the Docker image.
In our `Dockerfile`, let's add: ```Dockerfile title="Dockerfile" FROM denoland/deno EXPOSE 8000 WORKDIR /app ADD . /app RUN deno install --entrypoint main.ts CMD ["run", "--allow-net", "main.ts"] ``` Then, in our `docker-compose.yml`: ```yml version: "3" services: web: build: . container_name: deno-container image: deno-image ports: - "8000:8000" ``` Let's test this locally by running `docker compose -f docker-compose.yml build`, then `docker compose up`, and going to `localhost:8000`. ![Hello from localhost](./images/how-to/digital-ocean/hello-world-from-localhost.png) It works! ## Build, Tag, and Push your Docker image to Digital Ocean Container Registry Digital Ocean has its own private Container Registry, with which we can push and pull Docker images. In order to use this registry, let's [install and authenticate `doctl` on the command line](https://docs.digitalocean.com/reference/doctl/how-to/install/). After that, we'll create a new private registry named `deno-on-digital-ocean`: ```shell doctl registry create deno-on-digital-ocean ``` Using our Dockerfile and docker-compose.yml, we'll build a new image, tag it, and push it to the registry. Note that `docker-compose.yml` will name the build locally as `deno-image`. ```shell docker compose -f docker-compose.yml build ``` Let's [tag](https://docs.docker.com/engine/reference/commandline/tag/) it with `new`: ```shell docker tag deno-image registry.digitalocean.com/deno-on-digital-ocean/deno-image:new ``` Now we can push it to the registry. ```shell docker push registry.digitalocean.com/deno-on-digital-ocean/deno-image:new ``` You should see your new `deno-image` with the `new` tag in your [Digital Ocean container registry](https://cloud.digitalocean.com/registry): ![New deno image on Digital Ocean container registry](./images/how-to/digital-ocean/new-deno-image-on-digital-ocean-container-registry.png) Perfect! 
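The tag command above combines the registry path, image name, and tag into one fully qualified reference of the form `registry/repository:tag`. A quick shell sketch of that naming scheme (the values mirror the tutorial's; any registry, name, and tag work the same way):

```shell
# Compose a fully qualified image reference: registry/repository:tag
REGISTRY="registry.digitalocean.com/deno-on-digital-ocean"
IMAGE_NAME="deno-image"
TAG="new"

REF="${REGISTRY}/${IMAGE_NAME}:${TAG}"
echo "$REF"
# registry.digitalocean.com/deno-on-digital-ocean/deno-image:new
```

This is the same reference you pass to `docker tag`, `docker push`, and later `docker run` on the Droplet.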
## Deploy to Digital Ocean via SSH Once our `deno-image` is in the registry, we can run it anywhere using `docker run`. In this case, we'll run it while in our [Digital Ocean Droplet](https://www.digitalocean.com/products/droplets), their hosted virtual machine. While on your [Droplet page](https://cloud.digitalocean.com/droplets), click on your Droplet and then `console` to SSH into the virtual machine. (Or you can [ssh directly from your command line](https://docs.digitalocean.com/products/droplets/how-to/connect-with-ssh/).) To pull down the `deno-image` image and run it, let's run: ```shell docker run -d --restart always -it -p 8000:8000 --name deno-image registry.digitalocean.com/deno-on-digital-ocean/deno-image:new ``` Using our browser to go to the Digital Ocean address, we now see: ![Hello from Deno and Digital Ocean](./images/how-to/digital-ocean/hello-from-deno-and-digital-ocean.png) Boom! ## Automate the Deployment via GitHub Actions Let's automate that entire process with GitHub actions. First, let's get all of our environmental variables needed for logging into `doctl` and SSHing into the Droplet: - [DIGITALOCEAN_ACCESS_TOKEN](https://docs.digitalocean.com/reference/api/create-personal-access-token/) - DIGITALOCEAN_HOST (the IP address of your Droplet) - DIGITALOCEAN_USERNAME (the default is `root`) - DIGITALOCEAN_SSHKEY (more on this below) ### Generate `DIGITALOCEAN_SSHKEY` The `DIGITALOCEAN_SSHKEY` is a private key where its public counterpart exists on the virtual machine in its `~/.ssh/authorized_keys` file. To do this, first let's run `ssh-keygen` on your local machine: ```shell ssh-keygen ``` When prompted for an email, **be sure to use your GitHub email** for the GitHub Action to authenticate properly. 
Your final output should look something like this:

```console
Output
Your identification has been saved in /your_home/.ssh/id_rsa
Your public key has been saved in /your_home/.ssh/id_rsa.pub
The key fingerprint is:
SHA256:/hk7MJ5n5aiqdfTVUZr+2Qt+qCiS7BIm5Iv0dxrc3ks user@host
The key's randomart image is:
+---[RSA 3072]----+
|                .|
|               + |
|              +  |
|        . o .    |
|o      S . o     |
| + o. .oo. ..  .o|
|o = oooooEo+ ...o|
|.. o *o+=.*+o....|
|    =+=ooB=o.... |
+----[SHA256]-----+
```

Next, we'll have to upload the newly generated public key to your Droplet. You can either use [`ssh-copy-id`](https://www.ssh.com/academy/ssh/copy-id), or copy it manually: ssh into your Droplet and paste it into `~/.ssh/authorized_keys`.

Using `ssh-copy-id`:

```shell
ssh-copy-id {{ username }}@{{ host }}
```

This command will prompt you for the password. Note that this will automatically copy the `id_rsa.pub` key from your local machine and append it to your Droplet's `~/.ssh/authorized_keys` file. If you've named your key something other than `id_rsa`, you can pass it with the `-i` flag to the command:

```shell
ssh-copy-id -i ~/.ssh/mykey {{ username }}@{{ host }}
```

To test whether this was done successfully:

```shell
ssh -i ~/.ssh/mykey {{ username }}@{{ host }}
```

Awesome!

### Define the yml File

The final step is to put this all together.
We're basically taking each step of the manual deployment and adding it to a GitHub Actions workflow yml file:

```yml
name: Deploy to Digital Ocean

on:
  push:
    branches:
      - main

env:
  REGISTRY: "registry.digitalocean.com/deno-on-digital-ocean"
  IMAGE_NAME: "deno-image"

jobs:
  build_and_push:
    name: Build, Push, and Deploy
    runs-on: ubuntu-latest
    steps:
      - name: Checkout main
        uses: actions/checkout@v4

      - name: Set $TAG from shortened sha
        run: echo "TAG=`echo ${GITHUB_SHA} | cut -c1-8`" >> $GITHUB_ENV

      - name: Build container image
        run: docker compose -f docker-compose.yml build

      - name: Tag container image
        run: docker tag ${{ env.IMAGE_NAME }} ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ env.TAG }}

      - name: Install `doctl`
        uses: digitalocean/action-doctl@v2
        with:
          token: ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}

      - name: Log in to Digital Ocean Container Registry
        run: doctl registry login --expiry-seconds 600

      - name: Push image to Digital Ocean Container Registry
        run: docker push ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ env.TAG }}

      - name: Deploy via SSH
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.DIGITALOCEAN_HOST }}
          username: ${{ secrets.DIGITALOCEAN_USERNAME }}
          key: ${{ secrets.DIGITALOCEAN_SSHKEY }}
          script: |
            # Log in to Digital Ocean Container Registry
            docker login -u ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }} -p ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }} registry.digitalocean.com
            # Stop and remove the running container
            docker stop ${{ env.IMAGE_NAME }}
            docker rm ${{ env.IMAGE_NAME }}
            # Run a new container from the new image
            docker run -d --restart always -it -p 8000:8000 --name ${{ env.IMAGE_NAME }} ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ env.TAG }}
```

When you push to GitHub, this yml file is automatically detected, triggering the Deploy action.

---

# Build a Database App with Drizzle ORM and Deno

> Step-by-step guide to building database applications with Drizzle ORM and Deno.
Learn about schema management, type-safe queries, PostgreSQL integration, migrations, and how to implement CRUD operations. URL: https://docs.deno.com/examples/tutorials/drizzle [Drizzle ORM](https://orm.drizzle.team/) is a TypeScript ORM that provides a type-safe way to interact with your database. In this tutorial, we'll set up Drizzle ORM with Deno and PostgreSQL to create, read, update, and delete dinosaur data: - [Install Drizzle](#install-drizzle) - [Configure Drizzle](#configure-drizzle) - [Define schemas](#define-schemas) - [Interact with the database](#interact-with-the-database) - [What's next?](#whats-next) You can find all the code for this tutorial in [this GitHub repo](https://github.com/denoland/examples/tree/main/with-drizzle). ## Install Drizzle First, we'll install the required dependencies using Deno's npm compatibility. We'll be using Drizzle with [Postgres](https://orm.drizzle.team/docs/get-started-postgresql), but you can also use [MySQL](https://orm.drizzle.team/docs/get-started-mysql) or [SQLite](https://orm.drizzle.team/docs/get-started-sqlite). (If you don't have Postgres, you can [install it here](https://www.postgresql.org/download/).) ```bash deno install npm:drizzle-orm npm:drizzle-kit npm:pg npm:@types/pg ``` This installs Drizzle ORM and its associated tools — [drizzle-kit](https://orm.drizzle.team/docs/kit-overview) for schema migrations, [pg](https://www.npmjs.com/package/pg) for PostgreSQL connectivity, and [the TypeScript types for PostgreSQL](https://www.npmjs.com/package/@types/pg). These packages will allow us to interact with our database in a type-safe way while maintaining compatibility with Deno's runtime environment. 
It will also create a `deno.json` file in your project root to manage the npm dependencies:

```json
{
  "imports": {
    "@types/pg": "npm:@types/pg@^8.11.10",
    "drizzle-kit": "npm:drizzle-kit@^0.27.2",
    "drizzle-orm": "npm:drizzle-orm@^0.36.0",
    "pg": "npm:pg@^8.13.1"
  }
}
```

## Configure Drizzle

Next, let's create a `drizzle.config.ts` file in your project root. This file will configure Drizzle to work with your PostgreSQL database:

```ts
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  out: "./drizzle",
  schema: "./src/db/schema.ts",
  dialect: "postgresql",
  dbCredentials: {
    url: Deno.env.get("DATABASE_URL")!,
  },
});
```

These config settings determine:

- where to output migration files (`./drizzle`)
- where to find your schema definition (`./src/db/schema.ts`)
- that PostgreSQL is your database dialect, and
- how to connect to your database using the URL stored in your environment variables

`drizzle-kit` will use this configuration to manage your database schema and generate SQL migrations automatically.

We'll also need a `.env` file in the project root containing the `DATABASE_URL` connection string:

```bash
DATABASE_URL=postgresql://[user[:password]@][host][:port]/[dbname]
```

Be sure to replace the login credentials with yours.

Next, let's connect to the database and use Drizzle to populate our tables.

## Define schemas

There are two ways that you can define your table schema with Drizzle. If you already have Postgres tables defined, you can infer them with `pull`; otherwise, you can define them in code, then use Drizzle to create a new table. We'll explore both approaches below.

### Infer schema with `pull`

If you already have Postgres tables before adding Drizzle, then you can introspect your database schema to automatically generate TypeScript types and table definitions with the command [`npm:drizzle-kit pull`](https://orm.drizzle.team/docs/drizzle-kit-pull).
This is particularly useful when working with an existing database, or when you want to ensure your code stays in sync with your database structure.

Let's say our current database already has the following table schemas:

![Diagram of table schema in postgres](./images/how-to/drizzle/table-diagram.png)

We'll run the following command to introspect the database and populate several files under a `./drizzle` directory:
```bash deno --env -A --node-modules-dir npm:drizzle-kit pull Failed to find Response internal state key No config path provided, using default 'drizzle.config.ts' Reading config file '/private/tmp/deno-drizzle-example/drizzle.config.ts' Pulling from ['public'] list of schemas Using 'pg' driver for database querying [✓] 2 tables fetched [✓] 8 columns fetched [✓] 0 enums fetched [✓] 0 indexes fetched [✓] 1 foreign keys fetched [✓] 0 policies fetched [✓] 0 check constraints fetched [✓] 0 views fetched [i] No SQL generated, you already have migrations in project [✓] You schema file is ready ➜ drizzle/schema.ts 🚀 [✓] You relations file is ready ➜ drizzle/relations.ts 🚀 ```
We use the `--env` flag to read the `.env` file with our database URL, and the `--node-modules-dir` flag to create a `node_modules` folder that will allow us to use `drizzle-kit` correctly.

The above command will create a number of files within a `./drizzle` directory that define the schema, track changes, and provide the necessary information for database migrations: - `drizzle/schema.ts`: This file defines the database schema using Drizzle ORM's schema definition syntax. - `drizzle/relations.ts`: This file is intended to define relationships between tables using Drizzle ORM's relations API. - `drizzle/0000_long_veda.sql`: A SQL migration file that contains the SQL code to create the database table(s). The code is commented out — you can uncomment this code if you want to run this migration to create the table(s) in a new environment. - `drizzle/meta/0000_snapshot.json`: A snapshot file that represents the current state of your database schema. - `drizzle/meta/_journal.json`: This file keeps track of the migrations that have been applied to your database. It helps Drizzle ORM know which migrations have been run and which ones still need to be applied. ### Define schema in Drizzle first If you don't already have an existing table defined in Postgres (e.g. you're starting a completely new project), you can define the tables and types in code and have Drizzle create them. Let's create a new directory `./src/db/` and in it, a `schema.ts` file, which we'll populate with the below:
```ts // schema.ts import { boolean, foreignKey, integer, pgTable, serial, text, timestamp, } from "drizzle-orm/pg-core"; export const dinosaurs = pgTable("dinosaurs", { id: serial().primaryKey().notNull(), name: text(), description: text(), }); export const tasks = pgTable("tasks", { id: serial().primaryKey().notNull(), dinosaurId: integer("dinosaur_id"), description: text(), dateCreated: timestamp("date_created", { mode: "string" }).defaultNow(), isComplete: boolean("is_complete"), }, (table) => { return { tasksDinosaurIdFkey: foreignKey({ columns: [table.dinosaurId], foreignColumns: [dinosaurs.id], name: "tasks_dinosaur_id_fkey", }), }; }); ```
The above code represents the two tables, `dinosaurs` and `tasks`, and their relation. Learn more about [defining schemas](https://orm.drizzle.team/docs/sql-schema-declaration) and [relations](https://orm.drizzle.team/docs/relations) in the Drizzle documentation.
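For reference, a relations file for these two tables could look like the following sketch, using Drizzle's `relations()` helper. This mirrors the shape of what `drizzle-kit pull` generates, but treat it as illustrative rather than generated output:

```typescript
// relations.ts (sketch): declares the one-to-many link between
// dinosaurs and tasks for Drizzle's relational query API.
import { relations } from "drizzle-orm/relations";
import { dinosaurs, tasks } from "./schema.ts";

// Each task belongs to one dinosaur, joined on tasks.dinosaur_id.
export const tasksRelations = relations(tasks, ({ one }) => ({
  dinosaur: one(dinosaurs, {
    fields: [tasks.dinosaurId],
    references: [dinosaurs.id],
  }),
}));

// Each dinosaur can have many tasks.
export const dinosaursRelations = relations(dinosaurs, ({ many }) => ({
  tasks: many(tasks),
}));
```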

Once we have defined `./src/db/schema.ts`, we can create the tables and their specified relationship by creating a migration:

```bash
deno -A --node-modules-dir npm:drizzle-kit generate
Failed to find Response internal state key
No config path provided, using default 'drizzle.config.ts'
Reading config file '/private/tmp/drizzle/drizzle.config.ts'
2 tables
dinosaurs 3 columns 0 indexes 0 fks
tasks 5 columns 0 indexes 1 fks
```

The above command will create a `./drizzle/` folder that contains migration scripts and logs.

## Interact with the database

Now that we have set up Drizzle ORM, we can use it to simplify managing data in our Postgres database. First, Drizzle suggests taking the `schema.ts` and `relations.ts` files and copying them to the `./src/db` directory to use within an application.

Let's create a `./src/db/db.ts` which exports a few helper functions that'll make it easier for us to interact with the database:

```ts
import { drizzle } from "drizzle-orm/node-postgres";
import { dinosaurs as dinosaurSchema, tasks as taskSchema } from "./schema.ts";
import { dinosaursRelations, tasksRelations } from "./relations.ts";
import { eq } from "drizzle-orm";
import pg from "pg";

// Use pg driver.
const { Pool } = pg;

// Instantiate Drizzle client with pg driver and schema.
export const db = drizzle({
  client: new Pool({
    connectionString: Deno.env.get("DATABASE_URL"),
  }),
  schema: { dinosaurSchema, taskSchema, dinosaursRelations, tasksRelations },
});

// Insert dinosaur.
export async function insertDinosaur(
  dinosaurObj: typeof dinosaurSchema.$inferInsert,
) {
  return await db.insert(dinosaurSchema).values(dinosaurObj);
}

// Insert task.
export async function insertTask(taskObj: typeof taskSchema.$inferInsert) {
  return await db.insert(taskSchema).values(taskObj);
}

// Find dinosaur by id.
export async function findDinosaurById(dinosaurId: number) {
  return await db.select().from(dinosaurSchema).where(
    eq(dinosaurSchema.id, dinosaurId),
  );
}

// Find dinosaur by name.
export async function findDinosaurByName(name: string) {
  return await db.select().from(dinosaurSchema).where(
    eq(dinosaurSchema.name, name),
  );
}

// Find tasks based on dinosaur id.
export async function findDinosaurTasksByDinosaurId(dinosaurId: number) {
  return await db.select().from(taskSchema).where(
    eq(taskSchema.dinosaurId, dinosaurId),
  );
}

// Update dinosaur.
export async function updateDinosaur(
  dinosaurObj: typeof dinosaurSchema.$inferSelect,
) {
  return await db.update(dinosaurSchema).set(dinosaurObj).where(
    eq(dinosaurSchema.id, dinosaurObj.id),
  );
}

// Update task.
export async function updateTask(taskObj: typeof taskSchema.$inferSelect) {
  return await db.update(taskSchema).set(taskObj).where(
    eq(taskSchema.id, taskObj.id),
  );
}

// Delete dinosaur by id.
export async function deleteDinosaurById(id: number) {
  return await db.delete(dinosaurSchema).where(
    eq(dinosaurSchema.id, id),
  );
}

// Delete task by id.
export async function deleteTask(id: number) {
  return await db.delete(taskSchema).where(eq(taskSchema.id, id));
}
```

Now we can import some of these helper functions into a script where we can perform some simple CRUD operations on our database. Let's create a new file `./src/script.ts`:

```ts
import {
  deleteDinosaurById,
  findDinosaurByName,
  insertDinosaur,
  insertTask,
  updateDinosaur,
} from "./db/db.ts";

// Create a new dinosaur.
await insertDinosaur({
  name: "Denosaur",
  description: "Dinosaurs should be simple.",
});

// Find that dinosaur by name. The query returns an array of rows.
const res = await findDinosaurByName("Denosaur");

// Create a task with that dinosaur by its id.
await insertTask({
  dinosaurId: res[0].id,
  description: "Remove unnecessary config.",
  isComplete: false,
});

// Update a dinosaur with a new description.
const newDeno = {
  id: res[0].id,
  name: "Denosaur",
  description: "The simplest dinosaur.",
};
await updateDinosaur(newDeno);

// Delete the dinosaur (and any tasks it has).
await deleteDinosaurById(res[0].id);
```

We can run it and it will perform all of the actions on the database:

```sh
deno -A --env ./src/script.ts
```

## What's next?

Drizzle ORM is a popular data mapping tool to simplify managing and maintaining data models and working with your database. Hopefully, this tutorial gives you a start on how to use Drizzle in your Deno projects.

Now that you have a basic understanding of how to use Drizzle ORM with Deno, you could:

1. Add more complex database relationships
2. [Implement a REST API](https://docs.deno.com/examples/) using [Hono](https://jsr.io/@hono/hono) to serve your dinosaur data
3. Add validation and error handling to your database operations
4. Write tests for your database interactions
5. [Deploy your application to the cloud](https://docs.deno.com/runtime/tutorials/#deploying-deno-projects)

🦕 Happy coding with Deno and Drizzle ORM! The type-safety and simplicity of this stack make it a great choice for building modern web applications.

---

# How to use Express with Deno

> Step-by-step guide to using Express.js with Deno. Learn how to set up an Express server, configure routes, handle middleware, and build REST APIs using Deno's Node.js compatibility features.

URL: https://docs.deno.com/examples/tutorials/express

[Express](https://expressjs.com/) is a popular web framework known for being simple and unopinionated, with a large ecosystem of middleware. This How To guide will show you how to create a simple API using Express and Deno.
[View source here.](https://github.com/denoland/tutorial-with-express)

## Initialize a new Deno project

In your command line, run the command to create a new starter project, then navigate into the project directory:

```sh
deno init my-express-project
cd my-express-project
```

## Install Express

To install Express, we'll use the `npm:` module specifier. This specifier allows us to import modules from npm:

```sh
deno add npm:express
```

This will add the latest `express` package to the `imports` field in your `deno.json` file. Now you can import `express` in your code with `import express from "express";`.

## Update `main.ts`

In `main.ts`, let's create a simple server:

```ts
import express from "express";

const app = express();

app.get("/", (req, res) => {
  res.send("Welcome to the Dinosaur API!");
});

app.listen(8000);
console.log(`Server is running on http://localhost:8000`);
```

You may notice that your editor is complaining about the `req` and `res` parameters. This is because Deno does not have types for the `express` module. To fix this, you can import the Express types file directly from npm. Add the following comment to the top of your `main.ts` file:

```ts
// @ts-types="npm:@types/express@4.17.15"
```

This comment tells Deno to use the types from the `@types/express` package.

## Run the server

When you initialized the project, Deno set up a task which will run the `main.ts` file. You can see it in the `deno.json` file. Update the `dev` task to include the [`--allow-net`](/runtime/fundamentals/security/#network-access) flag:

```jsonc
{
  "tasks": {
    "dev": "deno run --allow-net main.ts"
  },
  ...
}
```

This will allow the project to make network requests. You can [read more about permissions flags](/runtime/fundamentals/security/).

Now you can run the server with:

```sh
deno run dev
```

If you visit `localhost:8000` in your browser, you should see: **Welcome to the Dinosaur API!**

## Add data and routes

The next step here is to add some data.
We'll use this Dinosaur data that we found from [this article](https://www.thoughtco.com/dinosaurs-a-to-z-1093748). Feel free to [copy it from here](https://raw.githubusercontent.com/denoland/tutorial-with-express/refs/heads/main/data.json). Create a `data.json` file in the root of your project, and paste in the dinosaur data. Next, we'll import that data into `main.ts`: ```ts import data from "./data.json" with { type: "json" }; ``` We will create the routes to access that data. To keep it simple, let's just define `GET` handlers for `/api/` and `/api/:dinosaur`. Add the following code after the `const app = express();` line: ```ts app.get("/", (req, res) => { res.send("Welcome to the Dinosaur API!"); }); app.get("/api", (req, res) => { res.send(data); }); app.get("/api/:dinosaur", (req, res) => { if (req?.params?.dinosaur) { const found = data.find((item) => item.name.toLowerCase() === req.params.dinosaur.toLowerCase() ); if (found) { res.send(found); } else { res.send("No dinosaurs found."); } } }); app.listen(8000); console.log(`Server is running on http://localhost:8000`); ``` Let's run the server with `deno run dev` and check out `localhost:8000/api` in your browser. You should see a list of dinosaurs! ```jsonc [ { "name": "Aardonyx", "description": "An early stage in the evolution of sauropods." }, { "name": "Abelisaurus", "description": "\"Abel's lizard\" has been reconstructed from a single skull." }, { "name": "Abrictosaurus", "description": "An early relative of Heterodontosaurus." }, ... ``` You can also get the details of a specific dinosaur by visiting "/api/dinosaur name", for example `localhost:8000/api/aardonyx` will display: ```json { "name": "Aardonyx", "description": "An early stage in the evolution of sauropods." } ``` 🦕 Now you're all set to use Express with Deno. You could consider expanding this example into a dinosaur web app. Or take a look at [Deno's built in HTTP server](https://docs.deno.com/runtime/fundamentals/http_server/). 
--- # Fetch and stream data > A tutorial on working with network requests in Deno. Learn how to use the fetch API for HTTP requests, handle responses, implement data streaming, and manage file uploads and downloads. URL: https://docs.deno.com/examples/tutorials/fetch_data Deno brings several familiar Web APIs to the server-side environment. If you've worked with browsers you may recognize the [`fetch()`](/api/web/fetch) method and the [`streams`](/api/web/streams) API, which are used to make network requests and access streams of data over the network. Deno implements these APIs, allowing you to fetch and stream data from the web. ## Fetching data When building a web application, developers will often need to retrieve resources from somewhere else on the web. We can do so with the `fetch` API. We'll look at how to fetch different shapes of data from a url and how to handle an error if the request fails. Create a new file called `fetch.js` and add the following code: ```ts title="fetch.js" // Output: JSON Data const jsonResponse = await fetch("https://api.github.com/users/denoland"); const jsonData = await jsonResponse.json(); console.log(jsonData, "\n"); // Output: HTML Data const textResponse = await fetch("https://deno.land/"); const textData = await textResponse.text(); console.log(textData, "\n"); // Output: Error Message try { await fetch("https://does.not.exist/"); } catch (error) { console.log(error); } ``` You can run this code with the `deno run` command. Because it is fetching data across the network, you need to grant the `--allow-net` permission: ```sh deno run --allow-net fetch.js ``` You should see the JSON data, HTML data as text, and an error message in the console. ## Streaming data Sometimes you may want to send or receive large files over the network. When you don't know the size of a file in advance, streaming is a more efficient way to handle the data. The client can read from the stream until it says it is done. 
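That read-until-done contract can be demonstrated with a minimal `ReadableStream`, a standard web API available in Deno and modern browsers. This is a standalone sketch, not part of the tutorial files:

```typescript
// Build a small stream of text chunks.
const stream = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("hello, ");
    controller.enqueue("streams");
    controller.close();
  },
});

// Read chunks until the stream reports it is done.
const reader = stream.getReader();
let received = "";
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  received += value;
}
console.log(received); // "hello, streams"
```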
Deno provides a way to stream data using the Streams API. We'll look at how to convert a file into a readable or writable stream, and how to send and receive files using streams.

Create a new file called `stream.js`.

We'll use the `fetch` API to retrieve a file. Then we'll use the [`Deno.open`](/api/deno/Deno.open) method to create and open a writable file, and the [`pipeTo`](/api/web/~/ReadableStream.pipeTo) method from the Streams API to send the byte stream to the created file. Next, we'll use the `readable` property on a `POST` request to send the byte stream of the file to a server.

```ts title="stream.js"
// Receiving a file
const fileResponse = await fetch("https://deno.land/logo.svg");

if (fileResponse.body) {
  const file = await Deno.open("./logo.svg", { write: true, create: true });
  await fileResponse.body.pipeTo(file.writable);
}

// Sending a file
const file = await Deno.open("./logo.svg", { read: true });

await fetch("https://example.com/", {
  method: "POST",
  body: file.readable,
});
```

You can run this code with the `deno run` command. Because it is fetching data across the network and writing to a file, you need to grant the `--allow-net`, `--allow-write` and `--allow-read` permissions:

```sh
deno run --allow-read --allow-write --allow-net stream.js
```

You should see the file `logo.svg` created and populated in the current directory and, if you owned `example.com`, you would see the file being sent to the server.

🦕 Now you know how to fetch and stream data across a network and how to stream that data to and from files! Whether you're serving static files, processing uploads, generating dynamic content or streaming large datasets, Deno's file handling and streaming capabilities are great tools to have in your developer toolbox!

---

# File-based routing

> Tutorial on implementing file-based routing in Deno.
Learn how to create a dynamic routing system similar to Next.js, handle HTTP methods, manage nested routes, and build a flexible server architecture.

URL: https://docs.deno.com/examples/tutorials/file_based_routing

If you've used frameworks like [Next.js](https://nextjs.org/), you might be familiar with file-based routing: you add a file in a specific directory and it automatically becomes a route. This tutorial demonstrates how to create a simple HTTP server that uses file-based routing.

## Route requests

Create a new file called `server.ts`. This file will be used to route requests. Set up an async function called `handler` that takes a request object as an argument:

```ts title="server.ts"
async function handler(req: Request): Promise<Response> {
  const url = new URL(req.url);
  const path = url.pathname;
  const method = req.method;
  let module;
  try {
    module = await import(`.${path}.ts`);
  } catch (_error) {
    return new Response("Not found", { status: 404 });
  }
  if (module[method]) {
    return module[method](req);
  }
  return new Response("Method not implemented", { status: 501 });
}

Deno.serve(handler);
```

The `handler` function sets up a path variable which contains the path extracted from the request URL, and a method variable which contains the request method. It then tries to import a module based on the path. If the module is not found, it returns a 404 response. If the module is found, it checks whether the module has a handler for the request method. If it does, it calls that handler with the request object; otherwise, it returns a 501 response. Finally, it serves the handler function using `Deno.serve`.

> The path could be any valid URL path such as `/users`, `/posts`, etc. For
> paths like `/users`, the file `./users.ts` will be imported. However, deeper
> paths like `/org/users` will require a file `./org/users.ts`. You can create
> nested routes by creating nested directories and files.
## Handle requests

Create a new file called `users.ts` in the same directory as `server.ts`. This file will be used to handle requests to the `/users` path. We'll use a `GET` request as an example. You could add more HTTP methods such as `POST`, `PUT`, `DELETE`, etc.

In `users.ts`, export a function called `GET` that takes a request object as an argument:

```ts title="users.ts"
export function GET(_req: Request): Response {
  return new Response("Hello from users.ts", { status: 200 });
}
```

## Start the server

To start the server, run the following command:

```sh
deno run --allow-net --allow-read server.ts
```

This will start the server on `localhost:8000`. You can now make a `GET` request to `localhost:8000/users`, and you should see the response `Hello from users.ts`.

This command requires the `--allow-net` and `--allow-read` [permissions flags](/runtime/fundamentals/security/) to allow access to the network to start the server and to read the `users.ts` file from the file system.

🦕 Now you can set up routing in your apps based on file structure. You can extend this example to add more routes and methods as needed.

Thanks to [@naishe](https://github.com/naishe) for contributing this tutorial.

---

# Write a file server

> Tutorial on building a file server with Deno. Learn how to handle HTTP requests, serve static files, implement streaming responses, and use the standard library's file server module for production deployments.

URL: https://docs.deno.com/examples/tutorials/file_server

A file server listens for incoming HTTP requests and serves files from the local file system. This tutorial demonstrates how to create a simple file server using Deno's built-in [file system APIs](/api/deno/file-system).

## Write a simple File Server

To start, create a new file called `file-server.ts`. We'll use Deno's built-in [HTTP server](/api/deno/~/Deno.serve) to listen for incoming requests.
In your new `file-server.ts` file, add the following code:

```ts title="file-server.ts"
Deno.serve(
  { hostname: "localhost", port: 8080 },
  async (request) => {
    const url = new URL(request.url);
    const filepath = decodeURIComponent(url.pathname);
    // We'll open the requested file and stream it back in the next step.
  },
);
```

> If you're not familiar with the `URL` object, you can learn more about it in
> the [URL API](https://developer.mozilla.org/en-US/docs/Web/API/URL)
> documentation. The
> [decodeURIComponent function](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent)
> is used to decode the URL-encoded path, in the case that characters have been
> percent-encoded.

### Open a file and stream its contents

When a request is received, we'll attempt to open the file specified in the request URL with [`Deno.open`](/api/deno/~/Deno.open). If the requested file exists, we'll convert it into a readable stream of data with the [ReadableStream API](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream), and stream its contents to the response. We don't know how large the requested file might be, so streaming it will prevent memory issues when serving large files or multiple requests concurrently. If the file does not exist, we'll return a "404 Not Found" response.

In the body of the request handler, below the two variables, add the following code:

```ts
try {
  const file = await Deno.open("." + filepath, { read: true });
  return new Response(file.readable);
} catch {
  return new Response("404 Not Found", { status: 404 });
}
```

### Run the file server

Run your new file server with the `deno run` command, allowing read access and network access:

```shell
deno run --allow-read=. --allow-net file-server.ts
```

## Using the file server provided by the Deno Standard Library

Writing a file server from scratch is a good exercise to understand how Deno's HTTP server works. However, writing a production-ready file server from scratch can be complex and error-prone.
It's better to use a tested and reliable solution. The Deno Standard Library provides you with a [file server](https://jsr.io/@std/http/doc/file-server/~) so that you don't have to write your own. To use it, first install the remote script to your local file system: ```shell # Deno 1.x deno install --allow-net --allow-read jsr:@std/http/file-server # Deno 2.x deno install --global --allow-net --allow-read jsr:@std/http/file-server ``` > This will install the script to the Deno installation root, e.g. > `/home/user/.deno/bin/file-server`. You can now run the script with the simplified script name: ```shell $ file-server . Listening on: - Local: http://0.0.0.0:8000 ``` To see the complete list of options available with the file server, run `file-server --help`. If you visit [http://0.0.0.0:8000/](http://0.0.0.0:8000/) in your web browser you will see the contents of your local directory. ### Using the @std/http file server in a Deno project To use the file-server in a [Deno project](/runtime/getting_started/first_project), you can add it to your `deno.json` file with: ```sh deno add jsr:@std/http ``` And then import it in your project: ```ts title="file-server.ts" import { serveDir } from "@std/http/file-server"; Deno.serve((req) => { const pathname = new URL(req.url).pathname; if (pathname.startsWith("/static")) { return serveDir(req, { fsRoot: "path/to/static/files/dir", }); } return new Response(); }); ``` This code will set up an HTTP server with `Deno.serve`. When a request comes in, it checks if the requested path starts with “/static”. If so, it serves files from the specified directory. Otherwise, it responds with an empty response. 🦕 Now you know how to write your own simple file server, and how to use the file-server utility provided by the Deno Standard Library. You're equipped to tackle a whole variety of tasks - whether it’s serving static files, handling uploads, transforming data, or managing access control - you're ready to serve files with Deno. 
--- # File system events > Tutorial on monitoring file system changes with Deno. Learn how to watch directories for file modifications, handle change events, and understand platform-specific behaviors across Linux, macOS, and Windows. URL: https://docs.deno.com/examples/tutorials/file_system_events ## Concepts - Use [Deno.watchFs](https://docs.deno.com/api/deno/~/Deno.watchFs) to watch for file system events. - Results may vary between operating systems. ## Example To poll for file system events in the current directory: ```ts title="watcher.ts" const watcher = Deno.watchFs("."); for await (const event of watcher) { console.log(">>>> event", event); // Example event: { kind: "create", paths: [ "/home/alice/deno/foo.txt" ] } } ``` Run with: ```shell deno run --allow-read watcher.ts ``` Now try adding, removing and modifying files in the same directory as `watcher.ts`. Note that the exact ordering of the events can vary between operating systems. This feature uses different syscalls depending on the platform: - Linux: [inotify](https://man7.org/linux/man-pages/man7/inotify.7.html) - macOS: [FSEvents](https://developer.apple.com/library/archive/documentation/Darwin/Conceptual/FSEvents_ProgGuide/Introduction/Introduction.html) - Windows: [ReadDirectoryChangesW](https://docs.microsoft.com/en-us/windows/win32/api/winbase/nf-winbase-readdirectorychangesw) --- # How to deploy to Google Cloud Run > Step-by-step guide to deploying Deno applications on Google Cloud Run. Learn about Docker containerization, Artifact Registry configuration, GitHub Actions automation, and how to set up continuous deployment to Google Cloud. URL: https://docs.deno.com/examples/tutorials/google_cloud_run [Google Cloud Run](https://cloud.google.com/run) is a managed compute platform that lets you run containers on Google's scalable infrastructure. This How To guide will show you how to use Docker to deploy your Deno app to Google Cloud Run. 
First, we'll show you how to deploy manually, then we'll show you how to automate it with GitHub Actions. Prerequisites: - [Google Cloud Platform account](https://cloud.google.com/gcp) - [`docker` CLI](https://docs.docker.com/engine/reference/commandline/cli/) installed - [`gcloud`](https://cloud.google.com/sdk/gcloud) installed ## Manual Deployment ### Create `Dockerfile` and `docker-compose.yml` To focus on the deployment, our app will simply be a `main.ts` file that returns a string as an HTTP response: ```ts title="main.ts" import { Application } from "jsr:@oak/oak"; const app = new Application(); app.use((ctx) => { ctx.response.body = "Hello from Deno and Google Cloud Run!"; }); await app.listen({ port: 8000 }); ``` Then, we'll create two files -- `Dockerfile` and `docker-compose.yml` -- to build the Docker image. In our `Dockerfile`, let's add: ```Dockerfile FROM denoland/deno EXPOSE 8000 WORKDIR /app ADD . /app RUN deno install --entrypoint main.ts CMD ["run", "--allow-net", "main.ts"] ``` Then, in our `docker-compose.yml`: ```yml version: "3" services: web: build: . container_name: deno-container image: deno-image ports: - "8000:8000" ``` Let's test this locally by running `docker compose -f docker-compose.yml build`, then `docker compose up`, and going to `localhost:8000`. ![Hello from localhost](./images/how-to/google-cloud-run/hello-world-from-localhost.png) It works! ### Set up Artifact Registry Artifact Registry is GCP's private registry of Docker images. Before we can use it, go to GCP's [Artifact Registry](https://console.cloud.google.com/artifacts) and click "Create repository". You'll be asked for a name (`deno-repository`) and a region (`us-central1`). Then click "Create". ![New repository in Google Artifact Repository](./images/how-to/google-cloud-run/new-repository-in-google-artifact-repository.png) ### Build, Tag, and Push to Artifact Registry Once we've created a repository, we can start pushing images to it.
First, let's add the registry's address to `gcloud`: ```shell gcloud auth configure-docker us-central1-docker.pkg.dev ``` Then, let's build your Docker image. (Note that the image name is defined in our `docker-compose.yml` file.) ```shell docker compose -f docker-compose.yml build ``` Then, [tag](https://docs.docker.com/engine/reference/commandline/tag/) it with the new Google Artifact Registry address, repository, and name. The image name should follow this structure: `{{ location }}-docker.pkg.dev/{{ google_cloudrun_project_name }}/{{ repository }}/{{ image }}`. ```shell docker tag deno-image us-central1-docker.pkg.dev/deno-app-368305/deno-repository/deno-cloudrun-image ``` If you don't specify a tag, it'll use `:latest` by default. Next, push the image: ```shell docker push us-central1-docker.pkg.dev/deno-app-368305/deno-repository/deno-cloudrun-image ``` _[More info on how to push and pull images to Google Artifact Registry](https://cloud.google.com/artifact-registry/docs/docker/pushing-and-pulling)._ Your image should now appear in your Google Artifact Registry! ![Image in Google Artifact Registry](./images/how-to/google-cloud-run/image-in-google-artifact-registry.png) ### Create a Google Cloud Run Service We need a service where we can run these images, so let's go to [Google Cloud Run](https://console.cloud.google.com/run) and click "Create Service". Let's name it "hello-from-deno". Select "Deploy one revision from an existing container image". Use the drop-down to select the image from the `deno-repository` Artifact Registry. Select "allow unauthenticated requests" and then click "Create service". Make sure the port is `8000`. When it's done, your app should now be live: ![Hello from Google Cloud Run](./images/how-to/google-cloud-run/hello-from-google-cloud-run.png) Awesome! ### Deploy with `gcloud` Now that it's created, we'll be able to deploy to this service from the `gcloud` CLI.
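The image naming structure described above can be captured in a small illustrative helper — purely hypothetical, just to make the required pieces and the implicit `:latest` default explicit:

```ts
// Compose a fully qualified Artifact Registry image name:
// {location}-docker.pkg.dev/{project}/{repository}/{image}:{tag}
function imageName(
  location: string,
  project: string,
  repository: string,
  image: string,
  tag = "latest", // docker applies :latest when no tag is given
): string {
  return `${location}-docker.pkg.dev/${project}/${repository}/${image}:${tag}`;
}

// imageName("us-central1", "deno-app-368305", "deno-repository", "deno-cloudrun-image")
// → "us-central1-docker.pkg.dev/deno-app-368305/deno-repository/deno-cloudrun-image:latest"
```

Every `docker tag`, `docker push`, and `gcloud run deploy --image=` invocation in this guide uses a name of exactly this shape.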
The command follows this structure: `gcloud run deploy {{ service_name }} --image={{ image }} --region={{ region }} --allow-unauthenticated`. Note that the `image` name follows the structure from above. For this example, the command is: ```shell gcloud run deploy hello-from-deno --image=us-central1-docker.pkg.dev/deno-app-368305/deno-repository/deno-cloudrun-image --region=us-central1 --allow-unauthenticated ``` ![Hello from Google Cloud Run](./images/how-to/google-cloud-run/hello-from-google-cloud-run.png) Success! ## Automate Deployment with GitHub Actions In order for automation to work, we first need to make sure that both of these have been created: - the Google Artifact Registry - the Google Cloud Run service instance (If you haven't done that, please see the sections above.) Now that we have done that, we can automate it with a GitHub workflow. Here's the yaml file: ```yml name: Build and Deploy to Cloud Run on: push: branches: - main env: PROJECT_ID: {{ PROJECT_ID }} GAR_LOCATION: {{ GAR_LOCATION }} REPOSITORY: {{ GAR_REPOSITORY }} SERVICE: {{ SERVICE }} REGION: {{ REGION }} jobs: deploy: name: Deploy permissions: contents: "read" id-token: "write" runs-on: ubuntu-latest steps: - name: Checkout uses: actions/checkout@v4 - name: Google Auth id: auth uses: "google-github-actions/auth@v0" with: credentials_json: "${{ secrets.GCP_CREDENTIALS }}" - name: Login to GAR uses: docker/login-action@v2.1.0 with: registry: ${{ env.GAR_LOCATION }}-docker.pkg.dev username: _json_key password: ${{ secrets.GCP_CREDENTIALS }} - name: Build and Push Container run: |- docker build -t "${{ env.GAR_LOCATION }}-docker.pkg.dev/${{ env.PROJECT_ID }}/${{ env.REPOSITORY }}/${{ env.SERVICE }}:${{ github.sha }}" ./ docker push "${{ env.GAR_LOCATION }}-docker.pkg.dev/${{ env.PROJECT_ID }}/${{ env.REPOSITORY }}/${{ env.SERVICE }}:${{ github.sha }}" - name: Deploy to Cloud Run id: deploy uses: google-github-actions/deploy-cloudrun@v0 with: service: ${{ env.SERVICE }} region: ${{
env.REGION }} image: ${{ env.GAR_LOCATION }}-docker.pkg.dev/${{ env.PROJECT_ID }}/${{ env.REPOSITORY }}/${{ env.SERVICE }}:${{ github.sha }} - name: Show Output run: echo ${{ steps.deploy.outputs.url }} ``` The environment variables that we need to set are (the examples in parentheses are the ones used in this tutorial): - `PROJECT_ID`: your project id (`deno-app-368305`) - `GAR_LOCATION`: the location your Google Artifact Registry is set (`us-central1`) - `GAR_REPOSITORY`: the name you gave your Google Artifact Registry (`deno-repository`) - `SERVICE`: the name of the Google Cloud Run service (`hello-from-deno`) - `REGION`: the region of your Google Cloud Run service (`us-central1`) The secret variables that we need to set are: - `GCP_CREDENTIALS`: this is the [service account](https://cloud.google.com/iam/docs/service-accounts) JSON key. When you create the service account, be sure to [include the roles and permissions necessary](https://cloud.google.com/iam/docs/granting-changing-revoking-access#granting_access_to_a_user_for_a_service_account) for Artifact Registry and Google Cloud Run. [Check out more details and examples of deploying to Cloud Run from GitHub Actions.](https://github.com/google-github-actions/deploy-cloudrun) For reference: https://github.com/google-github-actions/example-workflows/blob/main/workflows/deploy-cloudrun/cloudrun-docker.yml --- # How to export telemetry data to Grafana > Complete guide to exporting telemetry data with OpenTelemetry and Grafana. Learn how to configure collectors, visualize traces, and monitor application performance. URL: https://docs.deno.com/examples/tutorials/grafana [OpenTelemetry](https://opentelemetry.io/) (often abbreviated as OTel) is an open-source observability framework that provides a standardized way to collect and export telemetry data such as traces, metrics and logs. Deno has built-in support for OpenTelemetry, making it easy to instrument your applications without adding external dependencies.
This integration works out of the box with observability platforms like [Grafana](https://grafana.com/). Grafana is an open-source observability platform that lets DevOps teams visualize, query, and alert on metrics, logs, and traces from diverse data sources in real time. It’s widely used for building dashboards to monitor infrastructure, applications, and system health. Grafana also offers a hosted version called [Grafana Cloud](https://grafana.com/products/cloud/). In this tutorial, we'll build a simple application and export its telemetry data to Grafana Cloud. We'll cover: - [Set up your chat app](#set-up-your-chat-app) - [Set up a Docker collector](#set-up-a-docker-collector) - [Generating telemetry data](#generating-telemetry-data) - [Viewing telemetry data](#viewing-telemetry-data) You can find the complete source code for this tutorial [on GitHub](https://github.com/denoland/examples/tree/main/with-grafana). ## Set up your chat app For this tutorial, we'll use a simple chat application to demonstrate how to export telemetry data. You can find the [code for the app on GitHub](https://github.com/denoland/examples/tree/main/with-grafana). Either take a copy of that repository or create a [main.ts](https://github.com/denoland/examples/blob/main/with-grafana/main.ts) file and a [.env](https://github.com/denoland/examples/blob/main/with-grafana/.env.example) file. In order to run the app you will need an OpenAI API key. You can get one by signing up for an account at [OpenAI](https://platform.openai.com/signup) and creating a new secret key. You can find your API key in the [API keys section](https://platform.openai.com/account/api-keys) of your OpenAI account.
Once you have an API key, set up an `OPENAI_API_KEY` environment variable in your `.env` file: ```env title=".env" OPENAI_API_KEY=your_openai_api_key ``` ## Set up a Docker collector Next, we'll set up a Docker container to run the OpenTelemetry collector. The collector is responsible for receiving telemetry data from your application and exporting it to Grafana Cloud. In the same directory as your `main.ts` file, create a `Dockerfile` and an `otel-collector.yml` file. The `Dockerfile` will be used to build a Docker image: ```dockerfile title="Dockerfile" FROM otel/opentelemetry-collector-contrib:latest COPY otel-collector.yml /otel-config.yml CMD ["--config", "/otel-config.yml"] ``` [`FROM otel/opentelemetry-collector-contrib:latest`](https://hub.docker.com/r/otel/opentelemetry-collector-contrib/) - This line specifies the base image for the container. It uses the official OpenTelemetry Collector Contrib image, which contains all receivers, exporters, processors, connectors, and other optional components, and pulls the latest version. `COPY otel-collector.yml /otel-config.yml` - This instruction copies our configuration file named `otel-collector.yml` from the local build context into the container. The file is renamed to `/otel-config.yml` inside the container. `CMD ["--config", "/otel-config.yml"]` - This sets the default command that will run when the container starts. It tells the OpenTelemetry Collector to use the configuration file we copied in the previous step. Next, let's set up a Grafana Cloud account and grab some info. If you have not already, [create a free Grafana Cloud account](https://grafana.com/auth/sign-up/create-user). Once created, you will receive a Grafana Cloud stack. Click "Details". ![Click details on your Grafana Cloud stack](./images/how-to/grafana/grafana-1.png) Next, find "OpenTelemetry" and click "Configure".
![Find and configure OpenTelemetry](./images/how-to/grafana/grafana-2.png) This page will provide you with all the details you'll need to configure your OpenTelemetry collector. Make note of your **OTLP Endpoint**, **Instance ID**, and **Password / API Token** (you will have to generate one). ![Configuring OTel in Grafana Cloud](./images/how-to/grafana/grafana-3.png) Next, add the following to your `otel-collector.yml` file to define how telemetry data should be collected and exported to Grafana Cloud: ```yml title="otel-collector.yml" receivers: otlp: protocols: grpc: endpoint: 0.0.0.0:4317 http: endpoint: 0.0.0.0:4318 exporters: otlphttp/grafana_cloud: endpoint: $_YOUR_GRAFANA_OTLP_ENDPOINT auth: authenticator: basicauth/grafana_cloud extensions: basicauth/grafana_cloud: client_auth: username: $_YOUR_INSTANCE_ID password: $_YOUR_API_TOKEN processors: batch: service: extensions: [basicauth/grafana_cloud] pipelines: traces: receivers: [otlp] processors: [batch] exporters: [otlphttp/grafana_cloud] metrics: receivers: [otlp] processors: [batch] exporters: [otlphttp/grafana_cloud] logs: receivers: [otlp] processors: [batch] exporters: [otlphttp/grafana_cloud] ``` The `receivers` section configures how the collector receives data. It sets up an OTLP (OpenTelemetry Protocol) receiver that listens on two protocols, `gRPC` and `HTTP`; the `0.0.0.0` address means it will accept data from any source. The `exporters` section defines where the collected data should be sent. Be sure to include **the OTLP endpoint** provided by your Grafana Cloud instance. The `extensions` section defines the authentication for OTel to export data to Grafana Cloud. Be sure to include your Grafana Cloud **Instance ID**, as well as your generated **Password / API Token**. The `processors` section defines how the data should be processed before export. Here it uses the `batch` processor with its default settings to group telemetry data into batches before sending.
The `service` section ties everything together by defining three pipelines. Each pipeline is responsible for a different type of telemetry data. The logs pipeline collects application logs. The traces pipeline is for distributed tracing data. The metrics pipeline is for performance metrics. Build and run the Docker container to start collecting your telemetry data with the following command: ```sh docker build -t otel-collector . && docker run -p 4317:4317 -p 4318:4318 otel-collector ``` ## Generating telemetry data Now that we have the app and the Docker container set up, we can start generating telemetry data. Run your application with these environment variables to send data to the collector: ```sh OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318 \ OTEL_SERVICE_NAME=chat-app \ OTEL_DENO=true \ deno run --unstable-otel --allow-net --allow-env --env-file --allow-read main.ts ``` This command: - Points the OpenTelemetry exporter to your local collector (`localhost:4318`) - Names your service "chat-app" in Grafana Cloud - Enables Deno's OpenTelemetry integration - Runs your application with the necessary permissions To generate some telemetry data, make a few requests to your running application in your browser at [`http://localhost:8000`](http://localhost:8000). Each request will: 1. Generate traces as it flows through your application 2. Send logs from your application's console output 3. Create metrics about the request performance 4. Forward all this data through the collector to Grafana Cloud ## Viewing telemetry data After making some requests to your application, you'll see three types of data in your Grafana Cloud dashboard: 1. **Traces** - End-to-end request flows through your system 2. **Logs** - Console output and structured log data 3.
**Metrics** - Performance and resource utilization data ![Viewing logs in Grafana](./images/how-to/grafana/grafana-logs.png) You can drill down into individual spans to debug performance issues: ![Viewing traces in Grafana](./images/how-to/grafana/grafana-traces.png) 🦕 Now that you have telemetry export working, you could: 1. Add custom spans and attributes to better understand your application 2. Set up alerts based on latency or error conditions 3. Deploy your application and collector to production using platforms like: - [Fly.io](https://docs.deno.com/examples/deploying_deno_with_docker/) - [Digital Ocean](https://docs.deno.com/examples/digital_ocean_tutorial/) - [AWS Lightsail](https://docs.deno.com/examples/aws_lightsail_tutorial/) For more details on OpenTelemetry configuration, check out the [Grafana Cloud documentation](https://grafana.com/docs/grafana-cloud/monitor-applications/application-observability/collector/). --- # Executable scripts > Guide to creating executable scripts with Deno. Learn about hashbangs, file permissions, cross-platform compatibility, and how to create command-line tools that can run directly from the terminal. URL: https://docs.deno.com/examples/tutorials/hashbang Making Deno scripts executable can come in handy when creating small tools or utilities for tasks like file manipulation, data processing or repetitive tasks that you might want to run from the command line. Executable scripts allow you to create ad-hoc solutions without setting up an entire project. ## Creating an example script To make a script executable, start the script with a hashbang (sometimes called a shebang). This is a sequence of characters (`#!`) that tells your operating system how to execute a script. It is followed by the path to the interpreter that should be used to run the script. :::note To use a hashbang on Windows you will need to install the Windows Subsystem for Linux (WSL) or use a Unix-like shell like [Git Bash](https://git-scm.com/downloads).
::: We'll make a simple script that prints the Deno installation path using the [Deno.env](/api/deno/~/Deno.env) API. Create a file named `hashbang.ts` with the following content: ```ts title="hashbang.ts" #!/usr/bin/env -S deno run --allow-env const path = Deno.env.get("DENO_INSTALL"); console.log("Deno Install Path:", path); ``` This script tells the system to use the deno runtime to run the script. The `-S` flag tells `env` to split the string that follows into separate arguments, so `deno run --allow-env` is executed as a command with its flags rather than treated as a single program name. The script then retrieves the value associated with the environment variable named `DENO_INSTALL` with `Deno.env.get()` and assigns it to a variable called `path`. Finally, it prints the path to the console using `console.log()`. ### Execute the script In order to execute the script, you may need to give the script execution permissions. You can do so using the `chmod` command with the `+x` flag (for execute): ```sh chmod +x hashbang.ts ``` You can execute the script directly in the command line with: ```sh ./hashbang.ts ``` ## Using hashbang in files with no extension For brevity, you may wish to omit the extension for your script's filename. In this case, supply one using the `--ext` flag in the script itself, then you can run the script with just the file name: ```shell title="my_script" $ cat my_script #!/usr/bin/env -S deno run --allow-env --ext=js console.log("Hello!"); $ ./my_script Hello! ``` 🦕 Now you can directly execute Deno scripts from the command line! Remember to set the execute permission (`chmod +x`) for your script file, and you’re all set to build anything from simple utilities to complex tools. Check out the [Deno examples](/examples/) for inspiration on what you can script. --- # How to export telemetry data to Honeycomb > Complete guide to exporting telemetry data with OpenTelemetry and Honeycomb.io. Learn how to configure collectors, visualize traces, and monitor application performance.
URL: https://docs.deno.com/examples/tutorials/honeycomb [OpenTelemetry](https://opentelemetry.io/) (often abbreviated as OTel) is an open-source observability framework that provides a standardized way to collect and export telemetry data such as traces, metrics and logs. Deno has built-in support for OpenTelemetry, making it easy to instrument your applications without adding external dependencies. This integration works out of the box with observability platforms like [Honeycomb](https://honeycomb.io). Honeycomb is an observability platform designed for debugging and understanding complex, modern distributed systems. In this tutorial, we'll build a simple application and export its telemetry data to Honeycomb. We'll cover: - [Set up your chat app](#set-up-your-chat-app) - [Set up a Docker collector](#set-up-a-docker-collector) - [Generating telemetry data](#generating-telemetry-data) - [Viewing telemetry data](#viewing-telemetry-data) You can find the complete source code for this tutorial [on GitHub](https://github.com/denoland/examples/tree/main/with-honeycomb). ## Set up your chat app For this tutorial, we'll use a simple chat application to demonstrate how to export telemetry data. You can find the [code for the app on GitHub](https://github.com/denoland/examples/tree/main/with-honeycomb). Either take a copy of that repository or create a [main.ts](https://github.com/denoland/examples/blob/main/with-honeycomb/main.ts) file and a [.env](https://github.com/denoland/examples/blob/main/with-honeycomb/.env.example) file. In order to run the app you will need an OpenAI API key. You can get one by signing up for an account at [OpenAI](https://platform.openai.com/signup) and creating a new secret key. You can find your API key in the [API keys section](https://platform.openai.com/account/api-keys) of your OpenAI account. 
Once you have an API key, set up an `OPENAI_API_KEY` environment variable in your `.env` file: ```env title=".env" OPENAI_API_KEY=your_openai_api_key ``` ## Set up a Docker collector Next, we'll set up a Docker container to run the OpenTelemetry collector. The collector is responsible for receiving telemetry data from your application and exporting it to Honeycomb. If you have not already, create a free Honeycomb account and set up an [ingest API key](https://docs.honeycomb.io/configure/environments/manage-api-keys/). In the same directory as your `main.ts` file, create a `Dockerfile` and an `otel-collector.yml` file. The `Dockerfile` will be used to build a Docker image: ```dockerfile title="Dockerfile" FROM otel/opentelemetry-collector:latest COPY otel-collector.yml /otel-config.yml CMD ["--config", "/otel-config.yml"] ``` `FROM otel/opentelemetry-collector:latest` - This line specifies the base image for the container. It uses the official OpenTelemetry Collector image and pulls the latest version. `COPY otel-collector.yml /otel-config.yml` - This instruction copies our configuration file named `otel-collector.yml` from the local build context into the container. The file is renamed to `/otel-config.yml` inside the container. `CMD ["--config", "/otel-config.yml"]` - This sets the default command that will run when the container starts. It tells the OpenTelemetry Collector to use the configuration file we copied in the previous step.
Next, add the following to your `otel-collector.yml` file to define how telemetry data should be collected and exported to Honeycomb: ```yml title="otel-collector.yml" receivers: otlp: protocols: grpc: endpoint: 0.0.0.0:4317 http: endpoint: 0.0.0.0:4318 exporters: otlp: endpoint: "api.honeycomb.io:443" headers: x-honeycomb-team: $_HONEYCOMB_API_KEY processors: batch: timeout: 5s send_batch_size: 5000 service: pipelines: logs: receivers: [otlp] processors: [batch] exporters: [otlp] traces: receivers: [otlp] processors: [batch] exporters: [otlp] metrics: receivers: [otlp] processors: [batch] exporters: [otlp] ``` The `receivers` section configures how the collector receives data. It sets up an OTLP (OpenTelemetry Protocol) receiver that listens on two protocols, `gRPC` and `HTTP`; the `0.0.0.0` address means it will accept data from any source. The `exporters` section defines where the collected data should be sent. It's configured to send data to Honeycomb's API endpoint at `api.honeycomb.io:443`. The configuration requires an API key for authentication; swap `$_HONEYCOMB_API_KEY` for your actual Honeycomb API key. The `processors` section defines how the data should be processed before export. It uses batch processing with a timeout of 5 seconds and a maximum batch size of 5000 items. The `service` section ties everything together by defining three pipelines. Each pipeline is responsible for a different type of telemetry data. The logs pipeline collects application logs. The traces pipeline is for distributed tracing data. The metrics pipeline is for performance metrics. Build and run the Docker container to start collecting your telemetry data with the following command: ```sh docker build -t otel-collector . && docker run -p 4317:4317 -p 4318:4318 otel-collector ``` ## Generating telemetry data Now that we have the app and the Docker container set up, we can start generating telemetry data.
Run your application with these environment variables to send data to the collector: ```sh OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318 \ OTEL_SERVICE_NAME=chat-app \ OTEL_DENO=true \ deno run --unstable-otel --allow-net --allow-env --env-file --allow-read main.ts ``` This command: - Points the OpenTelemetry exporter to your local collector (`localhost:4318`) - Names your service "chat-app" in Honeycomb - Enables Deno's OpenTelemetry integration - Runs your application with the necessary permissions To generate some telemetry data, make a few requests to your running application in your browser at [`http://localhost:8000`](http://localhost:8000). Each request will: 1. Generate traces as it flows through your application 2. Send logs from your application's console output 3. Create metrics about the request performance 4. Forward all this data through the collector to Honeycomb ## Viewing telemetry data After making some requests to your application, you'll see three types of data in your Honeycomb.io dashboard: 1. **Traces** - End-to-end request flows through your system 2. **Logs** - Console output and structured log data 3. **Metrics** - Performance and resource utilization data ![Viewing traces in Honeycomb](./images/how-to/honeycomb/honeycomb-3.webp) You can drill down into individual spans to debug performance issues: ![Viewing expanded traces in Honeycomb](./images/how-to/honeycomb/honeycomb-4.webp) 🦕 Now that you have telemetry export working, you could: 1. Add custom spans and attributes to better understand your application 2. Set up alerts based on latency or error conditions 3. 
Deploy your application and collector to production using platforms like: - [Fly.io](https://docs.deno.com/examples/deploying_deno_with_docker/) - [Digital Ocean](https://docs.deno.com/examples/digital_ocean_tutorial/) - [AWS Lightsail](https://docs.deno.com/examples/aws_lightsail_tutorial/) For more details on OpenTelemetry configuration, check out the [Honeycomb documentation](https://docs.honeycomb.io/send-data/opentelemetry/collector/). --- # How to export telemetry data to HyperDX > Complete guide to exporting telemetry data with OpenTelemetry and HyperDX. Learn how to configure collectors, visualize traces, logs, metrics, and debug distributed applications effectively. URL: https://docs.deno.com/examples/tutorials/hyperdx [HyperDX](https://hyperdx.io) is an open source observability platform that unifies logs, traces, metrics, exceptions, and session replays into a single interface. It helps developers debug applications faster by providing a complete view of your system's behavior and performance. [OpenTelemetry](https://opentelemetry.io/) (often abbreviated as OTel) provides a standardized way to collect and export telemetry data. Deno includes built-in OpenTelemetry support, allowing you to instrument your applications without additional dependencies. This integration works seamlessly with platforms like HyperDX to collect and visualize telemetry data. In this tutorial, we'll build a simple application and export its telemetry data to HyperDX: - [Set up your chat app](#set-up-your-chat-app) - [Set up a Docker collector](#set-up-a-docker-collector) - [Generating telemetry data](#generating-telemetry-data) - [Viewing telemetry data](#viewing-telemetry-data) You can find the complete source code for this tutorial [on GitHub](https://github.com/denoland/examples/tree/main/with-hyperdx). ## Set up the app For this tutorial, we'll use a simple chat application to demonstrate how to export telemetry data. 
You can find the [code for the app on GitHub](https://github.com/denoland/examples/tree/main/with-hyperdx). Either take a copy of that repository or create a [main.ts](https://github.com/denoland/examples/blob/main/with-hyperdx/main.ts) file and a [.env](https://github.com/denoland/examples/blob/main/with-hyperdx/.env.example) file. In order to run the app you will need an OpenAI API key. You can get one by signing up for an account at [OpenAI](https://platform.openai.com/signup) and creating a new secret key. You can find your API key in the [API keys section](https://platform.openai.com/account/api-keys) of your OpenAI account. Once you have an API key, set up an `OPENAI_API_KEY` environment variable in your `.env` file: ```env title=".env" OPENAI_API_KEY=your_openai_api_key ``` ## Set up the collector First, create a free HyperDX account to get your API key. Then, we'll set up two files to configure the OpenTelemetry collector: 1. Create a `Dockerfile`: ```dockerfile title="Dockerfile" FROM otel/opentelemetry-collector:latest COPY otel-collector.yml /otel-config.yml CMD ["--config", "/otel-config.yml"] ``` This Dockerfile: - Uses the official OpenTelemetry Collector as the base image - Copies your configuration into the container - Sets up the collector to use your config when it starts 2. Create a file called `otel-collector.yml`: ```yml title="otel-collector.yml" receivers: otlp: protocols: grpc: endpoint: 0.0.0.0:4317 http: endpoint: 0.0.0.0:4318 exporters: otlphttp/hdx: endpoint: "https://in-otel.hyperdx.io" headers: authorization: $_HYPERDX_API_KEY compression: gzip processors: batch: service: pipelines: traces: receivers: [otlp] processors: [batch] exporters: [otlphttp/hdx] metrics: receivers: [otlp] processors: [batch] exporters: [otlphttp/hdx] logs: receivers: [otlp] processors: [batch] exporters: [otlphttp/hdx] ``` This configuration file sets up the OpenTelemetry collector to receive telemetry data from your application and export it to HyperDX.
It includes: - The receivers section accepts data via gRPC (4317) and HTTP (4318) - The exporters section sends data to HyperDX with compression and authentication - The processors section batches telemetry data for efficient transmission - The pipelines section defines separate flows for logs, traces, and metrics Build and run the Docker container to start collecting your telemetry data with the following command: ```sh docker build -t otel-collector . && docker run -p 4317:4317 -p 4318:4318 otel-collector ``` ## Generating telemetry data Now that we have the app and the Docker container set up, we can start generating telemetry data. Run your application with these environment variables to send data to the collector: ```sh OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318 \ OTEL_SERVICE_NAME=chat-app \ OTEL_DENO=true \ deno run --unstable-otel --allow-net --allow-env --env-file --allow-read main.ts ``` This command: - Points the OpenTelemetry exporter to your local collector (`localhost:4318`) - Names your service "chat-app" in HyperDX - Enables Deno's OpenTelemetry integration - Runs your application with the necessary permissions To generate some telemetry data, make a few requests to your running application in your browser at [`http://localhost:8000`](http://localhost:8000). Each request will: 1. Generate traces as it flows through your application 2. Send logs from your application's console output 3. Create metrics about the request performance 4.
Forward all this data through the collector to HyperDX ## Viewing telemetry data In your HyperDX dashboard, you'll see different views of your telemetry data: ### Logs View ![Viewing logs in HyperDX](./images/how-to/hyperdx/hyperdx-1.webp) Click any log to see details: ![Viewing a single log in HyperDX](./images/how-to/hyperdx/hyperdx-2.webp) ### Request Traces See all logs within a single request: ![Viewing all logs in a request in HyperDX](./images/how-to/hyperdx/hyperdx-3.webp) ### Metrics Dashboard Monitor system performance: ![Viewing metrics in HyperDX](./images/how-to/hyperdx/hyperdx-4.webp) 🦕 Now that you have telemetry export working, you could: 1. Add custom spans and attributes to better understand your application 2. Set up alerts based on latency or error conditions 3. Deploy your application and collector to production using platforms like: - [Fly.io](https://docs.deno.com/examples/deploying_deno_with_docker/) - [Digital Ocean](https://docs.deno.com/examples/digital_ocean_tutorial/) - [AWS Lightsail](https://docs.deno.com/examples/aws_lightsail_tutorial/) For more details on OpenTelemetry configuration with HyperDX, see their [documentation](https://www.hyperdx.io/docs/install/opentelemetry). --- # Initialize a project > Guide to creating and structuring new Deno projects. Learn about starting a new project, task configuration, dependency management, and best practices for growing applications. URL: https://docs.deno.com/examples/tutorials/initialize_project While it is possible to run scripts directly with `deno run`, for larger projects it is recommended to create a sensible directory structure. This way you can organize your code, manage dependencies, script tasks and run tests more easily. Initialize a new project by running the following command: ```sh deno init my_project ``` Where `my_project` is the name of your project. You can [read more about the project structure](/runtime/getting_started/first_project/).
### Run your project

Navigate to the project directory:

```sh
cd my_project
```

Then you can run the project directly using the `deno task` command:

```sh
deno task dev
```

Take a look in the `deno.json` file in your new project. You should see a `dev` task in the "tasks" field.

```json title="deno.json"
"tasks": {
  "dev": "deno run --watch main.ts"
},
```

The `dev` task is a common task that runs the project in development mode. As you can see, it runs the `main.ts` file with the `--watch` flag, which will automatically reload the script when changes are made. You can see this in action if you open the `main.ts` file and make a change.

### Run the tests

In the project directory run:

```sh
deno test
```

This will execute all the tests in the project. You can read more about [testing in Deno](/runtime/fundamentals/testing/) and we'll cover tests in a little more depth in a later tutorial. At the moment you have one test file, `main_test.ts`, which tests the `add` function in `main.ts`.

### Adding to your project

The `main.ts` file serves as the entry point for your application. It’s where you’ll write your main program logic. When developing your project you will start by removing the default addition program and replacing it with your own code. For example, if you’re building a web server, this is where you’d set up your routes and handle requests.

Beyond the initial files, you’ll likely create additional modules (files) to organize your code. Consider grouping related functionality into separate files. Remember that Deno [supports ES modules](/runtime/fundamentals/modules/), so you can use import and export statements to structure your code.
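As a quick sketch of that import/export flow (the `greet` helper and the `utils/greet.ts` path are illustrative, not files generated by `deno init`):

```ts
// utils/greet.ts (hypothetical module)
// A sibling file such as main.ts would load it with:
//   import { greet } from "./utils/greet.ts";
export function greet(name: string): string {
  return `Hello, ${name}!`;
}

// Using the exported function directly:
console.log(greet("Deno")); // prints "Hello, Deno!"
```

Splitting helpers out like this keeps `main.ts` focused on wiring the application together, while each module can be tested on its own.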
Example folder structure for a Deno project:

```sh
my_project/
├── deno.json
├── main.ts
├── main_test.ts
├── routes/
│   ├── home.ts
│   ├── about.ts
├── services/
│   ├── user.ts
│   ├── post.ts
└── utils/
    ├── logger.ts
    ├── logger_test.ts
    ├── validator_test.ts
    └── validator.ts
```

This kind of structure keeps your project clean and makes it easier to find and manage files.

🦕 Congratulations! Now you know how to create a brand new project with `deno init`. Remember that Deno encourages simplicity and avoids complex build tools. Keep your project modular, testable, and organized. As your project grows, adapt the structure to fit your needs. And most importantly, have fun exploring Deno’s capabilities!

---

# How to deploy Deno on Kinsta

> Step-by-step guide to deploying Deno applications on Kinsta. Learn how to configure package.json, handle environment variables, set up Git deployments, and use Kinsta's application hosting platform.

URL: https://docs.deno.com/examples/tutorials/kinsta

[Kinsta Application Hosting](https://kinsta.com/application-hosting) is a service that lets you build and deploy your web apps directly from your Git repository.

## Preparing your application

At **Kinsta**, we recommend using the [`deno-bin`](https://www.npmjs.com/package/deno-bin) package to run Deno applications.

To do so, your `package.json` should look like this:

```json title="package.json"
{
  "name": "deno-app",
  "scripts": {
    "start": "deno run --allow-net index.js --port=${PORT}"
  },
  "devDependencies": {
    "deno-bin": "^1.28.2"
  }
}
```

## Example application

```js
import { parseArgs } from "jsr:@std/cli";

const { args } = Deno;
const port = parseArgs(args).port ? Number(parseArgs(args).port) : 8000;

Deno.serve({ port }, (_req) => new Response("Hello, world"));
```

The application itself is self-explanatory. It's crucial not to hardcode the `PORT` but use the environment variable **Kinsta** provides.
There is also a [repository](https://github.com/kinsta/hello-world-deno) that should help you get started.

## Deployment

1. Register on [Kinsta Application Hosting](https://kinsta.com/signup/?product_type=app-db) or login directly to the [My Kinsta](https://my.kinsta.com/) admin panel.
2. Go to the Applications tab.
3. Connect your GitHub repository.
4. Press the **Add service > Application** button.
5. Follow the wizard steps.

---

# Testing in isolation with mocks

> Master the art of mocking in your unit tests. Learn how spies, stubs, fake time, and other Deno tools let you improve your code and confidence

URL: https://docs.deno.com/examples/tutorials/mocking

This guide builds on the [basics of testing in Deno](/examples/testing_tutorial/) to focus specifically on mocking techniques that help you isolate your code during testing.

For effective unit testing, you'll often need to "mock" the data that your code interacts with. Mocking is a technique used in testing where you replace real data with simulated versions that you can control. This is particularly useful when testing components that interact with external services, such as APIs or databases.

Deno provides [helpful mocking utilities](https://jsr.io/@std/testing/doc/mock) through the Deno Standard Library, making your tests easier to write, more reliable and faster.

### Spying

In Deno, you can [`spy`](https://jsr.io/@std/testing/doc/mock#spying) on a function to track how it's called during test execution. Spies don't change how a function behaves, but they record important details like how many times the function was called and what arguments were passed to it.

By using spies, you can verify that your code interacts correctly with its dependencies without setting up complex infrastructure.
In the following example we will test a function called `saveUser()`, which takes a user object and a database object and calls the database's `save` method:

```ts
import { assertEquals } from "jsr:@std/assert";
import { assertSpyCalls, spy } from "jsr:@std/testing/mock";

// Define types for better code quality
interface User {
  name: string;
}

interface Database {
  save: (user: User) => Promise<User & { id: number }>;
}

// Function to test
function saveUser(
  user: User,
  database: Database,
): Promise<User & { id: number }> {
  return database.save(user);
}

// Test with a mock
Deno.test("saveUser calls database.save", async () => {
  // Create a mock database with a spy on the save method
  const mockDatabase = {
    save: spy((user: User) => Promise.resolve({ id: 1, ...user })),
  };

  const user: User = { name: "Test User" };
  const result = await saveUser(user, mockDatabase);

  // Verify the mock was called correctly
  assertSpyCalls(mockDatabase.save, 1);
  assertEquals(mockDatabase.save.calls[0].args[0], user);
  assertEquals(result, { id: 1, name: "Test User" });
});
```

We import the necessary functions from the Deno Standard Library to assert equality and to create and verify spy functions. The mock database is a stand-in for a real database object, with a `save` method that is wrapped in a `spy`. The spy function tracks calls to the method, records arguments passed to it and executes the underlying implementation (in this case returning a promise with the `user` and an `id`).

The test calls `saveUser()` with the mock data and we use assertions to verify that:

1. The save method was called exactly once
2. The first argument of the call was the `user` object we passed in
3. The result contains both the original user data and the added ID

We were able to test the `saveUser` operation without setting up or tearing down any complex database state.

### Clearing spies

When working with multiple tests that use spies, it's important to reset or clear spies between tests to avoid interference.
The Deno testing library provides a simple way to restore a method spy to its original state using the `restore()` method. (Only method spies can be restored; a spy wrapped around a bare function has no original method to put back.) Here's how to clear a spy after you're done with it:

```ts
import { assertEquals } from "jsr:@std/assert";
import { assertSpyCalls, spy } from "jsr:@std/testing/mock";

Deno.test("spy cleanup example", () => {
  const mathHelpers = {
    double: (x: number) => x * 2,
  };

  // Create a spy on the method
  const doubleSpy = spy(mathHelpers, "double");

  try {
    // Use the spy
    const result = mathHelpers.double(5);
    assertEquals(result, 10);
    assertSpyCalls(doubleSpy, 1);
  } finally {
    // Always clean up spies so later tests see the original method
    doubleSpy.restore();
  }
});
```

Method spies are disposable: they can automatically restore themselves with the `using` keyword. This approach means that you do not need to wrap your assertions in a try statement to ensure you restore the methods before the tests finish.

```ts
import { assertEquals } from "jsr:@std/assert";
import { assertSpyCalls, spy } from "jsr:@std/testing/mock";

Deno.test("using disposable spies", () => {
  const calculator = {
    add: (a: number, b: number) => a + b,
    multiply: (a: number, b: number) => a * b,
  };

  // The spy will automatically be restored when it goes out of scope
  using addSpy = spy(calculator, "add");

  // Use the spy
  const sum = calculator.add(3, 4);
  assertEquals(sum, 7);
  assertSpyCalls(addSpy, 1);
  assertEquals(addSpy.calls[0].args, [3, 4]);

  // No need for try/finally blocks - the spy will be restored automatically
});

Deno.test("using multiple disposable spies", () => {
  const calculator = {
    add: (a: number, b: number) => a + b,
    multiply: (a: number, b: number) => a * b,
  };

  // Both spies will automatically be restored
  using addSpy = spy(calculator, "add");
  using multiplySpy = spy(calculator, "multiply");

  calculator.add(5, 3);
  calculator.multiply(4, 2);

  assertSpyCalls(addSpy, 1);
  assertSpyCalls(multiplySpy, 1);

  // No cleanup code needed
});
```

For cases where you have multiple spies, or where the `using` keyword is not available,
you can track them in an array and restore them all at once:

```ts
import { spy } from "jsr:@std/testing/mock";

Deno.test("multiple spies cleanup", () => {
  const spies = [];

  // Create spies on object methods
  const objectA = {
    increment: (x: number) => x + 1,
  };
  const spyA = spy(objectA, "increment");
  spies.push(spyA);

  const objectB = {
    method: (x: number) => x * 2,
  };
  const spyB = spy(objectB, "method");
  spies.push(spyB);

  try {
    // Test code using the spies
    // ...
  } finally {
    // Restore all spies
    spies.forEach((spyFn) => spyFn.restore());
  }
});
```

By properly cleaning up spies, you ensure that each test starts with a clean state and avoid side effects between tests.

### Stubbing

While spies track method calls without changing behavior, stubs replace the original implementation entirely. [Stubbing](https://jsr.io/@std/testing/doc/mock#stubbing) is a form of mocking where you temporarily replace a function or method with a controlled implementation. This allows you to simulate specific conditions or behaviors and return predetermined values. It can also be used when you need to override environment-dependent functionality.
In Deno, you can create stubs using the `stub` function from the standard testing library:

```ts
import { assertEquals } from "jsr:@std/assert";
import { stub } from "jsr:@std/testing/mock";

// Define types for better code quality
interface User {
  name: string;
  role: string;
}

// Object holding the method we will stub.
// The implementation might involve database calls.
const userService = {
  getCurrentUser(_userId: string): User {
    return { name: "Real User", role: "admin" };
  },
};

// Function we want to test
function hasAdminAccess(userId: string): boolean {
  const user = userService.getCurrentUser(userId);
  return user.role === "admin";
}

Deno.test("hasAdminAccess with stubbed user", () => {
  // Create a stub that replaces getCurrentUser
  const getUserStub = stub(
    userService,
    "getCurrentUser",
    // Return a test user with non-admin role
    () => ({ name: "Test User", role: "guest" }),
  );

  try {
    // Test with the stubbed method
    const result = hasAdminAccess("user123");
    assertEquals(result, false);
  } finally {
    // Always restore the original method
    getUserStub.restore();
  }

  // You can also change the stub's behavior during the test
  const adminStub = stub(
    userService,
    "getCurrentUser",
    () => ({ name: "Admin User", role: "admin" }),
  );

  try {
    const adminResult = hasAdminAccess("admin456");
    assertEquals(adminResult, true);
  } finally {
    adminStub.restore();
  }
});
```

Here we import the necessary functions from the Deno Standard Library, then we set up the function we're going to stub. In a real application this might connect to a database, make an API call, or perform other operations that we may want to avoid during testing. Because `stub` replaces a method on an object, `getCurrentUser` is defined on a `userService` object rather than as a free-standing function. We set up the function under test, in this case the `hasAdminAccess()` function.
We want to test whether it:

- Calls the `getCurrentUser()` function to get a user object
- Checks if the user's role is "admin"
- Returns a boolean indicating whether the user has admin access

Next we create a test named `hasAdminAccess with stubbed user` and set up a stub for the `getCurrentUser` function. This will replace the real implementation with one that returns a user with a `guest` role.

With the stub in place, the test calls `hasAdminAccess` with a user ID. Even though the real function would return a user with an `admin` role, our stub returns `guest`, so we can assert that `hasAdminAccess` returns `false` (since our stub returns a non-admin user).

We can change the stub behavior to return `admin` instead and assert that the function now returns `true`. At the end we use a `finally` block to ensure the original function is restored so that we don't accidentally affect other tests.

### Stubbing environment variables

For deterministic testing, you often need to control environment variables.
Deno's Standard Library provides utilities to achieve this: ```ts import { assertEquals } from "jsr:@std/assert"; import { stub } from "jsr:@std/testing/mock"; // Function that depends on environment variables and time function generateReport() { const environment = Deno.env.get("ENVIRONMENT") || "development"; const timestamp = new Date().toISOString(); return { environment, generatedAt: timestamp, data: {/* report data */}, }; } Deno.test("report generation with controlled environment", () => { // Stub environment const originalEnv = Deno.env.get; const envStub = stub(Deno.env, "get", (key: string) => { if (key === "ENVIRONMENT") return "production"; return originalEnv.call(Deno.env, key); }); // Stub time const dateStub = stub( Date.prototype, "toISOString", () => "2023-06-15T12:00:00Z", ); try { const report = generateReport(); // Verify results with controlled values assertEquals(report.environment, "production"); assertEquals(report.generatedAt, "2023-06-15T12:00:00Z"); } finally { // Always restore stubs to prevent affecting other tests envStub.restore(); dateStub.restore(); } }); ``` ### Faking time Time-dependent code can be challenging to test because it may produce different results based on when the test runs. Deno provides a [`FakeTime`](https://jsr.io/@std/testing/doc/time) utility that allows you to simulate the passage of time and control date-related functions during tests. 
The example below demonstrates how to test time-dependent functions: `isWeekend()`, which returns true if the current day is Saturday or Sunday, and `delayedGreeting()` which calls a callback after a 1-second delay:

```ts
import { assertEquals } from "jsr:@std/assert";
import { FakeTime } from "jsr:@std/testing/time";

// Function that depends on the current time
function isWeekend(): boolean {
  const date = new Date();
  const day = date.getDay();
  return day === 0 || day === 6; // 0 is Sunday, 6 is Saturday
}

// Function that works with timeouts
function delayedGreeting(callback: (message: string) => void): void {
  setTimeout(() => {
    callback("Hello after delay");
  }, 1000); // 1 second delay
}

Deno.test("time-dependent tests", () => {
  // Create a fake time starting at a specific date (a Monday).
  // The `using` keyword restores the real time automatically at
  // the end of the scope.
  using time = new FakeTime(new Date("2023-05-01T12:00:00Z"));

  // Test with the mocked Monday
  assertEquals(isWeekend(), false);

  // Move time forward to Saturday
  time.tick(5 * 24 * 60 * 60 * 1000); // Advance 5 days
  assertEquals(isWeekend(), true);

  // Test async operations with timers
  let greeting = "";
  delayedGreeting((message) => {
    greeting = message;
  });

  // Advance time to trigger the timeout immediately
  time.tick(1000);
  assertEquals(greeting, "Hello after delay");
});
```

Here we set up a test which creates a controlled time environment with `new FakeTime(...)`, setting the starting date to May 1, 2023 (which was a Monday). The instance is a controller object that lets us manipulate time.

We run tests with the mocked Monday and will see that the `isWeekend` function returns `false`. Then we can advance time to Saturday and run the test again to verify that `isWeekend` returns `true`.

Constructing a `FakeTime` replaces JavaScript's timing functions (`Date`, `setTimeout`, `setInterval`, etc.) with versions you can control.
This allows you to test code with specific dates or times regardless of when the test runs. This powerful technique means you will avoid flaky tests that depend on the system clock and can speed up tests by advancing time instantly instead of waiting for real timeouts. Fake time is particularly useful for testing: - Calendar or date-based features, such as scheduling, appointments or expiration dates - Code with timeouts or intervals, such as polling, delayed operations or debouncing - Animations or transitions such as testing the completion of timed visual effects Like with stubs, always restore the real time functions after your tests using the `restore()` method to avoid affecting other tests. ## Advanced mocking patterns ### Partial mocking Sometimes you only want to mock certain methods of an object while keeping others intact: ```ts import { assertEquals } from "jsr:@std/assert"; import { stub } from "jsr:@std/testing/mock"; class UserService { async getUser(id: string) { // Complex database query return { id, name: "Database User" }; } async formatUser(user: { id: string; name: string }) { return { ...user, displayName: user.name.toUpperCase(), }; } async getUserFormatted(id: string) { const user = await this.getUser(id); return this.formatUser(user); } } Deno.test("partial mocking with stubs", async () => { const service = new UserService(); // Only mock the getUser method const getUserMock = stub( service, "getUser", () => Promise.resolve({ id: "test-id", name: "Mocked User" }), ); try { // The formatUser method will still use the real implementation const result = await service.getUserFormatted("test-id"); assertEquals(result, { id: "test-id", name: "Mocked User", displayName: "MOCKED USER", }); // Verify getUser was called with the right arguments assertEquals(getUserMock.calls.length, 1); assertEquals(getUserMock.calls[0].args[0], "test-id"); } finally { getUserMock.restore(); } }); ``` ### Mocking fetch requests Testing code that makes HTTP requests 
often requires mocking the `fetch` API:

```ts
import { assertEquals } from "jsr:@std/assert";
import { stub } from "jsr:@std/testing/mock";

// Function that uses fetch
async function fetchUserData(userId: string) {
  const response = await fetch(`https://api.example.com/users/${userId}`);
  if (!response.ok) {
    throw new Error(`Failed to fetch user: ${response.status}`);
  }
  return await response.json();
}

Deno.test("mocking fetch API", async () => {
  // Create a response that the mock fetch will return
  const mockResponse = new Response(
    JSON.stringify({ id: "123", name: "John Doe" }),
    { status: 200, headers: { "Content-Type": "application/json" } },
  );

  // Replace fetch with a stubbed version
  const fetchStub = stub(
    globalThis,
    "fetch",
    (_input: string | URL | Request, _init?: RequestInit) =>
      Promise.resolve(mockResponse),
  );

  try {
    const result = await fetchUserData("123");
    assertEquals(result, { id: "123", name: "John Doe" });
  } finally {
    // Restore original fetch
    fetchStub.restore();
  }
});
```

## Real-world example

Let's put everything together in a more comprehensive example. We'll test a user authentication service that:

1. Validates user credentials
2. Calls an API to authenticate
3. Stores tokens with expiration times

In the example below, we'll create a full `AuthService` class that handles user login, token management, and authentication. We'll test it thoroughly using various mocking techniques covered earlier: stubbing fetch requests and methods, and manipulating time to test token expiration - all within organized test steps.

Deno's testing API provides a useful `t.step()` function that allows you to organize your tests into logical steps or sub-tests. This makes complex tests more readable and helps pinpoint exactly which part of a test is failing. Each step can have its own assertions and will be reported separately in the test output.
```ts
import { assertEquals, assertRejects } from "jsr:@std/assert";
import { spy, stub } from "jsr:@std/testing/mock";
import { FakeTime } from "jsr:@std/testing/time";

// The service we want to test
class AuthService {
  private token: string | null = null;
  private expiresAt: Date | null = null;

  async login(username: string, password: string): Promise<string> {
    // Validate inputs
    if (!username || !password) {
      throw new Error("Username and password are required");
    }

    // Call authentication API
    const response = await fetch("https://api.example.com/login", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ username, password }),
    });

    if (!response.ok) {
      throw new Error(`Authentication failed: ${response.status}`);
    }

    const data = await response.json();

    // Store token with expiration (1 hour)
    this.token = data.token;
    this.expiresAt = new Date(Date.now() + 60 * 60 * 1000);

    return this.token;
  }

  getToken(): string {
    if (!this.token || !this.expiresAt) {
      throw new Error("Not authenticated");
    }

    if (new Date() > this.expiresAt) {
      this.token = null;
      this.expiresAt = null;
      throw new Error("Token expired");
    }

    return this.token;
  }

  logout(): void {
    this.token = null;
    this.expiresAt = null;
  }
}

Deno.test("AuthService comprehensive test", async (t) => {
  await t.step("login should validate credentials", async () => {
    const authService = new AuthService();

    await assertRejects(
      () => authService.login("", "password"),
      Error,
      "Username and password are required",
    );
  });

  await t.step("login should handle API calls", async () => {
    const authService = new AuthService();

    // Mock successful response
    const mockResponse = new Response(
      JSON.stringify({ token: "fake-jwt-token" }),
      { status: 200, headers: { "Content-Type": "application/json" } },
    );

    const fetchStub = stub(
      globalThis,
      "fetch",
      (_url: string | URL | Request, options?: RequestInit) => {
        // Verify correct data is being sent
        const body = options?.body as string;
        const parsedBody = JSON.parse(body);
        assertEquals(parsedBody.username, "testuser");
        assertEquals(parsedBody.password, "password123");

        return Promise.resolve(mockResponse);
      },
    );

    try {
      const token = await authService.login("testuser", "password123");
      assertEquals(token, "fake-jwt-token");
    } finally {
      fetchStub.restore();
    }
  });

  await t.step("token expiration should work correctly", async () => {
    const authService = new AuthService();
    // `using` restores the real time automatically at the end of the step
    using time = new FakeTime(new Date("2023-01-01T12:00:00Z"));

    // Stub the login process to set the token directly
    const loginStub = stub(
      authService,
      "login",
      () => {
        (authService as any).token = "fake-token";
        (authService as any).expiresAt = new Date(
          Date.now() + 60 * 60 * 1000,
        );
        return Promise.resolve("fake-token");
      },
    );

    try {
      // Login and verify token
      await authService.login("user", "pass");
      assertEquals(authService.getToken(), "fake-token");

      // Advance time past expiration
      time.tick(61 * 60 * 1000);

      // Token should now be expired, so getToken should throw
      let expired = false;
      try {
        authService.getToken();
      } catch (error) {
        expired = (error as Error).message === "Token expired";
      }
      assertEquals(expired, true);
    } finally {
      loginStub.restore();
    }
  });
});
```

This code defines the `AuthService` class with three main functionalities:

- Login - Validates credentials, calls an API, and stores a token with an expiration time
- GetToken - Returns the token if valid and not expired
- Logout - Clears the token and expiration

The testing structure is organized as a single main test with three logical **steps**, each testing a different aspect of the service: credential validation, API call handling and token expiration.

🦕 Effective mocking is essential for writing reliable, maintainable unit tests. Deno provides several powerful tools to help you isolate your code during testing. By mastering these mocking techniques, you'll be able to write more reliable tests that run faster and don't depend on external services.
For more testing resources, check out:

- [Deno Testing API Documentation](/api/deno/testing)
- [Deno Standard Library Testing Modules](https://jsr.io/@std/testing)
- [Basic Testing in Deno](/examples/testing_tutorial/)

---

# Module metadata

> A guide to working with module metadata in Deno. Learn about import.meta properties, main module detection, file paths, URL resolution, and how to access module context information in your applications.

URL: https://docs.deno.com/examples/tutorials/module_metadata

## Concepts

- [import.meta](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/import.meta) can provide information on the context of the module.
- The boolean [import.meta.main](https://docs.deno.com/api/web/~/ImportMeta#property_main) will let you know if the current module is the program entry point.
- The string [import.meta.url](https://docs.deno.com/api/web/~/ImportMeta#property_url) will give you the URL of the current module.
- The string [import.meta.filename](https://docs.deno.com/api/web/~/ImportMeta#property_filename) will give you the fully resolved path to the current module. _For local modules only_.
- The string [import.meta.dirname](https://docs.deno.com/api/web/~/ImportMeta#property_dirname) will give you the fully resolved path to the directory containing the current module. _For local modules only_.
- The [import.meta.resolve](https://docs.deno.com/api/web/~/ImportMeta#property_resolve) function allows you to resolve a specifier relative to the current module. This function takes into account an import map (if one was provided on startup).
- The string [Deno.mainModule](https://docs.deno.com/api/deno/~/Deno.mainModule) will give you the URL of the main module entry point, i.e. the module invoked by the Deno runtime.

## Example

The example below uses two modules to show the difference between `import.meta.url`, `import.meta.main` and `Deno.mainModule`.
In this example, `module_a.ts` is the main module entry point:

```ts title="module_b.ts"
export function outputB() {
  console.log("Module B's import.meta.url", import.meta.url);
  console.log("Module B's mainModule url", Deno.mainModule);
  console.log(
    "Is module B the main module via import.meta.main?",
    import.meta.main,
  );
}
```

```ts title="module_a.ts"
import { outputB } from "./module_b.ts";

function outputA() {
  console.log("Module A's import.meta.url", import.meta.url);
  console.log("Module A's mainModule url", Deno.mainModule);
  console.log(
    "Is module A the main module via import.meta.main?",
    import.meta.main,
  );
  console.log(
    "Resolved specifier for ./module_b.ts",
    import.meta.resolve("./module_b.ts"),
  );
}

outputA();
console.log("");
outputB();
```

If `module_a.ts` is located in `/home/alice/deno` then the output of `deno run --allow-read module_a.ts` is:

```console
Module A's import.meta.url file:///home/alice/deno/module_a.ts
Module A's mainModule url file:///home/alice/deno/module_a.ts
Is module A the main module via import.meta.main? true
Resolved specifier for ./module_b.ts file:///home/alice/deno/module_b.ts

Module B's import.meta.url file:///home/alice/deno/module_b.ts
Module B's mainModule url file:///home/alice/deno/module_a.ts
Is module B the main module via import.meta.main? false
```

---

# How to use Mongoose with Deno

> Step-by-step guide to using Mongoose with Deno. Learn how to set up MongoDB connectivity, create schemas, implement data models, and perform CRUD operations using Mongoose's schema-based modeling.

URL: https://docs.deno.com/examples/tutorials/mongoose

[Mongoose](https://mongoosejs.com/) is a popular, schema-based library that models data for [MongoDB](https://www.mongodb.com/). It simplifies writing MongoDB validation, casting, and other relevant business logic.

This tutorial will show you how to set up Mongoose and MongoDB with your Deno project.
[View source](https://github.com/denoland/examples/tree/main/with-mongoose) or [check out the video guide](https://youtu.be/dmZ9Ih0CR9g).

## Creating a Mongoose Model

Let's create a simple app that connects to MongoDB, creates a `Dinosaur` model, and adds a dinosaur to the database and updates it.

First, we'll create the necessary files and directories:

```console
touch main.ts && mkdir model && touch model/Dinosaur.ts
```

In `/model/Dinosaur.ts`, we'll import `npm:mongoose`, define the schema, and export it:

```ts
import { model, Schema } from "npm:mongoose@^6.7";

// Define schema.
const dinosaurSchema = new Schema({
  name: { type: String, unique: true },
  description: String,
  createdAt: { type: Date, default: Date.now },
  updatedAt: { type: Date, default: Date.now },
});

// Validations
dinosaurSchema.path("name").required(true, "Dinosaur name cannot be blank.");
dinosaurSchema.path("description").required(
  true,
  "Dinosaur description cannot be blank.",
);

// Export model.
export default model("Dinosaur", dinosaurSchema);
```

## Connecting to MongoDB

Now, in our `main.ts` file, we'll import mongoose and the `Dinosaur` schema, and connect to MongoDB:

```ts
import mongoose from "npm:mongoose@^6.7";
import Dinosaur from "./model/Dinosaur.ts";

await mongoose.connect("mongodb://localhost:27017");

// Check to see connection status.
console.log(mongoose.connection.readyState);
```

Because Deno supports top-level `await`, we're able to simply `await mongoose.connect()`.

Running this, we should expect a log of `1`:

```shell
$ deno run --allow-read --allow-sys --allow-env --allow-net main.ts
1
```

It worked!

## Manipulating Data

Let's add an instance [method](https://mongoosejs.com/docs/guide.html#methods) to our `Dinosaur` schema in `/model/Dinosaur.ts`:

```ts
// ./model/Dinosaur.ts

// Methods.
dinosaurSchema.methods = {
  // Update description.
  updateDescription: async function (description: string) {
    this.description = description;
    return await this.save();
  },
};

// ...
```

This instance method, `updateDescription`, will allow you to update a record's description.

Back in `main.ts`, let's start adding and manipulating data in MongoDB.

```ts
// main.ts

// Create a new Dinosaur.
const deno = new Dinosaur({
  name: "Deno",
  description: "The fastest dinosaur ever lived.",
});

// Insert deno.
await deno.save();

// Find Deno by name.
const denoFromMongoDb = await Dinosaur.findOne({ name: "Deno" });
console.log(
  `Finding Deno in MongoDB -- \n  ${denoFromMongoDb.name}: ${denoFromMongoDb.description}`,
);

// Update description for Deno and save it.
await denoFromMongoDb.updateDescription(
  "The fastest and most secure dinosaur ever lived.",
);

// Check MongoDB to see Deno's updated description.
const newDenoFromMongoDb = await Dinosaur.findOne({ name: "Deno" });
console.log(
  `Finding Deno (again) -- \n  ${newDenoFromMongoDb.name}: ${newDenoFromMongoDb.description}`,
);
```

Running the code, we get:

```console
Finding Deno in MongoDB --
  Deno: The fastest dinosaur ever lived.
Finding Deno (again) --
  Deno: The fastest and most secure dinosaur ever lived.
```

Boom!

For more info on using Mongoose, please refer to [their documentation](https://mongoosejs.com/docs/guide.html).

---

# How to use MySQL2 with Deno

> Step-by-step guide to using MySQL2 with Deno. Learn how to set up database connections, execute queries, handle transactions, and build data-driven applications using MySQL's Node.js driver.

URL: https://docs.deno.com/examples/tutorials/mysql2

[MySQL](https://www.mysql.com/) is the most popular database in the [2022 Stack Overflow Developer Survey](https://survey.stackoverflow.co/2022/#most-popular-technologies-database) and counts Facebook, Twitter, YouTube, and Netflix among its users.

[View source here.](https://github.com/denoland/examples/tree/main/with-mysql2)

You can manipulate and query a MySQL database with Deno using the `mysql2` node package and importing via `npm:mysql2`.
This allows us to use its Promise wrapper and take advantage of top-level await:

```tsx
import mysql from "npm:mysql2@^2.3.3/promise";
```

## Connecting to MySQL

We can connect to our MySQL server using the `createConnection()` method. You need the host (`localhost` if you are testing, or more likely a cloud database endpoint in production) along with the user and password:

```tsx
const connection = await mysql.createConnection({
  host: "localhost",
  user: "root",
  password: "password",
});
```

You can also optionally specify a database during the connection creation. Here we are going to use `mysql2` to create the database on the fly.

## Creating and populating the database

Now that you have the connection running, you can use `connection.query()` with SQL commands to create databases and tables as well as insert the initial data.

First we want to create and select the database to use:

```tsx
await connection.query("CREATE DATABASE denos");
await connection.query("use denos");
```

Then we want to create the table:

```tsx
await connection.query(
  "CREATE TABLE `dinosaurs` ( `id` int NOT NULL AUTO_INCREMENT PRIMARY KEY, `name` varchar(255) NOT NULL, `description` varchar(255) )",
);
```

After the table is created we can populate the data:

```tsx
await connection.query(
  "INSERT INTO `dinosaurs` (id, name, description) VALUES (1, 'Aardonyx', 'An early stage in the evolution of sauropods.'), (2, 'Abelisaurus', 'Abels lizard has been reconstructed from a single skull.'), (3, 'Deno', 'The fastest dinosaur that ever lived.')",
);
```

We now have all the data ready to start querying.

## Querying MySQL

We can use the same `connection.query()` method to write our queries.
First, we'll get all the data in our `dinosaurs` table:

```tsx
const results = await connection.query("SELECT * FROM `dinosaurs`");
console.log(results);
```

The result from this query is all the data in our database:

```tsx
[
  [
    { id: 1, name: "Aardonyx", description: "An early stage in the evolution of sauropods." },
    { id: 2, name: "Abelisaurus", description: "Abels lizard has been reconstructed from a single skull." },
    { id: 3, name: "Deno", description: "The fastest dinosaur that ever lived." }
  ],
  // ...field metadata
]
```

If we want to get just a single element from the database, we can change our query:

```tsx
const [results, fields] = await connection.query(
  "SELECT * FROM `dinosaurs` WHERE `name` = 'Deno'",
);
console.log(results);
```

Which gives us a single row result:

```tsx
[{ id: 3, name: "Deno", description: "The fastest dinosaur that ever lived." }];
```

Finally, we can close the connection:

```tsx
await connection.end();
```

For more on `mysql2`, check out their documentation [here](https://github.com/sidorares/node-mysql2).

---

# Build a Next.js App

> Walkthrough guide to building a Next.js application with Deno. Learn how to set up a project, create API routes, implement server-side rendering, and build a full-stack TypeScript application.

URL: https://docs.deno.com/examples/tutorials/next

[Next.js](https://nextjs.org/) is a popular framework for building server-side-rendered applications. It is built on top of React and provides a lot of features out of the box.

In this tutorial, we'll build a simple Next.js application and run it with Deno. The app will display a list of dinosaurs. When you click on one, it'll take you to a dinosaur page with more details.

![demo of the app](./images/how-to/next/dinoapp.gif)

Start by verifying that you have the latest version of Deno installed; you will need at least Deno 1.46.0:

```sh
deno --version
```

## Create a Next.js app with Deno

Next provides a CLI tool to quickly scaffold a new Next.js app.
In your terminal run the following command to create a new Next.js app with Deno:

```sh
deno run -A npm:create-next-app@latest
```

When prompted, select the default options to create a new Next.js app with TypeScript. Then, `cd` into the newly created project folder and run the following command to install the dependencies:

```sh
deno install
```

Next.js has some dependencies that still rely on `Object.prototype.__proto__`, so you need to allow it. In a new `deno.json` file, add the following lines:

```json title="deno.json"
{
  "unstable": ["unsafe-proto"]
}
```

Now you can serve your new Next.js app:

```sh
deno task dev
```

This will start the Next.js server; click the output link to localhost to see your app in the browser.

## Add a backend

The next step is to add a backend API. We'll create a very simple API that returns information about dinosaurs.

We'll use Next.js's [built in API route handlers](https://nextjs.org/docs/app/building-your-application/routing/route-handlers) to set up our dinosaur API. Next.js uses a file-system-based router, where the folder structure directly defines the routes.

We'll define three routes. The first route at `/api` will return the string `welcome to the dinosaur API`, then we'll set up `/api/dinosaurs` to return all the dinosaurs, and finally `/api/dinosaurs/[dinosaur]` to return a specific dinosaur based on the name in the URL.

### /api/

In the `app` folder of your new project, create an `api` folder. In that folder, create a `route.ts` file, which will handle requests to `/api/`.

Copy and paste the following code into the `api/route.ts` file:

```ts title="route.ts"
export async function GET() {
  return Response.json("welcome to the dinosaur API");
}
```

This code defines a simple route handler that returns a JSON response with the string `welcome to the dinosaur API`.

### /api/dinosaurs

In the `api` folder, create a folder called `dinosaurs`. In that folder, make a `data.json` file, which will contain the hard coded dinosaur data.
Copy and paste [this json file](https://raw.githubusercontent.com/denoland/deno-vue-example/main/api/data.json) into the `data.json` file.

Create a `route.ts` file in the `dinosaurs` directory, which will handle requests to `/api/dinosaurs`. In this route we'll read the `data.json` file and return the dinosaurs as JSON:

```ts title="route.ts"
import data from "./data.json" with { type: "json" };

export async function GET() {
  return Response.json(data);
}
```

### /api/dinosaurs/[dinosaur]

And for the final route, `/api/dinosaurs/[dinosaur]`, we'll create a folder called `[dinosaur]` in the `dinosaurs` directory. In there, create a `route.ts` file. In this file we'll read the `data.json` file, find the dinosaur with the name in the URL, and return it as JSON:

```ts title="route.ts"
import { NextRequest } from "next/server";
import data from "../data.json" with { type: "json" };

type RouteParams = { params: Promise<{ dinosaur: string }> };

export const GET = async (request: NextRequest, { params }: RouteParams) => {
  const { dinosaur } = await params;

  if (!dinosaur) {
    return Response.json("No dinosaur name provided.");
  }

  const dinosaurData = data.find((item) =>
    item.name.toLowerCase() === dinosaur.toLowerCase()
  );

  return Response.json(dinosaurData ? dinosaurData : "No dinosaur found.");
};
```

Now, if you run the app with `deno task dev` and visit `http://localhost:3000/api/dinosaurs/brachiosaurus` in your browser, you should see the details of the brachiosaurus dinosaur.

## Build the frontend

Now that we have our backend API set up, let's build the frontend to display the dinosaur data.

### Define the dinosaur type

First, we'll set up a new type to define the shape of the dinosaur data.
In the `app` directory, create a `types.ts` file and add the following code:

```ts title="types.ts"
export type Dino = { name: string; description: string };
```

### Update the homepage

We'll update the `page.tsx` file in the `app` directory to fetch the dinosaur data from our API and display it as a list of links. To execute client-side code in Next.js we need to use the `"use client"` directive at the top of the file. Then we'll import the modules that we'll need in this page and export the default function that will render the page:

```tsx title="page.tsx"
"use client";

import { useEffect, useState } from "react";
import { Dino } from "./types";
import Link from "next/link";

export default function Home() {
}
```

Inside the body of the `Home` function, we'll define a state variable to store the dinosaur data, and a `useEffect` hook to fetch the data from the API when the component mounts:

```tsx title="page.tsx"
const [dinosaurs, setDinosaurs] = useState<Dino[]>([]);

useEffect(() => {
  (async () => {
    const response = await fetch(`/api/dinosaurs`);
    const allDinosaurs = await response.json() as Dino[];
    setDinosaurs(allDinosaurs);
  })();
}, []);
```

Beneath this, still inside the body of the `Home` function, we'll return a list of links, each linking to the dinosaur's page:

```tsx title="page.tsx"
return (
  <main>
    <h1>Welcome to the Dinosaur app</h1>
    <p>Click on a dinosaur below to learn more.</p>
    <ul>
      {dinosaurs.map((dinosaur: Dino) => {
        return (
          <li key={dinosaur.name}>
            <Link href={`/${dinosaur.name.toLowerCase()}`}>
              {dinosaur.name}
            </Link>
          </li>
        );
      })}
    </ul>
  </main>
);
```

### Create the dinosaur page

Inside the `app` directory, create a new folder called `[dinosaur]`. Inside this folder create a `page.tsx` file. This file will fetch the details of a specific dinosaur from the API and render them on the page.

Much like the homepage, we'll need client side code, and we'll import the modules we need and export a default function. We'll pass the incoming `params` to the function and set up a type for this parameter:

```tsx title="[dinosaur]/page.tsx"
"use client";

import { useEffect, useState } from "react";
import { Dino } from "../types";
import Link from "next/link";

type RouteParams = { params: Promise<{ dinosaur: string }> };

export default function Dinosaur({ params }: RouteParams) {
}
```

Inside the body of the `Dinosaur` function we'll get the selected dinosaur from the request, set up a state variable to store the dinosaur data, and write a `useEffect` hook to fetch the data from the API when the component mounts:

```tsx title="[dinosaur]/page.tsx"
const selectedDinosaur = params.then((params) => params.dinosaur);
const [dinosaur, setDino] = useState<Dino>({ name: "", description: "" });

useEffect(() => {
  (async () => {
    const resp = await fetch(`/api/dinosaurs/${await selectedDinosaur}`);
    const dino = await resp.json() as Dino;
    setDino(dino);
  })();
}, []);
```

Finally, still inside the `Dinosaur` function body, we'll return a paragraph element containing the dinosaur's name and description:

```tsx title="[dinosaur]/page.tsx"
return (
  <main>
    <h1>{dinosaur.name}</h1>
    <p>{dinosaur.description}</p>
    <Link href="/">🠠 Back to all dinosaurs</Link>
  </main>
);
```

## Run the app

Now you can run the app with `deno task dev` and visit `http://localhost:3000` in your browser to see the list of dinosaurs. Click on a dinosaur to see more details!

![demo of the app](./images/how-to/next/dinoapp.gif)

🦕 Now you can build and run a Next.js app with Deno! To build on your app you could consider [adding a database](/runtime/tutorials/connecting_to_databases/) to replace your `data.json` file, or consider [writing some tests](/runtime/fundamentals/testing/) to make your app reliable and production ready.

---

# Build a Nuxt app with Deno

> Step-by-step guide to building Nuxt applications with Deno. Learn how to create a full-stack Vue.js app, implement server-side rendering, add Tailwind styling, and deploy your application.

URL: https://docs.deno.com/examples/tutorials/nuxt

[Nuxt](https://nuxt.com/) is a framework that provides an intuitive way to create full-stack applications based on [Vue](https://vuejs.org/). It offers file-based routing, a variety of rendering options, and automatic code splitting out of the box. With its modular architecture, Nuxt simplifies the development process by providing a structured approach to building Vue applications.

In this tutorial, we'll build a simple Nuxt application with Deno that will display a list of dinosaurs and allow you to learn more about each one when you click on the name:

- [Scaffold a Nuxt app](#scaffold-a-nuxt-app-with-deno)
- [Setup server API routes](#setup-server-api-routes)
- [Setup Vue frontend](#setup-vue-frontend)
- [Add Tailwind](#add-tailwind)
- [Next steps](#next-steps)

You can find the code for this project in this [repo](https://github.com/denoland/examples/tree/main/with-nuxt).

## Scaffold a Nuxt app with Deno

We can create a new Nuxt project using Deno like this:

```bash
deno -A npm:nuxi@latest init
```

We'll use Deno to manage our package dependencies, and can grab the Nuxt package from npm.
This will create a nuxt-app with this project structure:

```
NUXT-APP/
├── .nuxt/              # Nuxt build directory
├── node_modules/       # Node.js dependencies
├── public/             # Static files
│   ├── favicon.ico
│   └── robots.txt
├── server/             # Server-side code
│   └── tsconfig.json
├── .gitignore
├── app.vue             # Root Vue component
├── nuxt.config.ts      # Nuxt configuration
├── package-lock.json   # NPM lock file
├── package.json        # Project manifest
├── README.md
└── tsconfig.json       # TypeScript configuration
```

## Setup server API routes

Let's first start by creating the API routes that serve the dinosaur data.

First, our [dinosaur data](https://github.com/denoland/examples/blob/main/with-nuxt/server/api/data.json) will live within the server directory as `server/api/data.json`:

```json title="server/api/data.json"
[
  {
    "name": "Aardonyx",
    "description": "An early stage in the evolution of sauropods."
  },
  {
    "name": "Abelisaurus",
    "description": "\"Abel's lizard\" has been reconstructed from a single skull."
  },
  {
    "name": "Abrictosaurus",
    "description": "An early relative of Heterodontosaurus."
  },
  ...etc
]
```

This is where our data will be pulled from. In a full application, this data would come from a database.

> ⚠️️ In this tutorial we hard code the data. But you can connect to
> [a variety of databases](https://docs.deno.com/runtime/tutorials/connecting_to_databases/)
> and [even use ORMs like Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/)
> with Deno.

This app will have two API routes. They will serve the following:

- the full list of dinosaurs for an index page
- individual dinosaur information for an individual dinosaur page

Both will be `*.get.ts` files, which Nuxt automatically converts to API endpoints to respond to `GET` requests. [The filename convention determines both the HTTP method and the route path](https://nuxt.com/docs/guide/directory-structure/server#matching-http-method).
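The convention bakes both the method and the URL into the file path. As a rough mental model, here is our own illustrative sketch of that mapping (this is not Nuxt's actual router code, just a way to read the filenames):

```typescript
// Illustrative only: turn a Nuxt server route filename into its endpoint.
// server/api/dinosaurs/[name].get.ts  ->  GET /api/dinosaurs/:name
function routeFor(file: string): { method: string; path: string } {
  const match = file.match(/^server\/api\/(.+)\.(get|post|put|delete)\.ts$/);
  if (!match) throw new Error(`not a server route file: ${file}`);
  // Bracketed segments like [name] become route parameters.
  const path = "/api/" + match[1].replace(/\[(\w+)\]/g, ":$1");
  return { method: match[2].toUpperCase(), path };
}

console.log(routeFor("server/api/dinosaurs.get.ts"));
// { method: "GET", path: "/api/dinosaurs" }
console.log(routeFor("server/api/dinosaurs/[name].get.ts"));
// { method: "GET", path: "/api/dinosaurs/:name" }
```

The two files we create below follow exactly this pattern: `dinosaurs.get.ts` answers `GET /api/dinosaurs`, and `dinosaurs/[name].get.ts` answers `GET /api/dinosaurs/:name`.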
The initial `dinosaurs.get.ts` is fairly simple and uses [`defineCachedEventHandler`](https://nitro.build/guide/cache) to create a cached endpoint for better performance. This handler simply returns our full dinosaur data array without any filtering:

```tsx title="server/api/dinosaurs.get.ts"
import data from "./data.json" with { type: "json" };

export default defineCachedEventHandler(() => {
  return data;
});
```

The `GET` route for the individual dinosaur has a little more logic. It extracts the name parameter from the event context, performs case-insensitive matching to find the requested dinosaur, and includes proper error handling for missing or invalid dinosaur names. We'll create a `dinosaurs` directory, then to pass the name parameter, we'll make a new file named `[name].get.ts`:

```tsx title="server/api/dinosaurs/[name].get.ts"
import data from "../data.json";

export default defineCachedEventHandler((event) => {
  const name = getRouterParam(event, "name");

  if (!name) {
    throw createError({
      statusCode: 400,
      message: "No dinosaur name provided",
    });
  }

  const dinosaur = data.find(
    (dino) => dino.name.toLowerCase() === name.toLowerCase(),
  );

  if (!dinosaur) {
    throw createError({
      statusCode: 404,
      message: "Dinosaur not found",
    });
  }

  return dinosaur;
});
```

Run the server with `deno task dev` and visit [http://localhost:3000/api/dinosaurs](http://localhost:3000/api/dinosaurs) in your browser, and you should see the raw JSON response showing all of the dinosaurs!

![Setting up API](./images/how-to/nuxt/nuxt-1.webp)

You can also retrieve data for a single dinosaur by visiting a particular dinosaur name, for example: [http://localhost:3000/api/dinosaurs/aardonyx](http://localhost:3000/api/dinosaurs/aardonyx).

![Setting up API](./images/how-to/nuxt/nuxt-2.webp)

Next, we'll set up the frontend with Vue to display the index page and each individual dinosaur page.
## Setup the Vue frontend

We want to set up two pages within the app:

- An index page which will list all of the dinosaurs
- An individual dinosaur page showing more information about the selected dinosaur

First, create the index page. Nuxt uses [file-system routing](https://nuxt.com/docs/getting-started/routing), so we will create a `pages` directory in the root, and within that an index page called `index.vue`. To get the data, we’ll use the `useFetch` composable to hit the API endpoint we created in the previous section:

```vue title="pages/index.vue"
<script setup lang="ts">
const { data: dinosaurs } = await useFetch("/api/dinosaurs");
</script>

<template>
  <main>
    <h1>Dinosaur Encyclopedia</h1>
    <ul>
      <li v-for="dinosaur in dinosaurs" :key="dinosaur.name">
        <NuxtLink :to="`/${dinosaur.name.toLowerCase()}`">
          {{ dinosaur.name }}
        </NuxtLink>
      </li>
    </ul>
  </main>
</template>
```

For the page that shows information on each dinosaur, we'll create a new dynamic page called `[name].vue`. This page uses Nuxt's [dynamic route parameters](https://nuxt.com/docs/getting-started/routing#route-parameters), where the `[name]` in the filename can be accessed in JavaScript as `route.params.name`. We’ll use the `useRoute` composable to access the route parameters and `useFetch` to get the specific dinosaur's data based on the name parameter:

```vue title="pages/[name].vue"
<script setup lang="ts">
const route = useRoute();
const { data: dinosaur } = await useFetch(
  `/api/dinosaurs/${route.params.name}`,
);
</script>

<template>
  <main>
    <h1>{{ dinosaur?.name }}</h1>
    <p>{{ dinosaur?.description }}</p>
    <NuxtLink to="/">Back to all dinosaurs</NuxtLink>
  </main>
</template>
```

Next, we’ll have to connect these Vue components together so that they render properly when we visit the root of the domain. Let’s update `app.vue` at the root of the directory to serve our application’s root component. We’ll use [`NuxtLayout`](https://nuxt.com/docs/api/components/nuxt-layout) for consistent page structure and [`NuxtPage`](https://nuxt.com/docs/api/components/nuxt-page) for dynamic page rendering:

```vue title="app.vue"
<template>
  <NuxtLayout>
    <NuxtPage />
  </NuxtLayout>
</template>
```

Run the server with `deno task dev` and see how it looks at [http://localhost:3000](http://localhost:3000):
Looks great!

## Add Tailwind

Like we said, we're going to add a little bit of styling to this application. First, we'll set up a layout which will provide a consistent structure across all pages using Nuxt's layout system with [slot-based](https://vuejs.org/guide/components/slots) content injection:

```vue title="layouts/default.vue"
<template>
  <div>
    <slot />
  </div>
</template>
```

In this project, we’re also going to use [tailwind](https://tailwindcss.com/) for some basic design, so we need to install those dependencies:

```bash
deno install -D npm:tailwindcss npm:@tailwindcss/vite
```

Then, we're going to update the `nuxt.config.ts`. Import the Tailwind dependency and configure the Nuxt application for Deno compatibility. We'll enable development tools, and set up Tailwind CSS:

```tsx title="nuxt.config.ts"
import tailwindcss from "@tailwindcss/vite";

export default defineNuxtConfig({
  compatibilityDate: "2025-05-15",
  devtools: { enabled: true },
  nitro: {
    preset: "deno",
  },
  app: {
    head: {
      title: "Dinosaur Encyclopedia",
    },
  },
  css: ["~/assets/css/main.css"],
  vite: {
    plugins: [
      tailwindcss(),
    ],
  },
});
```

Next, create a new css file, `assets/css/main.css`, and add an `@import` that imports tailwind, as well as the tailwind utilities:

```css title="assets/css/main.css"
@import "tailwindcss";

@tailwind base;
@tailwind components;
@tailwind utilities;
```

## Running the application

We can then run the application using:

```bash
deno task dev
```

This will start the app at localhost:3000:
And we’re done! 🦕

Next steps for a Nuxt app might be to add authentication using the [Nuxt Auth](https://auth.nuxtjs.org/) module, implement state management with [Pinia](https://pinia.vuejs.org/), add server-side data persistence with [Prisma](https://docs.deno.com/examples/prisma_tutorial/) or [MongoDB](https://docs.deno.com/examples/mongoose_tutorial/), and set up automated testing with Vitest. These features would make it production-ready for larger applications.

---

# Handle OS signals

> Tutorial on handling operating system signals in Deno. Learn how to capture SIGINT and SIGBREAK events, manage signal listeners, and implement graceful shutdown handlers in your applications.

URL: https://docs.deno.com/examples/tutorials/os_signals

> ⚠️ Windows only supports listening for SIGINT and SIGBREAK as of Deno v1.23.

## Concepts

- [Deno.addSignalListener()](https://docs.deno.com/api/deno/~/Deno.addSignalListener) can be used to capture and monitor OS signals.
- [Deno.removeSignalListener()](https://docs.deno.com/api/deno/~/Deno.removeSignalListener) can be used to stop watching the signal.

## Set up an OS signal listener

APIs for handling OS signals are modelled after the already familiar [`addEventListener`](https://developer.mozilla.org/en-US/docs/Web/API/EventTarget/addEventListener) and [`removeEventListener`](https://developer.mozilla.org/en-US/docs/Web/API/EventTarget/removeEventListener) APIs.

> ⚠️ Note that listening for OS signals doesn't prevent the event loop from
> finishing, i.e. if there are no more pending async operations the process
> will exit.

You can use the `Deno.addSignalListener()` function for handling OS signals:

```ts title="add_signal_listener.ts"
console.log("Press Ctrl-C to trigger a SIGINT signal");

Deno.addSignalListener("SIGINT", () => {
  console.log("interrupted!");
  Deno.exit();
});

// Add a timeout to prevent process exiting immediately.
setTimeout(() => {}, 5000);
```

Run with:

```shell
deno run add_signal_listener.ts
```

You can use the `Deno.removeSignalListener()` function to unregister a previously added signal handler.

```ts title="signal_listeners.ts"
console.log("Press Ctrl-C to trigger a SIGINT signal");

const sigIntHandler = () => {
  console.log("interrupted!");
  Deno.exit();
};
Deno.addSignalListener("SIGINT", sigIntHandler);

// Add a timeout to prevent process exiting immediately.
setTimeout(() => {}, 5000);

// Stop listening for a signal after 1s.
setTimeout(() => {
  Deno.removeSignalListener("SIGINT", sigIntHandler);
}, 1000);
```

Run with:

```shell
deno run signal_listeners.ts
```

---

# Distributed Tracing with Context Propagation in Deno

> Implement end-to-end distributed tracing with automatic context propagation in Deno applications. This tutorial covers creating traced services, automatic propagation of trace context, and visualizing distributed traces.

URL: https://docs.deno.com/examples/tutorials/otel_span_propagation

Modern applications are often built as distributed systems with multiple services communicating with each other. When debugging issues or optimizing performance in these systems, it's crucial to be able to trace requests as they flow through different services. This is where distributed tracing comes in.

As of Deno 2.3, the runtime now automatically preserves trace context across service boundaries, making end-to-end tracing in distributed systems simpler and more powerful. This means that when one service makes a request to another, the trace context is automatically propagated, allowing you to see the entire request flow as a single trace.

## Setting up a distributed system

Our example system will consist of two parts:

1. A server that provides an API endpoint
2.
A client that makes requests to the server

### The server

We'll set up a simple HTTP server that responds to GET requests with a JSON message:

```ts title="server.ts"
import { trace } from "npm:@opentelemetry/api@1";

const tracer = trace.getTracer("api-server", "1.0.0");

// Create a simple API server with Deno.serve
Deno.serve({ port: 8000 }, (req) => {
  return tracer.startActiveSpan("process-api-request", async (span) => {
    // Add attributes to the span for better context
    span.setAttribute("http.route", "/");
    span.updateName("GET /");

    // Add a span event to see in traces
    span.addEvent("processing_request", {
      request_id: crypto.randomUUID(),
      timestamp: Date.now(),
    });

    // Simulate processing time
    await new Promise((resolve) => setTimeout(resolve, 50));

    console.log("Server: Processing request in trace context");

    // End the span when we're done
    span.end();

    return new Response(JSON.stringify({ message: "Hello from server!" }), {
      headers: { "Content-Type": "application/json" },
    });
  });
});
```

### The client

Now, let's create a client that will make requests to our server:

```ts title="client.ts"
import { SpanStatusCode, trace } from "npm:@opentelemetry/api@1";

const tracer = trace.getTracer("api-client", "1.0.0");

// Create a parent span for the client operation
await tracer.startActiveSpan("call-api", async (parentSpan) => {
  try {
    console.log("Client: Starting API call");

    // The fetch call inside this span will automatically:
    // 1. Create a child span for the fetch operation
    // 2. Inject the trace context into the outgoing request headers
    const response = await fetch("http://localhost:8000/");
    const data = await response.json();

    console.log(`Client: Received response: ${JSON.stringify(data)}`);

    parentSpan.addEvent("received_response", {
      status: response.status,
      timestamp: Date.now(),
    });
  } catch (error) {
    console.error("Error calling API:", error);
    if (error instanceof Error) {
      parentSpan.recordException(error);
    }
    parentSpan.setStatus({
      code: SpanStatusCode.ERROR,
      message: error instanceof Error ? error.message : String(error),
    });
  } finally {
    parentSpan.end();
  }
});
```

## Tracing with OpenTelemetry

Both the client and server code already include basic OpenTelemetry instrumentation:

1. Create a tracer - both files create a tracer using `trace.getTracer()` with a name and version.
2. Create spans - we use `startActiveSpan()` to create spans that represent operations.
3. Add context - we add attributes and events to spans to provide more context.
4. End spans - we make sure to end spans when operations are complete.

## Automatic context propagation

The magic happens when the client makes a request to the server. In the client code there is a fetch call to the server:

```ts
const response = await fetch("http://localhost:8000/");
```

Since this fetch call happens inside an active span, Deno automatically creates a child span for the fetch operation and injects the trace context into the outgoing request headers. When the server receives this request, Deno extracts the trace context from the request headers and establishes the server span as a child of the client's span.
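Concretely, the propagated context travels in the W3C `traceparent` request header. Deno reads and writes this header for you, but knowing its shape helps when inspecting requests. This small sketch (the IDs below are made-up sample values, not from a real trace) just splits a sample header into its four fields:

```typescript
// A sample W3C traceparent value:
// version "-" trace-id (32 hex chars) "-" parent-id (16 hex chars) "-" flags
const traceparent = "00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01";

const [version, traceId, parentId, flags] = traceparent.split("-");

console.log(version); // "00"
console.log(traceId.length); // 32 — identifies the whole distributed trace
console.log(parentId.length); // 16 — the span that made the request
console.log(flags); // "01" — the sampled flag is set
```

Every service that receives this header creates its spans under the same `trace-id`, which is why the client and server spans show up as one trace.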
## Running the example

To run this example, first start the server, giving your otel service a name:

```sh
OTEL_DENO=true OTEL_SERVICE_NAME=server deno run --unstable-otel --allow-net server.ts
```

Then, in another terminal, run the client, giving the client a different service name to make observing the propagation clearer:

```sh
OTEL_DENO=true OTEL_SERVICE_NAME=client deno run --unstable-otel --allow-net client.ts
```

You should see:

1. The client logs "Client: Starting API call"
2. The server logs "Server: Processing request in trace context"
3. The client logs the response received from the server

## Viewing traces

To actually see the traces, you'll need an OpenTelemetry collector and a visualization tool, [for example Grafana Tempo](/runtime/fundamentals/open_telemetry/#quick-start).

When you visualize the traces, you'll see:

1. A parent span from the client
2. Connected to a child span for the HTTP request
3. Connected to a span from the server
4. All as part of a single trace!

For example, in Grafana, the trace visualization may look like this:

![Viewing expanded traces in Grafana](./images/how-to/grafana/propagation.png)

🦕 Now that you understand distributed tracing with Deno, you could extend this to more complex systems with multiple services and async operations. With Deno's automatic context propagation, implementing distributed tracing in your applications has never been easier!

---

# How to use Planetscale with Deno

> Step-by-step guide to using Planetscale with Deno. Learn how to set up serverless MySQL databases, manage connections, execute queries, and build scalable applications with Planetscale's developer-friendly platform.

URL: https://docs.deno.com/examples/tutorials/planetscale

Planetscale is a MySQL-compatible serverless database that is designed with a developer workflow where developers can create, branch, and deploy databases from the command line.
[View source here.](https://github.com/denoland/examples/tree/main/with-planetscale)

We'll use the Planetscale serverless driver, `@planetscale/database`, to work with Deno. First we want to create `main.ts` and import the connect method from this package:

```tsx
import { connect } from "npm:@planetscale/database@^1.4";
```

## Configuring our connection

The connection requires three credentials: host, username, and password. These are database-specific, so we first need to create a database in Planetscale. You can do that by following the initial instructions [here](https://planetscale.com/docs/tutorials/planetscale-quick-start-guide). Don't worry about adding the schema; we can do that through `@planetscale/database`.

Once you have created the database, head to Overview, click "Connect", and choose "Connect with `@planetscale/database`" to get the host and username. Then click through to Passwords to create a new password for your database. Once you have all three, you can plug them in directly or, better, store them as environment variables:

```bash
export HOST=
export USERNAME=
export PASSWORD=
```

Then call them using `Deno.env`:

```tsx
const config = {
  host: Deno.env.get("HOST"),
  username: Deno.env.get("USERNAME"),
  password: Deno.env.get("PASSWORD"),
};

const conn = connect(config);
```

This will also work on Deno Deploy if you set the environment variables in the dashboard. Run with:

```shell
deno run --allow-net --allow-env main.ts
```

The `conn` object is now an open connection to our Planetscale database.
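Since all three values come from the environment, a missing variable would otherwise surface as a confusing `undefined` deep inside the driver. A small helper can fail fast instead. This is our own sketch (the `requireEnv` name and the stand-in `env` object are ours, not part of `@planetscale/database`):

```typescript
// Hypothetical helper (not part of the driver): look up a required
// config value and throw early instead of passing `undefined` along.
function requireEnv(
  env: Record<string, string | undefined>,
  key: string,
): string {
  const value = env[key];
  if (!value) {
    throw new Error(`Missing required environment variable: ${key}`);
  }
  return value;
}

// Stand-in values so the sketch is self-contained; in main.ts you could
// pass Deno.env.toObject() instead.
const env = { HOST: "example.psdb.cloud", USERNAME: "user", PASSWORD: "pw" };

const config = {
  host: requireEnv(env, "HOST"),
  username: requireEnv(env, "USERNAME"),
  password: requireEnv(env, "PASSWORD"),
};

console.log(config.host); // "example.psdb.cloud"
```

With this in place, a typo in the dashboard or shell exports fails at startup with a clear message rather than at the first query.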
## Creating and populating our database table

Now that you have the connection running, you can use `conn.execute()` with SQL commands to create tables and insert the initial data:

```tsx
await conn.execute(
  "CREATE TABLE dinosaurs (id int NOT NULL AUTO_INCREMENT PRIMARY KEY, name varchar(255) NOT NULL, description varchar(255) NOT NULL);",
);
await conn.execute(
  "INSERT INTO `dinosaurs` (id, name, description) VALUES (1, 'Aardonyx', 'An early stage in the evolution of sauropods.'), (2, 'Abelisaurus', 'Abels lizard has been reconstructed from a single skull.'), (3, 'Deno', 'The fastest dinosaur that ever lived.')",
);
```

## Querying Planetscale

We can use the same `conn.execute()` to also write our queries. Let's get a list of all our dinosaurs:

```tsx
const results = await conn.execute("SELECT * FROM `dinosaurs`");
console.log(results.rows);
```

The result:

```tsx
[
  {
    id: 1,
    name: "Aardonyx",
    description: "An early stage in the evolution of sauropods.",
  },
  {
    id: 2,
    name: "Abelisaurus",
    description: "Abels lizard has been reconstructed from a single skull.",
  },
  { id: 3, name: "Deno", description: "The fastest dinosaur that ever lived." },
];
```

We can also get just a single row from the database by specifying a dinosaur name:

```tsx
const result = await conn.execute(
  "SELECT * FROM `dinosaurs` WHERE `name` = 'Deno'",
);
console.log(result.rows);
```

Which gives us a single row result:

```tsx
[{ id: 3, name: "Deno", description: "The fastest dinosaur that ever lived." }];
```

You can find out more about working with Planetscale in their [docs](https://planetscale.com/docs).

---

# How to create a RESTful API with Prisma and Oak

> Guide to building a RESTful API using Prisma and Oak with Deno. Learn how to set up database schemas, generate clients, implement CRUD operations, and deploy your API with proper type safety.

URL: https://docs.deno.com/examples/tutorials/prisma

[Prisma](https://prisma.io) has been one of our top requested modules to work with in Deno.
The demand is understandable, given that Prisma's developer experience is top notch and plays well with so many persistent data storage technologies.

We're excited to show you how to use Prisma with Deno.

In this How To guide, we'll set up a simple RESTful API in Deno using Oak and Prisma.

Let's get started.

[View source](https://github.com/denoland/examples/tree/main/with-prisma) or [check out the video guide](https://youtu.be/P8VzA_XSF8w).

## Setup the application

Let's create the folder `rest-api-with-prisma-oak` and navigate there:

```shell
mkdir rest-api-with-prisma-oak
cd rest-api-with-prisma-oak
```

Then, let's run `prisma init` with Deno:

```shell
deno run --allow-read --allow-env --allow-write npm:prisma@latest init
```

This will generate [`prisma/schema.prisma`](https://www.prisma.io/docs/concepts/components/prisma-schema). Let's update it with the following:

```prisma
generator client {
  provider        = "prisma-client-js"
  previewFeatures = ["deno"]
  output          = "../generated/client"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model Dinosaur {
  id          Int    @id @default(autoincrement())
  name        String @unique
  description String
}
```

Prisma also generates a `.env` file with a `DATABASE_URL` environment variable. Let's assign `DATABASE_URL` to a PostgreSQL connection string. In this example, we'll use a free [PostgreSQL database from Supabase](https://supabase.com/database).

Next, let's create the database schema:

```shell
deno run -A npm:prisma@latest db push
```

After that's complete, we'll need to generate a Prisma Client:

```shell
deno run -A --unstable-detect-cjs npm:prisma@latest generate --no-engine
```

## Setup Accelerate in the Prisma Data Platform

To get started with the Prisma Data Platform:

1. Sign up for a free [Prisma Data Platform account](https://console.prisma.io).
2. Create a project.
3. Navigate to the project you created.
4. Enable Accelerate by providing your database's connection string.
5.
Generate an Accelerate connection string and copy it to your clipboard.

Assign the Accelerate connection string, which begins with `prisma://`, to `DATABASE_URL` in your `.env` file, replacing your existing connection string.

Next, let's create a seed script to seed the database.

## Seed your Database

Create `./prisma/seed.ts`:

```shell
touch prisma/seed.ts
```

And in `./prisma/seed.ts`:

```ts
import { Prisma, PrismaClient } from "../generated/client/deno/edge.ts";

const prisma = new PrismaClient({
  datasourceUrl: Deno.env.get("DATABASE_URL"),
});

const dinosaurData: Prisma.DinosaurCreateInput[] = [
  {
    name: "Aardonyx",
    description: "An early stage in the evolution of sauropods.",
  },
  {
    name: "Abelisaurus",
    description: "Abel's lizard has been reconstructed from a single skull.",
  },
  {
    name: "Acanthopholis",
    description: "No, it's not a city in Greece.",
  },
];

/**
 * Seed the database.
 */
for (const u of dinosaurData) {
  const dinosaur = await prisma.dinosaur.create({
    data: u,
  });
  console.log(`Created dinosaur with id: ${dinosaur.id}`);
}
console.log(`Seeding finished.`);

await prisma.$disconnect();
```

We can now run `seed.ts` with:

```shell
deno run -A --env prisma/seed.ts
```

> [!TIP]
>
> The `--env` flag is used to tell Deno to load environment variables from the
> `.env` file.

After doing so, you should be able to see your data on Prisma Studio by running the following command:

```bash
deno run -A npm:prisma studio
```

You should see something similar to the following screenshot:

![New dinosaurs are in Prisma dashboard](./images/how-to/prisma/1-dinosaurs-in-prisma.png)

## Create your API routes

We'll use [`oak`](https://jsr.io/@oak/oak) to create the API routes. Let's keep them simple for now. Let's create a `main.ts` file:

```shell
touch main.ts
```

Then, in your `main.ts` file:

```ts
import { PrismaClient } from "./generated/client/deno/edge.ts";
import { Application, Router } from "jsr:@oak/oak";

/**
 * Initialize.
 */
const prisma = new PrismaClient({
  datasources: {
    db: {
      url: Deno.env.get("DATABASE_URL"),
    },
  },
});
const app = new Application();
const router = new Router();

/**
 * Setup routes.
 */
router
  .get("/", (context) => {
    context.response.body = "Welcome to the Dinosaur API!";
  })
  .get("/dinosaur", async (context) => {
    // Get all dinosaurs.
    const dinosaurs = await prisma.dinosaur.findMany();
    context.response.body = dinosaurs;
  })
  .get("/dinosaur/:id", async (context) => {
    // Get one dinosaur by id.
    const { id } = context.params;
    const dinosaur = await prisma.dinosaur.findUnique({
      where: {
        id: Number(id),
      },
    });
    context.response.body = dinosaur;
  })
  .post("/dinosaur", async (context) => {
    // Create a new dinosaur.
    const { name, description } = await context.request.body.json();
    const result = await prisma.dinosaur.create({
      data: {
        name,
        description,
      },
    });
    context.response.body = result;
  })
  .delete("/dinosaur/:id", async (context) => {
    // Delete a dinosaur by id.
    const { id } = context.params;
    const dinosaur = await prisma.dinosaur.delete({
      where: {
        id: Number(id),
      },
    });
    context.response.body = dinosaur;
  });

/**
 * Setup middleware.
 */
app.use(router.routes());
app.use(router.allowedMethods());

/**
 * Start server.
 */
await app.listen({ port: 8000 });
```

Now, let's run it:

```shell
deno run -A --env main.ts
```

Let's visit `localhost:8000/dinosaur`:

![List of all dinosaurs from REST API](./images/how-to/prisma/2-dinosaurs-from-api.png)

Next, let's `POST` a new dinosaur with this `curl` command:

```shell
curl -X POST http://localhost:8000/dinosaur -H "Content-Type: application/json" -d '{"name": "Deno", "description":"The fastest, most secure, easiest to use Dinosaur ever to walk the Earth."}'
```

You should now see a new row on Prisma Studio:

![New dinosaur Deno in Prisma](./images/how-to/prisma/3-new-dinosaur-in-prisma.png)

Nice!

## What's next?
Building your next app will be more productive and fun with Deno and Prisma, since both technologies deliver an intuitive developer experience with data modeling, type-safety, and robust IDE support.

If you're interested in connecting Prisma to Deno Deploy, [check out this awesome guide](https://www.prisma.io/docs/guides/deployment/deployment-guides/deploying-to-deno-deploy).

---

# Build Qwik with Deno

> Step-by-step guide to building Qwik applications with Deno. Learn about resumability, server-side rendering, route handling, and how to create fast, modern web applications with zero client-side JavaScript by default.

URL: https://docs.deno.com/examples/tutorials/qwik

[Qwik](https://qwik.dev/) is a JavaScript framework that delivers instant-loading web applications by leveraging resumability instead of hydration. In this tutorial, we'll build a simple Qwik application and run it with Deno. The app will display a list of dinosaurs. When you click on one, it'll take you to a dinosaur page with more details.

We'll go over how to build a simple Qwik app using Deno:

- [Scaffold a Qwik app](#scaffold-a-qwik-app)
- [Setup data and type definitions](#setup-data-and-type-definitions)
- [Build the frontend](#build-the-frontend)
- [Next steps](#next-steps)

Feel free to skip directly to [the source code](https://github.com/denoland/examples/tree/main/with-qwik) or follow along below!

## Scaffold a Qwik app

We can create a new Qwik project using Deno like this:

```bash
deno init --npm qwik@latest
```

This will run you through the setup process for Qwik and Qwik City. Here, we chose the simplest “Empty App” deployment with npm dependencies.

When complete, you’ll have a project structure that looks like this:

```
.
├── node_modules/
├── public/
├── src/
│   ├── components/
│   │   └── router-head/
│   │       └── router-head.tsx
│   └── routes/
│       ├── index.tsx
│       ├── layout.tsx
│       ├── service-worker.ts
│       ├── entry.dev.tsx
│       ├── entry.preview.tsx
│       ├── entry.ssr.tsx
│       ├── global.css
│       └── root.tsx
├── .eslintignore
├── .eslintrc.cjs
├── .gitignore
├── .prettierignore
├── package-lock.json
├── package.json
├── qwik.env.d.ts
├── README.md
├── tsconfig.json
└── vite.config.ts
```

Most of this is boilerplate configuration that we won’t touch. A few of the important files to know for how Qwik works are:

- `src/components/router-head/router-head.tsx`: Manages the HTML head elements (like title, meta tags, etc.) across different routes in your Qwik application.
- `src/routes/index.tsx`: The main entry point and home page of your application that users see when they visit the root URL.
- `src/routes/layout.tsx`: Defines the common layout structure that wraps around pages, allowing you to maintain consistent UI elements like headers and footers.
- `src/routes/service-worker.ts`: Handles Progressive Web App (PWA) functionality, offline caching, and background tasks for your application.
- `src/routes/entry.ssr.tsx`: Controls how your application is server-side rendered, managing the initial HTML generation and hydration process.
- `src/routes/root.tsx`: The root component that serves as the application's shell, containing global providers and the main routing structure.

Now we can build out our own routes and files within the application.

## Setup data and type definitions

We’ll start by adding our [dinosaur data](https://github.com/denoland/examples/blob/main/with-qwik/src/data/dinosaurs.json) to a new `./src/data` directory as `dinosaurs.json`:

```jsonc
// ./src/data/dinosaurs.json
{
  "dinosaurs": [
    {
      "name": "Tyrannosaurus Rex",
      "description": "A massive carnivorous dinosaur with powerful jaws and tiny arms."
    },
    {
      "name": "Brachiosaurus",
      "description": "A huge herbivorous dinosaur with a very long neck."
    },
    {
      "name": "Velociraptor",
      "description": "A small but fierce predator that hunted in packs."
    }
    // ...
  ]
}
```

This is where our data will be pulled from. In a full application, this data would come from a database.

> ⚠️ In this tutorial we hard code the data. But you can connect to
> [a variety of databases](https://docs.deno.com/runtime/tutorials/connecting_to_databases/)
> and
> [even use ORMs like Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/)
> with Deno.

Next, let's add type definitions for our dinosaur data. We'll put it in `types.ts` in `./src/`:

```tsx
// ./src/types.ts
export type Dino = {
  name: string;
  description: string;
};
```

Next, let's add API routes to serve this data.

## Add API routes

First, let's create the route to load all dinosaurs for the index page. This API endpoint uses Qwik City's [`RequestHandler`](https://qwik.dev/docs/advanced/request-handling/) to create a `GET` endpoint that loads and returns our dinosaur data using the json helper for proper response formatting. We'll add the below to a new file in `./src/routes/api/dinosaurs/index.ts`:

```tsx
// ./src/routes/api/dinosaurs/index.ts
import { RequestHandler } from "@builder.io/qwik-city";
import data from "~/data/dinosaurs.json" with { type: "json" };

export const onGet: RequestHandler = async ({ json }) => {
  const dinosaurs = data;
  json(200, dinosaurs);
};
```

Next, let's create the API route to get the information for a single dinosaur. This takes the parameter from the URL and uses it to search through our dinosaur data. We'll add the below code to `./src/routes/api/dinosaurs/[name]/index.ts`:

```tsx
// ./src/routes/api/dinosaurs/[name]/index.ts
import { RequestHandler } from "@builder.io/qwik-city";
import data from "~/data/dinosaurs.json" with { type: "json" };

export const onGet: RequestHandler = async ({ params, json }) => {
  const { name } = params;
  const { dinosaurs } = data;
  if (!name) {
    json(400, { error: "No dinosaur name provided."
});
    return;
  }
  const dinosaur = dinosaurs.find(
    (dino) => dino.name.toLowerCase() === name.toLowerCase(),
  );
  if (!dinosaur) {
    json(404, { error: "No dinosaur found." });
    return;
  }
  json(200, dinosaur);
};
```

Now that the API routes are wired up and serving data, let's create the two frontend pages: the index page and the individual dinosaur detail pages.

## Build the frontend

We'll create our homepage by updating our `./src/routes/index.tsx` file using Qwik's [`routeLoader$`](https://qwik.dev/docs/route-loader/) for server-side data fetching. This `component$` loads and renders the dinosaur data during SSR via `useDinosaurs()`:

```tsx
// ./src/routes/index.tsx
import { component$ } from "@builder.io/qwik";
import { Link, routeLoader$ } from "@builder.io/qwik-city";
import type { Dino } from "~/types";
import data from "~/data/dinosaurs.json" with { type: "json" };

export const useDinosaurs = routeLoader$(() => {
  return data;
});

export default component$(() => {
  const dinosaursSignal = useDinosaurs();

  return (

    <main>
      <h1>Welcome to the Dinosaur app</h1>
      <p>Click on a dinosaur below to learn more.</p>
      <ul>
        {dinosaursSignal.value.dinosaurs.map((dinosaur: Dino) => (
          <li key={dinosaur.name}>
            <Link href={`/${dinosaur.name.toLowerCase()}`}>{dinosaur.name}</Link>
          </li>
        ))}
      </ul>
    </main>
  );
});
```

Now that we have our main index page, let's add a page for the individual dinosaur information. We'll use Qwik's [dynamic routing](https://qwik.dev/docs/routing/), with `[name]` as the key for each dinosaur.

This page leverages `routeLoader$` to fetch individual dinosaur details based on the URL parameter, with built-in error handling if the dinosaur isn't found. The component uses the same SSR pattern as our index page, but with parameter-based data loading and a simpler display layout for individual dinosaur details:

```tsx
// ./src/routes/[name]/index.tsx
import { component$ } from "@builder.io/qwik";
import { Link, routeLoader$ } from "@builder.io/qwik-city";
import type { Dino } from "~/types";
import data from "~/data/dinosaurs.json" with { type: "json" };

export const useDinosaurDetails = routeLoader$(({ params }): Dino => {
  const { dinosaurs } = data;
  const dinosaur = dinosaurs.find(
    (dino: Dino) => dino.name.toLowerCase() === params.name.toLowerCase(),
  );
  if (!dinosaur) {
    throw new Error("Dinosaur not found");
  }
  return dinosaur;
});

export default component$(() => {
  const dinosaurSignal = useDinosaurDetails();

  return (

    <main>
      <h1>{dinosaurSignal.value.name}</h1>
      <p>{dinosaurSignal.value.description}</p>
      <Link href="/">Back to all dinosaurs</Link>
    </main>
  );
});
```

Now that we have built our routes and the frontend components, we can run our application:

```bash
deno task dev
```

This will start the app at `localhost:5173`:
Tada!

## Next steps

🦕 Now you can build and run a Qwik app with Deno! Here are some ways you could enhance your dinosaur application:

- Add persistent data store [using a database like Postgres or MongoDB](https://docs.deno.com/runtime/tutorials/connecting_to_databases/) and an ORM like [Drizzle](https://docs.deno.com/examples/drizzle_tutorial/) or [Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/)
- Use Qwik's lazy loading capabilities for dinosaur images and components
- Add client-side state management
- Self-host your app to [AWS](https://docs.deno.com/runtime/tutorials/aws_lightsail/), [Digital Ocean](https://docs.deno.com/runtime/tutorials/digital_ocean/), and [Google Cloud Run](https://docs.deno.com/runtime/tutorials/google_cloud_run/)

---

# Build a React app with a starter template

> Complete guide to building React applications with Deno and Vite. Learn how to set up a project from a template, implement routing, add API endpoints, and deploy your full-stack TypeScript application.

URL: https://docs.deno.com/examples/tutorials/react

[React](https://reactjs.org) is the most widely used JavaScript frontend library. In this tutorial we'll build a simple React app with Deno. The app will display a list of dinosaurs. When you click on one, it'll take you to a dinosaur page with more details. You can see the [finished app repo on GitHub](https://github.com/denoland/tutorial-with-react).

![demo of the app](./images/how-to/react/react-dinosaur-app-demo.gif)

This tutorial will use [Vite](https://vitejs.dev/) to serve the app locally. Vite is a build tool and development server for modern web projects. It pairs well with React and Deno, leveraging ES modules and allowing you to import React components directly.
## Starter app

We've set up a [starter template for you to use](https://github.com/denoland/react-vite-ts-template). This will set up a basic starter app with React, Vite and a deno.json file for you to configure your project. Visit the GitHub repository at [https://github.com/denoland/react-vite-ts-template](https://github.com/denoland/react-vite-ts-template) and click the "Use this template" button to create a new repository.

Once you have created a new repository from the template, clone it to your local machine and navigate to the project directory.

## Clone the repository locally

```sh
git clone https://github.com/your-username/your-repo-name.git
cd your-repo-name
```

## Install the dependencies

Install the project dependencies by running:

```sh
deno install
```

## Run the dev server

Now you can serve your new react app by running:

```sh
deno run dev
```

This will start the Vite server; click the output link to localhost to see your app in the browser.

## About the template

The template repository you cloned comes with a basic React app. The app uses Vite as a dev server and provides a static file server built with [oak](https://jsr.io/@oak/oak) which will serve the built app when deployed. The React app is in the `client` folder and the backend server is in the `server` folder.

The `deno.json` file is used to configure the project and specify the permissions required to run the app. It contains the `tasks` field which defines the tasks that can be run with `deno run`. It has a `dev` task which runs the Vite server, a `build` task which builds the app with Vite, and a `serve` task which runs the backend server to serve the built app.

## Add a backend API

We'll build an API into the server provided by the template. This will be where we get our dinosaur data. In the `server` directory of your new project, create an `api` folder. In that folder, create a `data.json`, which will contain the hard coded dinosaur data.
Copy and paste [this json file](https://github.com/denoland/tutorial-with-react/blob/main/api/data.json) into the `api/data.json` file. (If you were building a real app, you would probably fetch this data from a database or an external API.)

We're going to build out some API routes that return dinosaur information into the server that came with the template. We'll need the [`cors` middleware](https://jsr.io/@tajpouria/cors) to enable [CORS](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS). Use the `deno install` command to add the cors dependency to your project:

```shell
deno install jsr:@tajpouria/cors
```

Next, update `server/main.ts` to import the required modules and create a new `Router` instance to define some routes:

```ts title="main.ts"
import { Application } from "jsr:@oak/oak/application";
import { Router } from "jsr:@oak/oak/router";
import { oakCors } from "@tajpouria/cors";
import routeStaticFilesFrom from "./util/routeStaticFilesFrom.ts";
import data from "./api/data.json" with { type: "json" };

export const app = new Application();
const router = new Router();
```

After this, in the same file, we'll define two routes. One at `/api/dinosaurs` to return all the dinosaurs, and `/api/dinosaurs/:dinosaur` to return a specific dinosaur based on the name in the URL:

```ts title="main.ts"
router.get("/api/dinosaurs", (context) => {
  context.response.body = data;
});

router.get("/api/dinosaurs/:dinosaur", (context) => {
  if (!context?.params?.dinosaur) {
    context.response.body = "No dinosaur name provided.";
    return;
  }

  const dinosaur = data.find((item) =>
    item.name.toLowerCase() === context.params.dinosaur.toLowerCase()
  );

  context.response.body = dinosaur ?? "No dinosaur found.";
});
```

At the bottom of the same file, attach the routes we just defined to the application.
We also must include the static file server from the template, and finally we'll start the server listening on port 8000:

```ts title="main.ts"
app.use(oakCors());
app.use(router.routes());
app.use(router.allowedMethods());
app.use(routeStaticFilesFrom([
  `${Deno.cwd()}/client/dist`,
  `${Deno.cwd()}/client/public`,
]));

if (import.meta.main) {
  console.log("Server listening on port http://localhost:8000");
  await app.listen({ port: 8000 });
}
```

You can run the API server with `deno run --allow-env --allow-net --allow-read server/main.ts`. We'll create a task to run this command in the background and update the dev task to run both the React app and the API server.

In your `deno.json` file, update the `tasks` field to include the following:

```diff title="deno.json"
{
  "tasks": {
+   "dev": "deno run -A npm:vite & deno run server:start",
    "build": "deno run -A npm:vite build",
    "server:start": "deno run -A --node-modules-dir --watch ./server/main.ts",
    "serve": "deno run build && deno run server:start"
  },
+ "nodeModulesDir": "auto",
```

If you run `deno run dev` now and visit `localhost:8000/api/dinosaurs` in your browser, you should see a JSON response of all of the dinosaurs.

## Update the entrypoint

The entrypoint for the React app is in the `client/src/main.tsx` file. Ours is going to be very basic:

```tsx title="main.tsx"
import { StrictMode } from "react";
import { createRoot } from "react-dom/client";
import "./index.css";
import App from "./App.tsx";

createRoot(document.getElementById("root")!).render(
  <StrictMode>
    <App />
  </StrictMode>,
);
```

## Add a router

The app will have two routes: `/` and `/:dinosaur`. We'll use [`react-router-dom`](https://reactrouter.com/en/main) to build out some routing logic, so we'll need to add the `react-router-dom` dependency to your project.
In the project root run:

```shell
deno install npm:react-router-dom
```

Update the `/src/App.tsx` file to import and use the [`BrowserRouter`](https://reactrouter.com/en/main/router-components/browser-router) component from `react-router-dom` and define the two routes:

```tsx title="App.tsx"
import { BrowserRouter, Route, Routes } from "react-router-dom";
import Index from "./pages/index.tsx";
import Dinosaur from "./pages/Dinosaur.tsx";
import "./App.css";

function App() {
  return (
    <BrowserRouter>
      <Routes>
        <Route path="/" element={<Index />} />
        <Route path="/:selectedDinosaur" element={<Dinosaur />} />
      </Routes>
    </BrowserRouter>
  );
}

export default App;
```

## Proxy to forward the api requests

Vite will be serving the application on port `3000` while our api is running on port `8000`. Therefore, we'll need to set up a proxy to allow the `api/` paths to be reachable by the router. Add a proxy setting to the `vite.config.ts`:

```diff title="vite.config.ts"
export default defineConfig({
  root: "./client",
  server: {
    port: 3000,
+   proxy: {
+     "/api": {
+       target: "http://localhost:8000",
+       changeOrigin: true,
+     },
+   },
  },
});
```

## Create the pages

We'll create two pages: `Index` and `Dinosaur`. The `Index` page will list all the dinosaurs and the `Dinosaur` page will show details of a specific dinosaur.

Create a `pages` folder in the `src` directory and inside that create two files: `index.tsx` and `Dinosaur.tsx`.
### Types

Both pages will use the `Dino` type to describe the shape of data they're expecting from the API, so let's create a `types.ts` file in the `src` directory:

```ts title="types.ts"
export type Dino = { name: string; description: string };
```

### index.tsx

This page will fetch the list of dinosaurs from the API and render them as links:

```tsx title="index.tsx"
import { useEffect, useState } from "react";
import { Link } from "react-router-dom";
import { Dino } from "../types.ts";

export default function Index() {
  const [dinosaurs, setDinosaurs] = useState<Dino[]>([]);

  useEffect(() => {
    (async () => {
      const response = await fetch(`/api/dinosaurs/`);
      const allDinosaurs = await response.json() as Dino[];
      setDinosaurs(allDinosaurs);
    })();
  }, []);

  return (

    <main>
      <h1>Welcome to the Dinosaur app</h1>
      <p>Click on a dinosaur below to learn more.</p>
      {dinosaurs.map((dinosaur: Dino) => {
        return (
          <Link
            to={`/${dinosaur.name.toLowerCase()}`}
            key={dinosaur.name}
            className="dinosaur"
          >
            {dinosaur.name}
          </Link>
        );
      })}
    </main>
  );
}
```

### Dinosaur.tsx

This page will fetch the details of a specific dinosaur from the API and render it in a paragraph:

```tsx title="Dinosaur.tsx"
import { useEffect, useState } from "react";
import { Link, useParams } from "react-router-dom";
import { Dino } from "../types.ts";

export default function Dinosaur() {
  const { selectedDinosaur } = useParams();
  const [dinosaur, setDino] = useState<Dino>({ name: "", description: "" });

  useEffect(() => {
    (async () => {
      const resp = await fetch(`/api/dinosaurs/${selectedDinosaur}`);
      const dino = await resp.json() as Dino;
      setDino(dino);
    })();
  }, [selectedDinosaur]);

  return (

    <div>
      <h1>{dinosaur.name}</h1>
      <p>{dinosaur.description}</p>
      <Link to="/">🠠 Back to all dinosaurs</Link>
    </div>
  );
}
```

### Styling the list of dinosaurs

Since we are displaying the list of dinosaurs on the main page, let's do some basic formatting. Add the following to the bottom of `src/App.css` to display our list of dinosaurs in an orderly fashion:

```css title="src/App.css"
.dinosaur {
  display: block;
}
```

## Run the app

To run the app use the task you set up earlier:

```sh
deno run dev
```

Navigate to the local Vite server in your browser (`localhost:3000`) and you should see the list of dinosaurs displayed, which you can click through to find out about each one.

![demo of the app](./images/how-to/react/react-dinosaur-app-demo.gif)

## Build and deploy

The template you cloned comes with a `serve` task that builds the app and serves it with the backend server. Run the following command to build and serve the app:

```sh
deno run serve
```

If you visit `localhost:8000` in your browser you should see the app running!

You can deploy this app to your favourite cloud provider. We recommend using [Deno Deploy](https://deno.com/deploy) for a simple and easy deployment experience. To deploy to Deno Deploy, visit the [Deno Deploy dashboard](https://dash.deno.com) and create a new project. You can then deploy the app by connecting your GitHub repository and selecting the branch you want to deploy.

Give the project a name, and make sure that the `build step` is set to `deno run build` and the `Entrypoint` is `server/main.ts`. Click the `Deploy Project` button and your app will be live!

🦕 Now you can scaffold and develop a React app with Vite and Deno! You’re ready to build blazing-fast web applications. We hope you enjoy exploring these cutting-edge tools; we can't wait to see what you make!

---

# How to use Redis with Deno

> Step-by-step guide to using Redis with Deno. Learn how to set up caching, implement message brokers, handle data streaming, and optimize your applications with Redis's in-memory data store.
URL: https://docs.deno.com/examples/tutorials/redis

[Redis](https://redis.io/) is an in-memory data store you can use for caching, as a message broker, or for streaming data.

[View source here.](https://github.com/denoland/examples/tree/main/with-redis)

Here we're going to set up Redis to cache data from an API call to speed up any subsequent requests for that data. We're going to:

- Set up a Redis client to save data from every API call in memory
- Set up a Deno server so we can easily request certain data
- Call the Github API within the server handler to get the data on first request
- Serve data from Redis on every subsequent request

We can do this within a single file, `main.ts`.

## Connecting to a Redis client

We'll use Deno's built-in `Deno.serve` API to get the information from the user to query our API, so the only module we need is the Redis client. We can grab the node package for Redis using the `npm:` specifier:

```tsx
import { createClient } from "npm:redis@^4.5";
```

We create a Redis client using `createClient` and connect to our local Redis server:

```tsx
// make a connection to the local instance of redis
const client = createClient({
  url: "redis://localhost:6379",
});

await client.connect();
```

You can also set host, user, password, and port individually in this [configuration](https://github.com/redis/node-redis/blob/master/docs/client-configuration.md) object.

## Setting up the server

Our server is going to act as a wrapper around the Github API. A client can call our server with a Github username in the URL pathname, such as `http://localhost:3000/{username}`.

Parsing out the pathname and calling the Github API will take place inside a handler function in our server. We strip the leading slash so we are left with a variable we can pass to the Github API as a username. We'll then pass the response back to the user.
```tsx
Deno.serve({ port: 3000 }, async (req) => {
  const { pathname } = new URL(req.url);
  // strip the leading slash
  const username = pathname.substring(1);
  const resp = await fetch(`https://api.github.com/users/${username}`);
  const user = await resp.json();
  return new Response(JSON.stringify(user), {
    headers: {
      "content-type": "application/json",
    },
  });
});
```

We'll run this with:

```shell
deno run --allow-net main.ts
```

If we then go to [http://localhost:3000/ry](http://localhost:3000/ry) in Postman, we'll get the Github response:

![uncached-redis-body.png](./images/how-to/redis/uncached-redis-body.png)

Let's cache this response using Redis.

## Checking the cache

Once we have our response from the Github API, we can cache this within Redis using `client.set`, with our username as the key and the user object as the value:

```tsx
await client.set(username, JSON.stringify(user));
```

Next time we request the same username, we can use `client.get` to get the cached user:

```tsx
const cached_user = await client.get(username);
```

This returns null if the key doesn't exist, so we can use it in some flow control. When we get the username, we'll initially check whether we already have that user in the cache. If we do, we'll serve the cached result. If not, we'll call the Github API to get the user, cache it, then serve the API result.
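The check-then-fetch flow described above can be sketched independently of Redis. In this sketch the `getUser` helper, the `Map`-backed cache, and the `fetchUser` callback are all illustrative stand-ins (not part of the tutorial code); the logic mirrors what the real handler does with `client.get` and `client.set`:

```typescript
// Cache-aside lookup: return the cached body if present, otherwise fetch,
// store, and return the fresh body. A Map stands in for the Redis client.
type Fetcher = (username: string) => Promise<string>;

async function getUser(
  cache: Map<string, string>,
  fetchUser: Fetcher,
  username: string,
): Promise<{ body: string; cached: boolean }> {
  const hit = cache.get(username);
  if (hit !== undefined) {
    return { body: hit, cached: true };
  }
  const body = await fetchUser(username);
  cache.set(username, body);
  return { body, cached: false };
}
```

The first call for a username reports `cached: false` and stores the body; every later call for the same username reports `cached: true` without touching the fetcher.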
In both cases, we'll add a custom header to show which version we're serving:

```tsx
Deno.serve({ port: 3000 }, async (req) => {
  const { pathname } = new URL(req.url);
  // strip the leading slash
  const username = pathname.substring(1);
  const cached_user = await client.get(username);
  if (cached_user) {
    return new Response(cached_user, {
      headers: {
        "content-type": "application/json",
        "is-cached": "true",
      },
    });
  } else {
    const resp = await fetch(`https://api.github.com/users/${username}`);
    const user = await resp.json();
    await client.set(username, JSON.stringify(user));
    return new Response(JSON.stringify(user), {
      headers: {
        "content-type": "application/json",
        "is-cached": "false",
      },
    });
  }
});
```

Running this the first time gives us the same response as above, and we'll see the `is-cached` header set to `false`:

![uncached-redis-header.png](./images/how-to/redis/uncached-redis-header.png)

But call with the same username again, and we get the cached result. The body is identical:

![cached-redis-body.png](./images/how-to/redis/cached-redis-body.png)

But the header shows we have the cache:

![cached-redis-header.png](./images/how-to/redis/cached-redis-header.png)

We can also see that the response was ~200ms quicker!

You can check out the Redis documentation [here](https://redis.io/docs/) and the Redis node package [here](https://github.com/redis/node-redis).

---

# Run a script

> A guide to creating and running basic scripts with Deno. Learn how to write and execute JavaScript and TypeScript code, understand runtime environments, and get started with fundamental Deno concepts.

URL: https://docs.deno.com/examples/tutorials/run_script

Deno is a secure runtime for JavaScript and TypeScript. A runtime is the environment where your code executes. It provides the necessary infrastructure for your programs to run, handling things like memory management, I/O operations, and interaction with external resources.
The runtime is responsible for translating your high-level code (JavaScript or TypeScript) into machine instructions that the computer can understand.

When you run JavaScript in a web browser (like Chrome, Firefox, or Edge), you’re using a browser runtime. Browser runtimes are tightly coupled with the browser itself. They provide APIs for manipulating the Document Object Model (DOM), handling events, making network requests, and more. These runtimes are sandboxed; they operate within the browser’s security model. They can’t access resources outside the browser, such as the file system or environment variables.

When you run your code with Deno, you’re executing your JavaScript or TypeScript code directly on your machine, outside the browser context. Therefore, Deno programs can access resources on the host computer, such as the file system, environment variables, and network sockets.

Deno provides a seamless experience for running JavaScript and TypeScript code. Whether you prefer the dynamic nature of JavaScript or the type safety of TypeScript, Deno has you covered.

## Running a script

In this tutorial we'll create a simple "Hello World" example in both JavaScript and TypeScript using Deno.

We'll define a `capitalize` function that capitalizes the first letter of a word. Then, we define a `hello` function that returns a greeting message with the capitalized name. Finally, we call the `hello` function with different names and print the output to the console.
### JavaScript

First, create a `hello-world.js` file and add the following code:

```js title="hello-world.js"
function capitalize(word) {
  return word.charAt(0).toUpperCase() + word.slice(1);
}

function hello(name) {
  return "Hello " + capitalize(name);
}

console.log(hello("john"));
console.log(hello("Sarah"));
console.log(hello("kai"));
```

Run the script using the `deno run` command:

```sh
$ deno run hello-world.js
Hello John
Hello Sarah
Hello Kai
```

### TypeScript

This TypeScript example is exactly the same as the JavaScript example above; the code just has the additional type information which TypeScript supports.

Create a `hello-world.ts` file and add the following code:

```ts title="hello-world.ts"
function capitalize(word: string): string {
  return word.charAt(0).toUpperCase() + word.slice(1);
}

function hello(name: string): string {
  return "Hello " + capitalize(name);
}

console.log(hello("john"));
console.log(hello("Sarah"));
console.log(hello("kai"));
```

Run the TypeScript script using the `deno run` command:

```sh
$ deno run hello-world.ts
Hello John
Hello Sarah
Hello Kai
```

🦕 Congratulations! Now you know how to create a simple script in both JS and TS and how to run it in Deno with the `deno run` command. Keep exploring the tutorials and examples to learn more about Deno!

---

# Snapshot testing

> Learn how to use snapshot testing in Deno to compare outputs against recorded references, making it easier to detect unintended changes in your code

URL: https://docs.deno.com/examples/tutorials/snapshot

Snapshot testing is a testing technique that captures the output of your code and compares it against a stored reference version. Rather than manually writing assertions for each property, you let the test runner record the entire output structure, making it easier to detect any unexpected changes.
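Conceptually, a snapshot assertion boils down to serializing a value deterministically and comparing the result to a previously recorded string. A simplified sketch of that idea (these helpers are illustrative only, not the actual `@std/testing/snapshot` implementation):

```typescript
// Serialize with sorted keys so property order doesn't affect the result.
// (The array replacer only covers top-level keys, so this sketch handles
// flat objects only.)
function serialize(value: Record<string, unknown>): string {
  return JSON.stringify(value, Object.keys(value).sort(), 2);
}

// A value "matches" its snapshot when a fresh serialization is identical
// to the stored reference string.
function matchesSnapshot(
  value: Record<string, unknown>,
  stored: string,
): boolean {
  return serialize(value) === stored;
}
```

The real module serializes many more types and manages the snapshot files for you; the point here is only that a snapshot test is, at its core, a string comparison against a recorded reference.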
The [Deno Standard Library](/runtime/fundamentals/standard_library/) has a [snapshot module](https://jsr.io/@std/testing/doc/snapshot), which enables developers to write tests which assert a value against a reference snapshot. This reference snapshot is a serialized representation of the original value and is stored alongside the test file.

## Basic usage

The `assertSnapshot` function will create a snapshot of a value and compare it to a reference snapshot, which is stored alongside the test file in the `__snapshots__` directory. To create an initial snapshot (or to update an existing snapshot), use the `-- --update` flag with the `deno test` command.

### Basic snapshot example

The below example shows how to use the snapshot library with the `Deno.test` API. We can test a snapshot of a basic object, containing string and number properties. The `assertSnapshot(t, a)` function compares the object against a stored snapshot. The `t` parameter is the test context that Deno provides, which the snapshot function uses to determine the test name and location for storing snapshots.

```ts title="example_test.ts"
import { assertSnapshot } from "jsr:@std/testing/snapshot";

Deno.test("isSnapshotMatch", async (t) => {
  const a = {
    hello: "world!",
    example: 123,
  };
  await assertSnapshot(t, a);
});
```

You will need to grant read and write file permissions in order for Deno to write a snapshot file and then read it to test the assertion. If it is the first time you are running the test and you do not already have a snapshot, add the `--update` flag:

```bash
deno test --allow-read --allow-write -- --update
```

If you already have a snapshot file, you can run the test with:

```bash
deno test --allow-read
```

The test will compare the current output of the object against the stored snapshot. If they match, the test passes; if they differ, the test fails.
The snapshot file will look like this:

```ts title="__snapshots__/example_test.ts.snap"
export const snapshot = {};

snapshot[`isSnapshotMatch 1`] = `
{
  example: 123,
  hello: "world!",
}
`;
```

You can edit your test to change the `hello` string to `"everyone!"` and run the test again with `deno test --allow-read`. This time the `assertSnapshot` function will throw an `AssertionError`, causing the test to fail because the snapshot created during the test does not match the one in the snapshot file.

## Updating snapshots

When adding new snapshot assertions to your test suite, or when intentionally making changes which cause your snapshots to fail, you can update your snapshots by running the snapshot tests in update mode. Tests can be run in update mode by passing the `--update` or `-u` flag as an argument when running the test. When this flag is passed, any snapshots which do not match will be updated.

```bash
deno test --allow-read --allow-write -- --update
```

:::note

New snapshots will only be created when the `--update` flag is present.

:::

## Permissions

When running snapshot tests, the `--allow-read` permission must be enabled, or else any calls to `assertSnapshot` will fail due to insufficient permissions. Additionally, when updating snapshots, the `--allow-write` permission must be enabled, as this is required in order to update snapshot files.

The `assertSnapshot` function will only attempt to read from and write to snapshot files. As such, the allow list for `--allow-read` and `--allow-write` can be limited to only include existing snapshot files, if desired.

## Version Control

Snapshot testing works best when changes to snapshot files are committed alongside other code changes. This allows changes to reference snapshots to be reviewed alongside the code changes that caused them, and ensures that when others pull your changes, their tests will pass without needing to update snapshots locally.
## Options

The `assertSnapshot` function can be called with an `options` object which offers greater flexibility and enables some non-standard use cases:

```ts
import { assertSnapshot } from "jsr:@std/testing/snapshot";

Deno.test("isSnapshotMatch", async (t) => {
  const a = {
    hello: "world!",
    example: 123,
  };
  await assertSnapshot(t, a, {/*custom options go here*/});
});
```

### serializer

When you run a test with `assertSnapshot`, the data you're testing needs to be converted to a string format that can be written to the snapshot file (when creating or updating snapshots) and compared with the existing snapshot (when validating). This conversion is called serialization.

The `serializer` option allows you to provide a custom serializer function. This custom function will be called by `assertSnapshot` and be passed the value being asserted. Your custom function must:

1. Return a `string`
2. Be deterministic (it will always produce the same output, given the same input)

The code below shows a practical example of creating and using a custom serializer function for snapshot testing. This serializer removes any ANSI color codes from a string using the [`stripColor`](https://jsr.io/@std/fmt/doc/colors) string formatter from the Deno Standard Library.

```ts title="example_test.ts"
import { assertSnapshot, serialize } from "jsr:@std/testing/snapshot";
import { stripColor } from "jsr:@std/fmt/colors";

/**
 * Serializes `actual` and removes ANSI escape codes.
 */
function customSerializer(actual: string) {
  return serialize(stripColor(actual));
}

Deno.test("Custom Serializer", async (t) => {
  const output = "\x1b[34mHello World!\x1b[39m";
  await assertSnapshot(t, output, {
    serializer: customSerializer,
  });
});
```

```ts title="__snapshots__/example_test.ts.snap"
export const snapshot = {};

snapshot[`Custom Serializer 1`] = `"Hello World!"`;
```

Custom serializers can be useful in a variety of scenarios:

- To remove irrelevant formatting (like ANSI codes shown above) and improve legibility
- To handle non-deterministic data. Timestamps, UUIDs, or random values can be replaced with placeholders
- To mask or remove sensitive data that shouldn't be saved in snapshots
- Custom formatting to present complex objects in a domain-specific format

### Serialization with `Deno.customInspect`

Because the default serializer uses `Deno.inspect` under the hood, you can set the property `Symbol.for("Deno.customInspect")` to a custom serialization function if desired:

```ts title="example_test.ts"
// example_test.ts
import { assertSnapshot } from "jsr:@std/testing/snapshot";

class HTMLTag {
  constructor(
    public name: string,
    public children: Array<HTMLTag | string> = [],
  ) {}

  public render(depth: number) {
    const indent = "  ".repeat(depth);
    let output = `${indent}<${this.name}>\n`;
    for (const child of this.children) {
      if (child instanceof HTMLTag) {
        output += `${child.render(depth + 1)}\n`;
      } else {
        output += `${indent}  ${child}\n`;
      }
    }
    output += `${indent}</${this.name}>`;
    return output;
  }

  public [Symbol.for("Deno.customInspect")]() {
    return this.render(0);
  }
}

Deno.test("Page HTML Tree", async (t) => {
  const page = new HTMLTag("html", [
    new HTMLTag("head", [
      new HTMLTag("title", [
        "Simple SSR Example",
      ]),
    ]),
    new HTMLTag("body", [
      new HTMLTag("h1", [
        "Simple SSR Example",
      ]),
      new HTMLTag("p", [
        "This is an example of how Deno.customInspect could be used to snapshot an intermediate SSR representation",
      ]),
    ]),
  ]);

  await assertSnapshot(t, page);
});
```

This test will produce the
following snapshot:

```ts title="__snapshots__/example_test.ts.snap"
export const snapshot = {};

snapshot[`Page HTML Tree 1`] = `<html>
  <head>
    <title>
      Simple SSR Example
    </title>
  </head>
  <body>
    <h1>
      Simple SSR Example
    </h1>
    <p>
      This is an example of how Deno.customInspect could be used to snapshot an intermediate SSR representation
    </p>
  </body>
</html>
`; ``` In contrast, when we remove the `Deno.customInspect` method, the test will produce the following snapshot: ```ts title="__snapshots__/example_test.ts.snap" export const snapshot = {}; snapshot[`Page HTML Tree 1`] = `HTMLTag { children: [ HTMLTag { children: [ HTMLTag { children: [ "Simple SSR Example", ], name: "title", }, ], name: "head", }, HTMLTag { children: [ HTMLTag { children: [ "Simple SSR Example", ], name: "h1", }, HTMLTag { children: [ "This is an example of how Deno.customInspect could be used to snapshot an intermediate SSR representation", ], name: "p", }, ], name: "body", }, ], name: "html", }`; ``` You can see that this second snapshot is much less readable. This is because: 1. The keys are sorted alphabetically, so the name of the element is displayed after its children 2. It includes a lot of extra information, causing the snapshot to be more than twice as long 3. It is not an accurate serialization of the HTML which the data represents Note that in this example it would be possible to achieve the same result by calling: ```ts await assertSnapshot(t, page.render(0)); ``` However, depending on the public API you choose to expose, this may not be practical. It is also worth considering that this could have an impact beyond your snapshot testing. For example, `Deno.customInspect` is also used to serialize objects when calling `console.log` (and in some other cases). This may or may not be desirable. ### `dir` and `path` The `dir` and `path` options allow you to control where the snapshot file will be saved to and read from. These can be absolute paths or relative paths. If relative, they will be resolved relative to the test file. For example, if your test file is located at `/path/to/test.ts` and the `dir` option is set to `snapshots`, then the snapshot file would be written to `/path/to/snapshots/test.ts.snap`. - `dir` allows you to specify the snapshot directory, while still using the default format for the snapshot file name. 
- `path` allows you to specify the directory and file name of the snapshot file. If your test file is located at `/path/to/test.ts` and the `path` option is set to `snapshots/test.snapshot`, then the snapshot file would be written to `/path/to/snapshots/test.snapshot`. :::note If both `dir` and `path` are specified, the `dir` option will be ignored and the `path` option will be handled as normal. ::: ### `mode` The `mode` option controls how `assertSnapshot` behaves regardless of command line flags and has two settings, `assert` or `update`: - `assert`: Always performs comparison only, ignoring any `--update` or `-u` flags. If snapshots don't match, the test will fail with an `AssertionError`. - `update`: Always updates snapshots. Any mismatched snapshots will be updated after tests complete. This option is useful when you need different snapshot behaviors within the same test suite: ```ts // Create a new snapshot or verify an existing one await assertSnapshot(t, stableComponent); // Always update this snapshot regardless of command line flags await assertSnapshot(t, experimentalComponent, { mode: "update", name: "experimental feature", }); // Always verify but never update this snapshot regardless of command line flags await assertSnapshot(t, criticalComponent, { mode: "assert", name: "critical feature", }); ``` ### `name` The name of the snapshot. If unspecified, the name of the test step will be used instead. ```ts title="example_test.ts" import { assertSnapshot } from "jsr:@std/testing/snapshot"; Deno.test("isSnapshotMatch", async (t) => { const a = { hello: "world!", example: 123, }; await assertSnapshot(t, a, { name: "Test Name", }); }); ``` ```ts title="__snapshots__/example_test.ts.snap" export const snapshot = {}; snapshot[`Test Name 1`] = ` { example: 123, hello: "world!", } `; ``` When `assertSnapshot` is run multiple times with the same value for name, then the suffix will be incremented as normal. i.e. `Test Name 1`, `Test Name 2`, `Test Name 3`, etc. 
### `msg` Used to set a custom error message. This will overwrite the default error message, which includes the diff for failed snapshots: ```ts Deno.test("custom error message example", async (t) => { const userData = { name: "John Doe", role: "admin", }; await assertSnapshot(t, userData, { msg: "User data structure has changed unexpectedly. Please verify your changes are intentional.", }); }); ``` When the snapshot fails, instead of seeing the default diff message, you'll see your custom error message. ## Testing Different Data Types Snapshot testing works with various data types and structures: ```ts Deno.test("snapshot various types", async (t) => { // Arrays await assertSnapshot(t, [1, 2, 3, "four", { five: true }]); // Complex objects await assertSnapshot(t, { user: { name: "Test", roles: ["admin", "user"] }, settings: new Map([["theme", "dark"], ["language", "en"]]), }); // Error objects await assertSnapshot(t, new Error("Test error message")); }); ``` ## Working with Asynchronous Code When testing asynchronous functions, ensure you await the results before passing them to the snapshot: ```ts Deno.test("async function test", async (t) => { const fetchData = async () => { // Simulate API call return { success: true, data: ["item1", "item2"] }; }; const result = await fetchData(); await assertSnapshot(t, result); }); ``` ## Best Practices ### Keep Snapshots Concise Avoid capturing large data structures that aren't necessary for your test. Focus on capturing only what's relevant. ### Descriptive Test Names Use descriptive test names that clearly indicate what's being tested: ```ts Deno.test( "renders user profile card with all required fields", async (t) => { // ... test code await assertSnapshot(t, component); }, ); ``` ### Review Snapshots During Code Reviews Always review snapshot changes during code reviews to ensure they represent intentional changes and not regressions. 
### Snapshot Organization

For larger projects, consider organizing snapshots by feature or component:

```ts
await assertSnapshot(t, component, {
  path: `__snapshots__/components/${componentName}.snap`,
});
```

## Snapshot Testing in CI/CD

### GitHub Actions Example

When running snapshot tests in CI environments, you'll typically want to verify existing snapshots rather than updating them:

```yaml title=".github/workflows/test.yml"
name: Test
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: denoland/setup-deno@v2
        with:
          deno-version: v2.x
      - name: Run tests
        run: deno test --allow-read
```

For pull requests that intentionally update snapshots, reviewers should verify the changes are expected before merging.

## Practical Examples

### Testing HTML Output

HTML output testing with snapshots is particularly useful for web applications where you want to ensure your components render the expected markup. This approach allows you to catch unintended changes in your HTML structure, attributes, or content that might affect the visual appearance or functionality of your UI components.

By capturing a snapshot of the HTML output, you can:

- Verify that UI components render correctly with different props/data
- Detect regressions when refactoring rendering logic
- Document the expected output format of components

```ts
Deno.test("HTML rendering", async (t) => {
  const renderComponent = () => {
    return `<div>
  <h2>User Profile</h2>
  <p>Username: testuser</p>
</div>
`; }; await assertSnapshot(t, renderComponent()); }); ``` ### Testing API Responses When building applications that interact with APIs, snapshot testing helps ensure that the structure and format of API responses remain consistent. This is particularly valuable for: - Maintaining backward compatibility when updating API integrations - Verifying that your API response parsing logic works correctly - Documenting the expected shape of API responses for team collaboration - Detecting unexpected changes in API responses that could break your application ```ts Deno.test("API response format", async (t) => { const mockApiResponse = { status: 200, data: { users: [ { id: 1, name: "User 1" }, { id: 2, name: "User 2" }, ], pagination: { page: 1, total: 10 }, }, }; await assertSnapshot(t, mockApiResponse); }); ``` 🦕 Snapshot testing is a powerful technique that complements traditional unit tests by allowing you to capture and verify complex outputs without writing detailed assertions. By incorporating snapshot tests into your testing strategy, you can catch unintended changes, document expected behavior, and build more resilient applications. --- # Build a SolidJS app with Deno > Build a SolidJS application with Deno. Learn how to set up a project, implement reactive components, handle routing, create API endpoints with Hono, and build a full-stack TypeScript application. URL: https://docs.deno.com/examples/tutorials/solidjs [SolidJS](https://www.solidjs.com/) is a declarative JavaScript library for creating user interfaces that emphasizes fine-grained reactivity and minimal overhead. When combined with Deno's modern runtime environment, you get a powerful, performant stack for building web applications. In this tutorial, we'll build a simple dinosaur catalog app that demonstrates the key features of both technologies. 
We'll go over how to build a simple SolidJS app using Deno:

- [Scaffold a SolidJS app](#scaffold-a-solidjs-app-with-vite)
- [Set up our Hono backend](#set-up-our-hono-backend)
- [Create our SolidJS frontend](#create-our-solidjs-frontend)
- [Next steps](#next-steps)

Feel free to skip directly to [the source code](https://github.com/denoland/examples/tree/main/with-solidjs) or follow along below!

## Scaffold a SolidJS app with Vite

Let's set up our SolidJS application using [Vite](https://vite.dev/), a modern build tool that provides an excellent development experience with features like hot module replacement and optimized builds.

```bash
deno init --npm vite@latest solid-deno --template solid-ts
```

Our backend will be powered by [Hono](https://hono.dev/), which we can install via [JSR](https://jsr.io). Let's also add `solidjs/router` for client-side routing and navigation between our dinosaur catalog pages.
```bash deno add jsr:@hono/hono npm:@solidjs/router ```
Learn more about deno add and using Deno as a package manager.
We'll also have to update our `deno.json` to include a few tasks and `compilerOptions` to run our app:
```json { "tasks": { "dev": "deno task dev:api & deno task dev:vite", "dev:api": "deno run --allow-env --allow-net --allow-read api/main.ts", "dev:vite": "deno run -A npm:vite", "build": "deno run -A npm:vite build", "serve": { "command": "deno task dev:api", "description": "Run the build, and then start the API server", "dependencies": ["deno task build"] } }, "imports": { "@hono/hono": "jsr:@hono/hono@^4.6.12", "@solidjs/router": "npm:@solidjs/router@^0.14.10" }, "compilerOptions": { "jsx": "react-jsx", "jsxImportSource": "solid-js", "lib": ["DOM", "DOM.Iterable", "ESNext"] } } ```
You can write your tasks as objects. Here our serve command includes a description and dependencies.
Great! Next, let's set up our API backend.

## Set up our Hono backend

Within our main directory, we will set up an `api/` directory and create two files. First, our dinosaur data file, [`api/data.json`](https://github.com/denoland/examples/blob/main/with-solidjs/api/data.json):

```jsonc
// api/data.json
[
  {
    "name": "Aardonyx",
    "description": "An early stage in the evolution of sauropods."
  },
  {
    "name": "Abelisaurus",
    "description": "\"Abel's lizard\" has been reconstructed from a single skull."
  },
  {
    "name": "Abrictosaurus",
    "description": "An early relative of Heterodontosaurus."
  },
  ...
]
```

This is where our data will be pulled from. In a full application, this data would come from a database.

> ⚠️️ In this tutorial we hard code the data. But you can connect to
> [a variety of databases](https://docs.deno.com/runtime/tutorials/connecting_to_databases/)
> and [even use ORMs like Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/)
> with Deno.

Secondly, we need our Hono server, `api/main.ts`:

```tsx
// api/main.ts
import { Hono } from "@hono/hono";
import data from "./data.json" with { type: "json" };

const app = new Hono();

app.get("/", (c) => {
  return c.text("Welcome to the dinosaur API!");
});

app.get("/api/dinosaurs", (c) => {
  return c.json(data);
});

app.get("/api/dinosaurs/:dinosaur", (c) => {
  if (!c.req.param("dinosaur")) {
    return c.text("No dinosaur name provided.");
  }

  const dinosaur = data.find((item) =>
    item.name.toLowerCase() === c.req.param("dinosaur").toLowerCase()
  );
  console.log(dinosaur);
  if (dinosaur) {
    return c.json(dinosaur);
  } else {
    return c.notFound();
  }
});

Deno.serve(app.fetch);
```

This Hono server provides two API endpoints:

- `GET /api/dinosaurs` to fetch all dinosaurs, and
- `GET /api/dinosaurs/:dinosaur` to fetch a specific dinosaur by name

This server will be started on `localhost:8000` when we run `deno task dev`.
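The case-insensitive lookup in the `:dinosaur` route boils down to a single `find` call. Here's a small standalone sketch of that matching logic, with a couple of the sample records inlined; the `findDinosaur` helper name is ours, for illustration only:

```typescript
type Dino = { name: string; description: string };

// A couple of records from api/data.json, inlined for the sketch
const data: Dino[] = [
  { name: "Aardonyx", description: "An early stage in the evolution of sauropods." },
  { name: "Abelisaurus", description: "\"Abel's lizard\" has been reconstructed from a single skull." },
];

// Same matching logic as the /api/dinosaurs/:dinosaur route handler:
// lowercase both sides so the URL segment's casing doesn't matter.
function findDinosaur(param: string): Dino | undefined {
  return data.find((item) => item.name.toLowerCase() === param.toLowerCase());
}

console.log(findDinosaur("ABELISAURUS")?.name); // "Abelisaurus"
console.log(findDinosaur("trex")); // undefined
```

Lowercasing both sides means a request like `/api/dinosaurs/aardonyx` still matches the record stored as `Aardonyx`, while an unknown name falls through to `c.notFound()`.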
Finally, before we start building out the frontend, let's update our `vite.config.ts` file with the below, especially the `server.proxy`, which informs our SolidJS frontend where to locate the API endpoint.

```tsx
// vite.config.ts
import { defineConfig } from "vite";
import solid from "vite-plugin-solid";

export default defineConfig({
  plugins: [solid()],
  server: {
    proxy: {
      "/api": {
        target: "http://localhost:8000",
        changeOrigin: true,
      },
    },
  },
});
```

## Create our SolidJS frontend

Before we begin building out the frontend components, let's quickly define the `Dino` type in `src/types.ts`:

```tsx
// src/types.ts
export type Dino = {
  name: string;
  description: string;
};
```

The `Dino` type interface ensures type safety throughout our application, defining the shape of our dinosaur data and enabling TypeScript's static type checking.

Next, let's set up our frontend to receive that data. We're going to have two pages:

- `Index.tsx`
- `Dinosaur.tsx`

Here's the code for the `src/pages/Index.tsx` page:

```tsx
// src/pages/Index.tsx
import { createSignal, For, onMount } from "solid-js";
import { A } from "@solidjs/router";
import type { Dino } from "../types.ts";

export default function Index() {
  const [dinosaurs, setDinosaurs] = createSignal<Dino[]>([]);

  onMount(async () => {
    try {
      const response = await fetch("/api/dinosaurs");
      const allDinosaurs = (await response.json()) as Dino[];
      setDinosaurs(allDinosaurs);
      console.log("Fetched dinosaurs:", allDinosaurs);
    } catch (error) {
      console.error("Failed to fetch dinosaurs:", error);
    }
  });

  return (
    <div>
      <h1>Welcome to the Dinosaur app</h1>
      <p>Click on a dinosaur below to learn more.</p>
      <For each={dinosaurs()}>
        {(dinosaur) => (
          <A href={`/${dinosaur.name.toLowerCase()}`}>{dinosaur.name}</A>
        )}
      </For>
    </div>
  );
}
```

When using SolidJS, there are a few key differences to React to be aware of:

1. We use SolidJS-specific primitives:
   - `createSignal` instead of `useState`
   - `createEffect` instead of `useEffect`
   - `For` component instead of `map`
   - `A` component instead of `Link`
2. SolidJS components use fine-grained reactivity, so we call signals as functions, e.g. `dinosaur()` instead of just `dinosaur`
3. The routing is handled by `@solidjs/router` instead of `react-router-dom`
4. Component imports use Solid's [`lazy`](https://docs.solidjs.com/reference/component-apis/lazy) for code splitting

The `Index` page uses SolidJS's `createSignal` to manage the list of dinosaurs and `onMount` to fetch the data when the component loads. We use the `For` component, which is SolidJS's efficient way of rendering lists, rather than using JavaScript's map function. The `A` component from `@solidjs/router` creates client-side navigation links to individual dinosaur pages, preventing full page reloads.

Now the individual dinosaur data page at `src/pages/Dinosaur.tsx`:

```tsx
// src/pages/Dinosaur.tsx
import { createSignal, onMount } from "solid-js";
import { A, useParams } from "@solidjs/router";
import type { Dino } from "../types.ts";

export default function Dinosaur() {
  const params = useParams();
  const [dinosaur, setDinosaur] = createSignal<Dino>({
    name: "",
    description: "",
  });

  onMount(async () => {
    const resp = await fetch(`/api/dinosaurs/${params.selectedDinosaur}`);
    const dino = (await resp.json()) as Dino;
    setDinosaur(dino);
    console.log("Dinosaur", dino);
  });

  return (
    <div>
      <h1>{dinosaur().name}</h1>
      <p>{dinosaur().description}</p>
      <A href="/">Back to all dinosaurs</A>
    </div>
  );
}
```

The `Dinosaur` page demonstrates SolidJS's approach to dynamic routing by using `useParams` to access the URL parameters. It follows a similar pattern to the `Index` page, using `createSignal` for state management and `onMount` for data fetching, but focuses on a single dinosaur's details. This `Dinosaur` component also shows how to access signal values in the template by calling them as functions (e.g., `dinosaur().name`), which is a key difference from React's state management.

Finally, to tie it all together, we'll update the `App.tsx` file, which will serve both the `Index` and `Dinosaur` pages as components. The `App` component sets up our routing configuration using `@solidjs/router`, defining two main routes: the index route for our dinosaur list and a dynamic route for individual dinosaur pages. The `:selectedDinosaur` parameter in the route path creates a dynamic segment that matches any dinosaur name in the URL.

```tsx
// src/App.tsx
import { Route, Router } from "@solidjs/router";
import Index from "./pages/Index.tsx";
import Dinosaur from "./pages/Dinosaur.tsx";
import "./App.css";

const App = () => {
  return (
    <Router>
      <Route path="/" component={Index} />
      <Route path="/:selectedDinosaur" component={Dinosaur} />
    </Router>
  );
};

export default App;
```

Finally, this `App` component will be called from our main index:

```tsx
// src/index.tsx
import { render } from "solid-js/web";
import App from "./App.tsx";
import "./index.css";

const wrapper = document.getElementById("root");

if (!wrapper) {
  throw new Error("Wrapper div not found");
}

render(() => <App />, wrapper);
```

The entry point of our application mounts the App component to the DOM using SolidJS's `render` function. It includes a safety check to ensure the root element exists before attempting to render, providing better error handling during initialization.

Now, let's run `deno task dev` to start both the frontend and backend together:
## Next steps 🦕 Now you can build and run a SolidJS app with Deno! Here are some ways you could enhance your dinosaur application: - Add persistent data store [using a database like Postgres or MongoDB](https://docs.deno.com/runtime/tutorials/connecting_to_databases/) and an ORM like [Drizzle](https://deno.com/blog/build-database-app-drizzle) or [Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/) - Implement global state using SolidJS's [`createContext`](https://docs.solidjs.com/reference/component-apis/create-context) for sharing data between components - Add loading states using [`createResource`](https://docs.solidjs.com/reference/basic-reactivity/create-resource)'s loading property - Implement route-based code splitting with [`lazy`](https://docs.solidjs.com/reference/component-apis/lazy) imports - Use `Index` component for more efficient list rendering - Deploy your app to [AWS](https://docs.deno.com/runtime/tutorials/aws_lightsail/), [Digital Ocean](https://docs.deno.com/runtime/tutorials/digital_ocean/), or [Google Cloud Run](https://docs.deno.com/runtime/tutorials/google_cloud_run/) The combination of SolidJS's unique reactive primitives, true DOM reconciliation, and Deno's modern runtime provides an incredibly efficient foundation for web development. With no Virtual DOM overhead and granular updates only where needed, your application can achieve optimal performance while maintaining clean, readable code. --- # Stubbing in tests > Learn how to use stubs in Deno to isolate code during testing by replacing function implementations with controlled behavior URL: https://docs.deno.com/examples/tutorials/stubbing Stubbing is a powerful technique for isolating the code you're testing by replacing functions with controlled implementations. 
While [spies](/examples/mocking_tutorial/#spying) monitor function calls without changing behavior, stubs go a step further by completely replacing the original implementation, allowing you to simulate specific conditions or behaviors during testing.

## What are stubs?

Stubs are fake implementations that replace real functions during testing. They let you:

- Control what values functions return
- Simulate errors or specific edge cases
- Prevent external services like databases or APIs from being called
- Test code paths that would be difficult to trigger with real implementations

Deno provides robust stubbing capabilities through the [Standard Library's testing tools](https://jsr.io/@std/testing/doc/mock#stubbing).

## Basic stub usage

Here's a simple example demonstrating how to stub a function. Note that `stub` replaces a property on an object, so the function to be stubbed is defined as a method rather than a bare module-level function:

```ts
import { assertEquals } from "jsr:@std/assert";
import { stub } from "jsr:@std/testing/mock";

// Original function, defined on an object so it can be stubbed
const userService = {
  getUserName(_id: number): string {
    // In a real app, this might call a database
    return "Original User";
  },
};

// Function under test
function greetUser(id: number): string {
  const name = userService.getUserName(id);
  return `Hello, ${name}!`;
}

Deno.test("greetUser with stubbed getUserName", () => {
  // Create a stub that returns a controlled value
  const getUserNameStub = stub(userService, "getUserName", () => "Test User");

  try {
    // Test with the stubbed function
    const greeting = greetUser(123);
    assertEquals(greeting, "Hello, Test User!");
  } finally {
    // Always restore the original function
    getUserNameStub.restore();
  }
});
```

In this example, we:

1. Import the necessary functions from Deno's standard library
2. Create a stub for the `getUserName` method that returns "Test User" instead of calling the real implementation
3. Call our function under test, which will use the stubbed implementation
4. Verify the result meets our expectations
5.
Restore the original function to prevent affecting other tests ## Using stubs in a testing scenario Let's look at a more practical example with a `UserRepository` class that interacts with a database: ```ts import { assertSpyCalls, returnsNext, stub } from "jsr:@std/testing/mock"; import { assertThrows } from "jsr:@std/assert"; type User = { id: number; name: string; }; // This represents our database access layer const database = { getUserById(id: number): User | undefined { // In a real app, this would query a database return { id, name: "Ada Lovelace" }; }, }; // The class we want to test class UserRepository { static findOrThrow(id: number): User { const user = database.getUserById(id); if (!user) { throw new Error("User not found"); } return user; } } Deno.test("findOrThrow method throws when the user was not found", () => { // Stub the database.getUserById function to return undefined using dbStub = stub(database, "getUserById", returnsNext([undefined])); // We expect this function call to throw an error assertThrows(() => UserRepository.findOrThrow(1), Error, "User not found"); // Verify the stubbed function was called once assertSpyCalls(dbStub, 1); }); ``` In this example: 1. We're testing the `findOrThrow` method, which should throw an error when a user is not found 2. We stub `database.getUserById` to return `undefined`, simulating a missing user 3. We verify that `findOrThrow` throws the expected error 4. We also check that the database method was called exactly once Note that we're using the `using` keyword with `stub`, which is a convenient way to ensure the stub is automatically restored when it goes out of scope. 
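To build intuition for what `stub` and the `using` pattern are doing, here is a rough sketch of the mechanics in plain TypeScript: swap a property out, remember the original, and put it back. This is a simplification for illustration, not the actual `@std/testing/mock` implementation:

```typescript
// Simplified sketch of stub mechanics (not the real @std/testing/mock code):
// replace a method on an object, keep a reference to the original, and
// restore it on demand.
function simpleStub<T extends object, K extends keyof T>(
  target: T,
  key: K,
  fake: T[K],
): { restore: () => void } {
  const original = target[key];
  target[key] = fake;
  return {
    restore() {
      target[key] = original;
    },
  };
}

// Usage: the fake is visible until restore() puts the original back.
const service = {
  greet(name: string): string {
    return `Hello, ${name}!`;
  },
};

const stubbed = simpleStub(service, "greet", () => "stubbed!");
console.log(service.greet("Ada")); // "stubbed!"
stubbed.restore();
console.log(service.greet("Ada")); // "Hello, Ada!"
```

The real `stub` does more on top of this: it records calls (which is why `assertSpyCalls` works on it) and implements the disposal protocol, which is what lets the `using` declaration restore it automatically at the end of the scope.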
## Advanced stub techniques

### Returning different values on subsequent calls

Sometimes you want a stub to return different values each time it's called. As before, the function lives on an object so that `stub` can replace it:

```ts
import { returnsNext, stub } from "jsr:@std/testing/mock";
import { assertEquals } from "jsr:@std/assert";

const api = {
  fetchData(): string {
    return "real data";
  },
};

Deno.test("stub with multiple return values", () => {
  const fetchDataStub = stub(
    api,
    "fetchData",
    // Return these values in sequence
    returnsNext(["first result", "second result", "third result"]),
  );

  try {
    assertEquals(api.fetchData(), "first result");
    assertEquals(api.fetchData(), "second result");
    assertEquals(api.fetchData(), "third result");
  } finally {
    fetchDataStub.restore();
  }
});
```

### Stubbing with implementation logic

You can also provide custom logic in your stub implementations:

```ts
import { stub } from "jsr:@std/testing/mock";
import { assertEquals } from "jsr:@std/assert";

const math = {
  calculate(a: number, b: number): number {
    return a + b;
  },
};

Deno.test("stub with custom implementation", () => {
  // Create a counter to track how many times the stub is called
  let callCount = 0;

  const calculateStub = stub(
    math,
    "calculate",
    (a: number, b: number) => {
      callCount++;
      return a + b * 2; // Custom implementation
    },
  );

  try {
    const result = math.calculate(5, 10);
    assertEquals(result, 25); // 5 + (10 * 2)
    assertEquals(callCount, 1);
  } finally {
    calculateStub.restore();
  }
});
```

## Stubbing API calls and external services

One of the most common uses of stubs is to replace API calls during testing:

```ts
import { assertEquals } from "jsr:@std/assert";
import { stub } from "jsr:@std/testing/mock";

async function fetchUserData(id: string) {
  const response = await fetch(`https://api.example.com/users/${id}`);
  if (!response.ok) {
    throw new Error(`Failed to fetch user: ${response.status}`);
  }
  return await response.json();
}

Deno.test("fetchUserData with stubbed fetch", async () => {
  const mockResponse = new Response(
    JSON.stringify({ id: "123", name: "Jane Doe" }),
    { status: 200, headers: { "Content-Type": "application/json" } },
  );

  // Replace global fetch with a stubbed
version const fetchStub = stub( globalThis, "fetch", () => Promise.resolve(mockResponse), ); try { const user = await fetchUserData("123"); assertEquals(user, { id: "123", name: "Jane Doe" }); } finally { fetchStub.restore(); } }); ``` ## Best practices 1. **Always restore stubs**: Use `try/finally` blocks or the `using` keyword to ensure stubs are restored, even if tests fail. 2. **Use stubs for external dependencies**: Stub out database calls, API requests, or file system operations to make tests faster and more reliable. 3. **Keep stubs simple**: Stubs should return predictable values that let you test specific scenarios. 4. **Combine with spies when needed**: Sometimes you need to both replace functionality (stub) and track calls (spy). 5. **Stub at the right level**: Stub at the interface boundary rather than deep within implementation details. 🦕 Stubs are a powerful tool for isolating your code during testing, allowing you to create deterministic test environments and easily test edge cases. By replacing real implementations with controlled behavior, you can write more focused, reliable tests that run quickly and consistently. For more testing resources, check out: - [Testing in isolation with mocks](/examples/mocking_tutorial/) - [Deno Standard Library Testing Modules](https://jsr.io/@std/testing) - [Basic Testing in Deno](/examples/testing_tutorial/) --- # Creating a subprocess > A guide to working with subprocesses in Deno. Learn how to spawn processes, handle input/output streams, manage process lifecycles, and implement inter-process communication patterns safely. URL: https://docs.deno.com/examples/tutorials/subprocess ## Concepts - Deno is capable of spawning a subprocess via [Deno.Command](https://docs.deno.com/api/deno/~/Deno.Command). - `--allow-run` permission is required to spawn a subprocess. - Spawned subprocesses do not run in a security sandbox. 
- Communicate with the subprocess via the [stdin](https://docs.deno.com/api/deno/~/Deno.stdin), [stdout](https://docs.deno.com/api/deno/~/Deno.stdout) and [stderr](https://docs.deno.com/api/deno/~/Deno.stderr) streams.

## Simple example

This example is the equivalent of running `echo "Hello from Deno!"` from the command line.

```ts title="subprocess_simple.ts"
// define command used to create the subprocess
const command = new Deno.Command("echo", {
  args: [
    "Hello from Deno!",
  ],
});

// create subprocess and collect output
const { code, stdout, stderr } = await command.output();

console.assert(code === 0);
console.log(new TextDecoder().decode(stdout));
console.log(new TextDecoder().decode(stderr));
```

Run it:

```shell
$ deno run --allow-run=echo ./subprocess_simple.ts
Hello from Deno!
```

## Security

The `--allow-run` permission is required for creation of a subprocess. Be aware that subprocesses are not run in a Deno sandbox and therefore have the same permissions as if you were to run the command from the command line yourself.

## Communicating with subprocesses

By default, when you use `Deno.Command()`, the subprocess inherits the `stdin`, `stdout` and `stderr` of the parent process. If you want to communicate with a started subprocess, you must use the `"piped"` option.

## Piping to files

This example is the equivalent of running `yes &> ./process_output.txt` in bash.
```ts title="subprocess_piping_to_files.ts"
import {
  mergeReadableStreams,
} from "jsr:@std/streams@1.0.0-rc.4/merge-readable-streams";

// create the file to attach the process to
const file = await Deno.open("./process_output.txt", {
  read: true,
  write: true,
  create: true,
});

// start the process
const command = new Deno.Command("yes", {
  stdout: "piped",
  stderr: "piped",
});

const process = command.spawn();

// example of combining stdout and stderr while sending to a file
const joined = mergeReadableStreams(
  process.stdout,
  process.stderr,
);

// returns a promise that resolves when the process is killed/closed
joined.pipeTo(file.writable).then(() => console.log("pipe join done"));

// manually stop the process ("yes" will never end on its own)
setTimeout(() => {
  process.kill();
}, 100);
```

Run it:

```shell
$ deno run --allow-run=yes --allow-read=. --allow-write=. ./subprocess_piping_to_files.ts
```

---

# Build an app with Tanstack and Deno

> Complete guide to building applications with Tanstack and Deno. Learn how to implement Query for data fetching, Router for navigation, manage server state, and create type-safe full-stack applications.

URL: https://docs.deno.com/examples/tutorials/tanstack

[Tanstack](https://tanstack.com/) is a set of framework-agnostic data management tools. With Tanstack, developers can manage server state efficiently with [Query](https://tanstack.com/query/latest), create powerful tables with [Table](https://tanstack.com/table/latest), handle complex routing with [Router](https://tanstack.com/router/latest), and build type-safe forms with [Form](https://tanstack.com/form/latest). These tools work seamlessly across [React](/examples/react_tutorial), [Vue](/examples/vue_tutorial), [Solid](/examples/solidjs_tutorial), and other frameworks while maintaining excellent TypeScript support.
In this tutorial, we'll build a simple app using [Tanstack Query](https://tanstack.com/query/latest) and [Tanstack Router](https://tanstack.com/router/latest/docs/framework/react/quick-start). The app will display a list of dinosaurs. When you click on one, it'll take you to a dinosaur page with more details.

- [Start with the backend API](#start-with-the-backend-api)
- [Create a Tanstack-driven frontend](#create-tanstack-driven-frontend)
- [Next steps](#next-steps)

Feel free to skip directly to [the source code](https://github.com/denoland/examples/tree/main/with-tanstack) or follow along below!

## Start with the backend API

Within our main directory, let's set up an `api/` directory and create our dinosaur data file, `api/data.json`:

```jsonc
// api/data.json
[
  {
    "name": "Aardonyx",
    "description": "An early stage in the evolution of sauropods."
  },
  {
    "name": "Abelisaurus",
    "description": "\"Abel's lizard\" has been reconstructed from a single skull."
  },
  {
    "name": "Abrictosaurus",
    "description": "An early relative of Heterodontosaurus."
  },
  ...
]
```

This is where our data will be pulled from. In a full application, this data would come from a database.

> ⚠️️ In this tutorial we hard-code the data. But you can connect to
> [a variety of databases](https://docs.deno.com/runtime/tutorials/connecting_to_databases/)
> and [even use ORMs like Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/)
> with Deno.

Now, let's create our [Hono](https://hono.dev/) server. We start by installing Hono from [JSR](https://jsr.io) with `deno add`:

```shell
deno add jsr:@hono/hono
```

Next, let's create an `api/main.ts` file and populate it with the below. Note we'll need to import [`@hono/hono/cors`](https://hono.dev/docs/middleware/builtin/cors) and define key attributes to allow the frontend to access the API routes.
```ts
// api/main.ts
import { Hono } from "@hono/hono";
import { cors } from "@hono/hono/cors";
import data from "./data.json" with { type: "json" };

const app = new Hono();

app.use(
  "/api/*",
  cors({
    origin: "http://localhost:5173",
    allowMethods: ["GET", "POST", "PUT", "DELETE"],
    allowHeaders: ["Content-Type", "Authorization"],
    exposeHeaders: ["Content-Type", "Authorization"],
    credentials: true,
    maxAge: 600,
  }),
);

app.get("/", (c) => {
  return c.text("Welcome to the dinosaur API!");
});

app.get("/api/dinosaurs", (c) => {
  return c.json(data);
});

app.get("/api/dinosaurs/:dinosaur", (c) => {
  if (!c.req.param("dinosaur")) {
    return c.text("No dinosaur name provided.");
  }

  const dinosaur = data.find((item) =>
    item.name.toLowerCase() === c.req.param("dinosaur").toLowerCase()
  );

  if (dinosaur) {
    return c.json(dinosaur);
  } else {
    return c.notFound();
  }
});

Deno.serve(app.fetch);
```

The Hono server provides two API endpoints:

- `GET /api/dinosaurs` to fetch all dinosaurs, and
- `GET /api/dinosaurs/:dinosaur` to fetch a specific dinosaur by name

Before we start working on the frontend, let's update our `deno tasks` in our `deno.json` file. Yours should look something like this:

```jsonc
{
  "tasks": {
    "dev": "deno --allow-env --allow-net api/main.ts"
  }
  // ...
}
```

Now, the backend server will be started on `localhost:8000` when we run `deno task dev`.

## Create Tanstack-driven frontend

Let's create the frontend that will use this data.
First, we'll quickly scaffold a new React app with Vite using the TypeScript template in the current directory:

```shell
deno init --npm vite@latest --template react-ts ./
```

Then, we'll install our Tanstack-specific dependencies:

```shell
deno install npm:@tanstack/react-query npm:@tanstack/react-router
```

Let's update our `deno tasks` in our `deno.json` to add a command to start the Vite server:

```jsonc
// deno.json
{
  "tasks": {
    "dev": "deno task dev:api & deno task dev:vite",
    "dev:api": "deno --allow-env --allow-net api/main.ts",
    "dev:vite": "deno -A npm:vite"
  }
  // ...
}
```

We can move onto building our components. We'll need two main pages for our app:

- `DinosaurList.tsx`: the index page, which will list out all the dinosaurs, and
- `DinosaurDetail.tsx`: the leaf page, which displays information about a single dinosaur

Let's create a new `./src/components` directory and, within that, the file `DinosaurList.tsx`:

```tsx
// ./src/components/DinosaurList.tsx
import { useQuery } from "@tanstack/react-query";
import { Link } from "@tanstack/react-router";

async function fetchDinosaurs() {
  const response = await fetch("http://localhost:8000/api/dinosaurs");
  if (!response.ok) {
    throw new Error("Failed to fetch dinosaurs");
  }
  return response.json();
}

export function DinosaurList() {
  const {
    data: dinosaurs,
    isLoading,
    error,
  } = useQuery({
    queryKey: ["dinosaurs"],
    queryFn: fetchDinosaurs,
  });

  if (isLoading) return <div>Loading...</div>;
  if (error instanceof Error) {
    return <div>An error occurred: {error.message}</div>;
  }

  return (
    <div>
      <h1>Dinosaur List</h1>
      <ul>
        {dinosaurs?.map((dino: { name: string; description: string }) => (
          <li key={dino.name}>
            <Link to="/dinosaur/$name" params={{ name: dino.name }}>
              {dino.name}
            </Link>
          </li>
        ))}
      </ul>
    </div>
  );
}
```

This uses [`useQuery`](https://tanstack.com/query/v4/docs/framework/react/guides/queries) from **Tanstack Query** to fetch and cache the dinosaur data automatically, with built-in loading and error states. Then it uses [`Link`](https://tanstack.com/router/v1/docs/framework/react/api/router/linkComponent) from **Tanstack Router** to create client-side navigation links with type-safe routing parameters.

Next, let's create the `DinosaurDetail.tsx` component in the `./src/components/` folder, which will show details about a single dinosaur:

```tsx
// ./src/components/DinosaurDetail.tsx
import { useParams } from "@tanstack/react-router";
import { useQuery } from "@tanstack/react-query";

async function fetchDinosaurDetail(name: string) {
  const response = await fetch(`http://localhost:8000/api/dinosaurs/${name}`);
  if (!response.ok) {
    throw new Error("Failed to fetch dinosaur detail");
  }
  return response.json();
}

export function DinosaurDetail() {
  const { name } = useParams({ from: "/dinosaur/$name" });
  const {
    data: dinosaur,
    isLoading,
    error,
  } = useQuery({
    queryKey: ["dinosaur", name],
    queryFn: () => fetchDinosaurDetail(name),
  });

  if (isLoading) return <div>Loading...</div>;
  if (error instanceof Error) {
    return <div>An error occurred: {error.message}</div>;
  }

  return (
    <div>
      <h1>{name}</h1>
      <p>{dinosaur?.description}</p>
    </div>
  );
}
```

Again, this uses `useQuery` from **Tanstack Query** to fetch and cache individual dinosaur details, with [`queryKey`](https://tanstack.com/query/latest/docs/framework/react/guides/query-keys) including the dinosaur name to ensure proper caching. Additionally, we use [`useParams`](https://tanstack.com/router/v1/docs/framework/react/api/router/useParamsHook) from **Tanstack Router** to safely extract and type the URL parameters defined in our route configuration.

Before we can run this, we need to encapsulate these components into a layout. Let's create another file in the `./src/components/` folder called `Layout.tsx`:

```tsx
// ./src/components/Layout.tsx
import { Outlet } from "@tanstack/react-router";

export function Layout() {
  return (
    <div>
      <h1>Dinosaur Encyclopedia</h1>
      <Outlet />
    </div>
  );
}
```

You may notice the [`Outlet`](https://tanstack.com/router/v1/docs/framework/react/guide/outlets) component towards the bottom of our newly created layout. This component is from **Tanstack Router** and renders the child route's content, allowing for nested routing while maintaining a consistent layout structure.

Next, we'll have to wire up this layout with `./src/main.tsx`, which is an important file that sets up the Tanstack Query client for managing server state and the Tanstack Router for handling navigation:

```tsx
// ./src/main.tsx
import React from "react";
import ReactDOM from "react-dom/client";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { createRouter, RouterProvider } from "@tanstack/react-router";
import { routeTree } from "./routeTree";

const queryClient = new QueryClient();
const router = createRouter({ routeTree });

declare module "@tanstack/react-router" {
  interface Register {
    router: typeof router;
  }
}

ReactDOM.createRoot(document.getElementById("root")!).render(
  <React.StrictMode>
    <QueryClientProvider client={queryClient}>
      <RouterProvider router={router} />
    </QueryClientProvider>
  </React.StrictMode>,
);
```

You'll notice we import [`QueryClientProvider`](https://tanstack.com/query/latest/docs/framework/react/reference/QueryClientProvider), which wraps the entire application to allow for query caching and state management. We also import `RouterProvider`, which connects our defined routes to React's rendering system.

Finally, we'll need to define a [`routeTree.tsx`](https://tanstack.com/router/v1/docs/framework/react/guide/route-trees) file in our `./src/` directory.
This file defines our application's routing structure using Tanstack Router's type-safe route definitions:

```ts
// ./src/routeTree.tsx
import { RootRoute, Route } from "@tanstack/react-router";
import { DinosaurList } from "./components/DinosaurList";
import { DinosaurDetail } from "./components/DinosaurDetail";
import { Layout } from "./components/Layout";

const rootRoute = new RootRoute({
  component: Layout,
});

const indexRoute = new Route({
  getParentRoute: () => rootRoute,
  path: "/",
  component: DinosaurList,
});

const dinosaurRoute = new Route({
  getParentRoute: () => rootRoute,
  path: "dinosaur/$name",
  component: DinosaurDetail,
});

export const routeTree = rootRoute.addChildren([indexRoute, dinosaurRoute]);
```

In `./src/routeTree.tsx`, we create a hierarchy of routes with `Layout` as the root component. Then we set two child routes, their paths and components — one for the dinosaur list, `DinosaurList`, and the other for the individual dinosaur details with a dynamic parameter, `DinosaurDetail`.

With all that complete, we can run this project:

```shell
deno task dev
```
## Next steps

This is just the beginning of building with Deno and Tanstack. You can add persistent data storage [using a database like Postgres or MongoDB](https://docs.deno.com/runtime/tutorials/connecting_to_databases/) and an ORM like [Drizzle](https://deno.com/blog/build-database-app-drizzle) or [Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/). Or deploy your app to [AWS](https://docs.deno.com/runtime/tutorials/aws_lightsail/), [Digital Ocean](https://docs.deno.com/runtime/tutorials/digital_ocean/), or [Google Cloud Run](https://docs.deno.com/runtime/tutorials/google_cloud_run/).

You could also add real-time updates using [Tanstack Query's refetching capabilities](https://tanstack.com/query/latest/docs/framework/react/examples/auto-refetching), [implement infinite scrolling](https://tanstack.com/query/latest/docs/framework/react/examples/load-more-infinite-scroll) for large dinosaur lists, or [add complex filtering and sorting](https://tanstack.com/table/v8/docs/guide/column-filtering) using **[Tanstack Table](https://tanstack.com/table/latest)**.

The combination of Deno's built-in web standards, tooling, and native TypeScript support, as well as Tanstack's powerful data management, opens up numerous possibilities for building robust web applications.

---

# Writing tests

> Learn key concepts like test setup and structure, assertions, async testing, mocking, test fixtures, and code coverage

URL: https://docs.deno.com/examples/tutorials/testing

Testing is critical in software development to ensure your code works as expected, and continues to work as you make changes. Tests verify that your functions, modules, and applications behave correctly, handle edge cases appropriately, and maintain expected performance characteristics.

## Why testing matters

Testing your code allows you to catch bugs, issues, or regressions before they reach production, saving time and resources.
Tests are also useful to help plan out the logic of your application; they can serve as a human-readable description of how your code is meant to be used. Deno provides [built-in testing capabilities](/runtime/fundamentals/testing/), making it straightforward to implement robust testing practices in your projects.

## Writing tests with `Deno.test`

Defining a test in Deno is straightforward: use the `Deno.test()` function to register your test with the test runner. This function accepts either a test name and function, or a configuration object with more detailed options.

All test functions in files that match patterns like `*_test.{ts,js,mjs,jsx,tsx}` or `*.test.{ts,js,mjs,jsx,tsx}` are automatically discovered and executed when you run the `deno test` command.

Here are the basic ways to define tests:

```ts
// Basic test with a name and function
Deno.test("my first test", () => {
  // Your test code here
});

// Test with configuration options
Deno.test({
  name: "my configured test",
  fn: () => {
    // Your test code here
  },
  ignore: false, // Optional: set to true to skip this test
  only: false, // Optional: set to true to only run this test
  permissions: { // Optional: specify required permissions
    read: true,
    write: false,
  },
});
```

### A simple example test

Let's start with a simple test. Create a file called `main_test.ts`; in it we will test a basic addition operation using Deno's testing API and the `assertEquals` function from the [Deno Standard Library](https://jsr.io/@std).
We use `Deno.test` and provide a name that describes what the test will do:

```ts title="main_test.ts"
// main_test.ts
import { assertEquals } from "jsr:@std/assert";

// Function we want to test
function add(a: number, b: number): number {
  return a + b;
}

Deno.test("basic addition test", () => {
  // Arrange - set up the test data
  const a = 1;
  const b = 2;

  // Act - call the function being tested
  const result = add(a, b);

  // Assert - verify the result is what we expect
  assertEquals(result, 3);
});
```

To run this test, use the `deno test` command:

```sh
deno test main_test.ts
```

You should see output indicating that your test has passed:

```
running 1 test from ./main_test.ts
basic addition test ... ok (2ms)

test result: ok. 1 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out (2ms)
```

Try changing the function implementation to make the test fail:

```ts
function add(a: number, b: number): number {
  return a - b; // Changed from addition to subtraction
}
```

You'll see an error message that clearly shows what went wrong:

```sh
running 1 test from ./main_test.ts
basic addition test ... FAILED (3ms)

failures:

basic addition test => ./main_test.ts:12:3
error: AssertionError: Values are not equal:

    [Diff] Actual / Expected

-   -1
+   3

  at assertEquals (https://jsr.io/@std/assert@0.218.2/assert_equals.ts:31:9)
  at Object.fn (file:///path/to/main_test.ts:12:3)
  at asyncOpSanitizer (ext:core/01_core.js:199:13)
  at Object.sanitizeOps (ext:core/01_core.js:219:15)
  at runTest (ext:test/06_test_runner.js:319:29)
  at test (ext:test/06_test_runner.js:593:7)

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out (3ms)
```

This clear feedback helps you quickly identify and fix issues in your code.

## Test structure and organization

Deno will automatically find and run tests that match naming patterns like `*_test.{ts,js,mjs,jsx,tsx}` or `*.test.{ts,js,mjs,jsx,tsx}`.
There are plenty of ways to organize your test files; we recommend co-locating your unit tests with the code they are testing, and keeping integration tests and configuration in a `tests` directory. This allows for immediate discovery of unit tests and simplified imports, while keeping a separation between different types of tests.

Here's an example of how you might structure your project with tests:

```sh
my-deno-project/
├── src/
│   ├── models/
│   │   ├── user.ts
│   │   ├── user_test.ts         // Unit tests for user model
│   │   ├── product.ts
│   │   └── product_test.ts      // Unit tests for product model
│   ├── services/
│   │   ├── auth-service.ts
│   │   ├── auth-service_test.ts // Unit tests for auth service
│   │   ├── data-service.ts
│   │   └── data-service_test.ts // Unit tests for data service
│   └── utils/
│       ├── helpers.ts
│       └── helpers_test.ts      // Unit tests for helpers
├── tests/
│   ├── integration/             // Integration tests directory
│   │   ├── api_test.ts          // Tests API endpoints
│   │   └── db_test.ts           // Tests database interactions
│   ├── e2e/                     // End-to-end tests
│   │   └── user_flow_test.ts    // Tests complete user workflows
│   └── fixtures/                // Shared test data and utilities
│       ├── test_data.ts         // Test data used across tests
│       └── setup.ts             // Common setup functions
├── main.ts
└── deno.json                    // Project configuration
```

This kind of structure offers a centralized place for test configuration while maintaining the benefits of co-locating unit tests with their relevant files. With this structure, you can:

```sh
# Run all tests
deno test

# Run only unit tests
deno test src/

# Run only integration tests
deno test tests/integration/

# Run specific module tests
deno test src/models/

# Run a specific test file
deno test src/models/user_test.ts
```

## Assertions

Assertions are the building blocks of effective tests, allowing you to verify that your code behaves as expected. They check if a specific condition is true and throw an error if it's not, causing the test to fail.
Good assertions are clear, specific, and help identify exactly what went wrong when a test fails. Deno doesn't include assertions in its core library, but you can import them from the [Deno standard library](https://jsr.io/@std/assert): ```ts import { assertArrayIncludes, // Check that array contains value assertEquals, // Check that values are equal assertExists, // Check that value is not null or undefined assertMatch, // Check that string matches regex pattern assertNotEquals, // Check that values are not equal assertObjectMatch, // Check that object has expected properties assertRejects, // Check that Promise rejects assertStrictEquals, // Check that values are strictly equal (===) assertStringIncludes, // Check that string contains substring assertThrows, // Check that function throws an error } from "jsr:@std/assert"; Deno.test("assertion examples", () => { // Basic assertions assertEquals(1 + 1, 2); assertNotEquals("hello", "world"); assertExists("Hello"); // String assertions assertStringIncludes("Hello, world!", "world"); assertMatch("deno@1.0.0", /^deno@\d+\.\d+\.\d+$/); // Object assertions assertObjectMatch( { name: "Jane", age: 25, city: "Tokyo" }, { name: "Jane" }, // Only checks specified properties ); // Strict equality (type + value) assertStrictEquals("deno", "deno"); // Error assertions assertThrows( () => { throw new Error("Something went wrong"); }, Error, "Something went wrong", ); }); ``` For those that prefer fluent assertions (familiar to users of Jest), you can use the `expect` module: ```ts import { expect } from "jsr:@std/expect"; Deno.test("expect style assertions", () => { // Basic matchers expect(5).toBe(5); expect({ name: "deno" }).toEqual({ name: "deno" }); // Collection matchers expect([1, 2, 3]).toContain(2); // Truthiness matchers expect(true).toBeTruthy(); expect(0).toBeFalsy(); expect(null).toBeNull(); expect(undefined).toBeUndefined(); // Number matchers expect(100).toBeGreaterThan(99); expect(1).toBeLessThan(2); // String 
matchers expect("Hello world").toMatch(/world/); // Function/error matchers expect(() => { throw new Error("fail"); }).toThrow(); }); ``` ### Real-world Example Here's a more realistic example testing a function that processes user data: ```ts // user_processor.ts export function validateUser(user: any): { valid: boolean; errors: string[] } { const errors: string[] = []; if (!user.name || typeof user.name !== "string") { errors.push("Name is required and must be a string"); } if (!user.email || !user.email.includes("@")) { errors.push("Valid email is required"); } if ( user.age !== undefined && (typeof user.age !== "number" || user.age < 18) ) { errors.push("Age must be a number and at least 18"); } return { valid: errors.length === 0, errors, }; } // user_processor_test.ts import { assertEquals } from "jsr:@std/assert"; import { validateUser } from "./user_processor.ts"; Deno.test("validateUser", async (t) => { await t.step("should validate a correct user object", () => { const user = { name: "John Doe", email: "john@example.com", age: 30, }; const result = validateUser(user); assertEquals(result.valid, true); assertEquals(result.errors.length, 0); }); await t.step("should return errors for invalid user", () => { const user = { name: "", email: "invalid-email", age: 16, }; const result = validateUser(user); assertEquals(result.valid, false); assertEquals(result.errors.length, 3); assertEquals(result.errors[0], "Name is required and must be a string"); assertEquals(result.errors[1], "Valid email is required"); assertEquals(result.errors[2], "Age must be a number and at least 18"); }); await t.step("should handle missing properties", () => { const user = { name: "Jane Doe", // email and age missing }; const result = validateUser(user); assertEquals(result.valid, false); assertEquals(result.errors.length, 1); assertEquals(result.errors[0], "Valid email is required"); }); }); ``` ## Async testing Deno handles async tests naturally. 
Just make your test function async and use await: ```ts import { assertEquals } from "jsr:@std/assert"; Deno.test("async test example", async () => { const response = await fetch("https://deno.land"); const status = response.status; assertEquals(status, 200); }); ``` ### Testing async functions When testing functions that return promises, you should always await the result: ```ts // async-function.ts export async function fetchUserData(userId: string) { const response = await fetch(`https://api.example.com/users/${userId}`); if (!response.ok) { throw new Error(`Failed to fetch user: ${response.status}`); } return await response.json(); } // async-function_test.ts import { assertEquals, assertRejects } from "jsr:@std/assert"; import { fetchUserData } from "./async-function.ts"; Deno.test("fetchUserData success", async () => { // Mock the fetch function for testing globalThis.fetch = async (url: string) => { const data = JSON.stringify({ id: "123", name: "Test User" }); return new Response(data, { status: 200 }); }; const userData = await fetchUserData("123"); assertEquals(userData.id, "123"); assertEquals(userData.name, "Test User"); }); Deno.test("fetchUserData failure", async () => { // Mock the fetch function to simulate an error globalThis.fetch = async (url: string) => { return new Response("Not Found", { status: 404 }); }; await assertRejects( async () => await fetchUserData("nonexistent"), Error, "Failed to fetch user: 404", ); }); ``` ## Mocking in tests Mocking is an essential technique for isolating the code being tested from its dependencies. Deno provides built-in utilities and third-party libraries for creating mocks. ### Basic Mocking You can create simple mocks by [replacing functions or objects with your own implementations](/examples/mocking_tutorial/). This allows you to control the behavior of dependencies and test how your code interacts with them. 
```ts // Example of a module with a function we want to mock const api = { fetchData: async () => { const response = await fetch("https://api.example.com/data"); return response.json(); }, }; // In your test file Deno.test("basic mocking example", async () => { // Store the original function const originalFetchData = api.fetchData; // Replace with mock implementation api.fetchData = async () => { return { id: 1, name: "Test Data" }; }; try { // Test using the mock const result = await api.fetchData(); assertEquals(result, { id: 1, name: "Test Data" }); } finally { // Restore the original function api.fetchData = originalFetchData; } }); ``` ### Using Spy Functions Spies allow you to track function calls without changing their behavior: ```ts import { spy } from "jsr:@std/testing/mock"; Deno.test("spy example", () => { // Create a spy on console.log const consoleSpy = spy(console, "log"); // Call the function we're spying on console.log("Hello"); console.log("World"); // Verify the function was called correctly assertEquals(consoleSpy.calls.length, 2); assertEquals(consoleSpy.calls[0].args, ["Hello"]); assertEquals(consoleSpy.calls[1].args, ["World"]); // Restore the original function consoleSpy.restore(); }); ``` For more advanced mocking techniques, check our [dedicated guide on mocking in Deno](/examples/mocking_tutorial/). ## Coverage Code coverage is a metric that helps you understand how much of your code is being tested. It measures which lines, functions, and branches of your code are executed during your tests, giving you insight into areas that might lack proper testing. Coverage analysis helps you to: - Identify untested parts of your codebase - Ensure critical paths have tests - Prevent regressions when making changes - Measure testing progress over time :::note High coverage doesn't guarantee high-quality tests. It simply shows what code was executed, not whether your assertions are meaningful or if edge cases are handled correctly. 
:::

Deno provides built-in coverage tools to help you analyze your test coverage. To collect coverage information:

```bash
deno test --coverage=coverage_dir
```

This generates coverage data in a specified directory (here, `coverage_dir`). To view a human-readable report:

```bash
deno coverage coverage_dir
```

You'll see output like:

```sh
file:///projects/my-project/src/utils.ts 85.7% (6/7)
file:///projects/my-project/src/models/user.ts 100.0% (15/15)
file:///projects/my-project/src/services/auth.ts 78.3% (18/23)
total: 87.5% (39/45)
```

For more detailed insights, you can also generate an HTML report:

```bash
deno coverage --html coverage_dir
```

This creates an interactive HTML report in the specified directory that shows exactly which lines are covered and which are not.

By default, the coverage tool automatically excludes:

- Test files (matching patterns like `test.ts` or `test.js`)
- Remote files (those not starting with `file:`)

This ensures your coverage reports focus on your application code rather than test files or external dependencies.

### Coverage Configuration

You can exclude files from coverage reports by using the `--exclude` flag:

```bash
deno coverage --exclude="test_,vendor/,_build/,node_modules/" coverage_dir
```

### Integrating with CI

For continuous integration environments, you might want to enforce a minimum coverage threshold:

```yaml
# In your GitHub Actions workflow
- name: Run tests with coverage
  run: deno test --coverage=coverage_dir

- name: Check coverage meets threshold
  run: |
    COVERAGE=$(deno coverage coverage_dir | grep "total:" | grep -o '[0-9]\+\.[0-9]\+')
    if (( $(echo "$COVERAGE < 80" | bc -l) )); then
      echo "Test coverage is below 80%: $COVERAGE%"
      exit 1
    fi
```

When working on your test coverage, set realistic goals and aim for meaningful coverage with high-quality tests rather than chasing 100% coverage.
## Comparison with other testing frameworks

If you're coming from other JavaScript testing frameworks, here's how Deno's testing capabilities compare:

| Feature       | Deno             | Jest                   | Mocha                      | Jasmine               |
| ------------- | ---------------- | ---------------------- | -------------------------- | --------------------- |
| Setup         | Built-in         | Requires installation  | Requires installation      | Requires installation |
| Syntax        | `Deno.test()`    | `test()`, `describe()` | `it()`, `describe()`       | `it()`, `describe()`  |
| Assertions    | From std library | Built-in expect        | Requires assertion library | Built-in expect       |
| Mocking       | From std library | Built-in jest.mock()   | Requires sinon or similar  | Built-in spies        |
| Async support | Native           | Needs special handling | Supports promises          | Supports promises     |
| File watching | `--watch` flag   | watch mode             | Requires nodemon           | Requires extra tools  |
| Code coverage | Built-in         | Built-in               | Requires istanbul          | Requires istanbul     |

### Testing Style Comparison

**Deno:**

```ts
import { assertEquals } from "jsr:@std/assert";

Deno.test("add function", () => {
  assertEquals(1 + 2, 3);
});
```

**Jest:**

```ts
test("add function", () => {
  expect(1 + 2).toBe(3);
});
```

**Mocha:**

```ts
import { assert } from "chai";

describe("math", () => {
  it("should add numbers", () => {
    assert.equal(1 + 2, 3);
  });
});
```

**Jasmine:**

```ts
describe("math", () => {
  it("should add numbers", () => {
    expect(1 + 2).toBe(3);
  });
});
```

## Next steps

🦕 Deno's built-in testing capabilities make it easy to write and run tests without needing to install extra testing frameworks or tools. By following the patterns and practices outlined in this tutorial, you can ensure your Deno applications are well-tested and reliable.
For more information about testing in Deno, check out:

- [Testing documentation](/runtime/fundamentals/testing)
- [Mocking data for tests](/examples/mocking_tutorial/)
- [Writing benchmark tests](/examples/benchmarking/)

---

# Build a Typesafe API with tRPC and Deno

> A guide to building type-safe APIs with tRPC and Deno. Learn how to set up endpoints, implement RPC procedures, handle data validation, and create efficient client-server applications.

URL: https://docs.deno.com/examples/tutorials/trpc

Deno is an [all-in-one, zero-config toolchain](https://docs.deno.com/runtime/manual/tools) for writing JavaScript and [TypeScript](https://docs.deno.com/runtime/fundamentals/typescript/) that [natively supports Web Platform APIs](https://docs.deno.com/runtime/reference/web_platform_apis/), making it an ideal choice for quickly building backends and APIs. To make our API easier to maintain, we can use [tRPC](https://trpc.io/), a TypeScript RPC ([Remote Procedure Call](https://en.wikipedia.org/wiki/Remote_procedure_call)) framework that enables you to build fully type-safe APIs without schema declarations or code generation.

In this tutorial, we'll build a simple type-safe API with tRPC and Deno that returns information about dinosaurs:

- [Set up tRPC](#set-up-trpc)
- [Set up the server](#set-up-the-trpc-server)
- [Set up the client](#set-up-the-trpc-client)
- [What's next?](#whats-next)

You can find all the code for this tutorial in [this GitHub repo](https://github.com/denoland/examples/tree/main/with-trpc).

## Set up tRPC

To get started with tRPC in Deno, we'll need to install the required dependencies.
Thanks to Deno's npm compatibility, we can use the npm versions of tRPC packages along with Zod for input validation:

```bash
deno install npm:@trpc/server@next npm:@trpc/client@next npm:zod jsr:@std/path
```

This installs the most recent tRPC server and client packages, [Zod](https://zod.dev/) for runtime type validation, and [the Deno Standard Library's `path`](https://jsr.io/@std/path) utility. These packages will allow us to build a type-safe API layer between our client and server code.

This will create a `deno.json` file in the project root to manage the npm and [jsr](https://jsr.io/) dependencies:

```json
{
  "imports": {
    "@std/path": "jsr:@std/path@^1.0.6",
    "@trpc/client": "npm:@trpc/client@^11.0.0-rc.593",
    "@trpc/server": "npm:@trpc/server@^11.0.0-rc.593",
    "zod": "npm:zod@^3.23.8"
  }
}
```

## Set up the tRPC server

The first step in building our tRPC application is setting up the server. We'll start by initializing tRPC and creating our base router and procedure builders. These will be the foundation for defining our API endpoints.

Create a `server/trpc.ts` file:

```ts
// server/trpc.ts
import { initTRPC } from "@trpc/server";

/**
 * Initialization of tRPC backend
 * Should be done only once per backend!
 */
const t = initTRPC.create();

/**
 * Export reusable router and procedure helpers
 * that can be used throughout the router
 */
export const router = t.router;
export const publicProcedure = t.procedure;
```

This initializes tRPC and exports the router and procedure builders that we'll use to define our API endpoints. The `publicProcedure` allows us to create endpoints that don't require authentication.

Next, we'll create a simple data layer to manage our dinosaur data.
Create a `server/db.ts` file with the following:

```ts
// server/db.ts
import { join } from "@std/path";

type Dino = { name: string; description: string };

const dataPath = join("data", "data.json");

async function readData(): Promise<Dino[]> {
  const data = await Deno.readTextFile(dataPath);
  return JSON.parse(data);
}

async function writeData(dinos: Dino[]): Promise<void> {
  await Deno.writeTextFile(dataPath, JSON.stringify(dinos, null, 2));
}

export const db = {
  dino: {
    findMany: () => readData(),
    findByName: async (name: string) => {
      const dinos = await readData();
      return dinos.find((dino) => dino.name === name);
    },
    create: async (data: { name: string; description: string }) => {
      const dinos = await readData();
      const newDino = { ...data };
      dinos.push(newDino);
      await writeData(dinos);
      return newDino;
    },
  },
};
```

This creates a simple file-based database that reads and writes dinosaur data to a JSON file. In a production environment, you'd typically use a proper database, but this will work well for our demo.

> ⚠️️ In this tutorial, we hard code data and use a file-based database. However,
> you can
> [connect to a variety of databases](https://docs.deno.com/runtime/tutorials/connecting_to_databases/)
> and use ORMs like [Drizzle](https://docs.deno.com/examples/drizzle) or
> [Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/).

Finally, we'll need to provide the actual data. Let's create a `data/data.json` file with some sample dinosaur data:

```jsonc
// data/data.json
[
  {
    "name": "Aardonyx",
    "description": "An early stage in the evolution of sauropods."
  },
  {
    "name": "Abelisaurus",
    "description": "\"Abel's lizard\" has been reconstructed from a single skull."
  },
  {
    "name": "Abrictosaurus",
    "description": "An early relative of Heterodontosaurus."
  },
  {
    "name": "Abrosaurus",
    "description": "A close Asian relative of Camarasaurus."
  },
  ...
]
```

Now, we can create our main server file that defines our tRPC router and procedures.
Create a `server/index.ts` file:

```ts
// server/index.ts
import { createHTTPServer } from "@trpc/server/adapters/standalone";
import { z } from "zod";
import { db } from "./db.ts";
import { publicProcedure, router } from "./trpc.ts";

const appRouter = router({
  dino: {
    list: publicProcedure.query(async () => {
      const dinos = await db.dino.findMany();
      return dinos;
    }),
    byName: publicProcedure.input(z.string()).query(async (opts) => {
      const { input } = opts;
      const dino = await db.dino.findByName(input);
      return dino;
    }),
    create: publicProcedure
      .input(z.object({ name: z.string(), description: z.string() }))
      .mutation(async (opts) => {
        const { input } = opts;
        const dino = await db.dino.create(input);
        return dino;
      }),
  },
  examples: {
    iterable: publicProcedure.query(async function* () {
      for (let i = 0; i < 3; i++) {
        await new Promise((resolve) => setTimeout(resolve, 500));
        yield i;
      }
    }),
  },
});

// Export type router type signature, this is used by the client.
export type AppRouter = typeof appRouter;

const server = createHTTPServer({
  router: appRouter,
});

server.listen(3000);
```

This sets up four main endpoints:

- `dino.list`: Returns all dinosaurs
- `dino.byName`: Returns a specific dinosaur by name
- `dino.create`: Creates a new dinosaur
- `examples.iterable`: A demonstration of tRPC's support for async iterables

The server is configured to listen on port 3000 and will handle all tRPC requests. While you can run the server now, you won't be able to access any of the routes and have it return data. Let's fix that!

## Set up the tRPC client

With our server ready, we can create a client that consumes our API with full type safety.
Create a `client/index.ts` file:

```ts
// client/index.ts
/**
 * This is the client-side code that uses the inferred types from the server
 */
import {
  createTRPCClient,
  splitLink,
  unstable_httpBatchStreamLink,
  unstable_httpSubscriptionLink,
} from "@trpc/client";
/**
 * We only import the `AppRouter` type from the server - this is not available at runtime
 */
import type { AppRouter } from "../server/index.ts";

// Initialize the tRPC client
const trpc = createTRPCClient<AppRouter>({
  links: [
    splitLink({
      condition: (op) => op.type === "subscription",
      true: unstable_httpSubscriptionLink({
        url: "http://localhost:3000",
      }),
      false: unstable_httpBatchStreamLink({
        url: "http://localhost:3000",
      }),
    }),
  ],
});

const dinos = await trpc.dino.list.query();
console.log("Dinos:", dinos);

const createdDino = await trpc.dino.create.mutate({
  name: "Denosaur",
  description:
    "A dinosaur that lives in the deno ecosystem. Eats Nodes for breakfast.",
});
console.log("Created dino:", createdDino);

const dino = await trpc.dino.byName.query("Denosaur");
console.log("Denosaur:", dino);

const iterable = await trpc.examples.iterable.query();

for await (const i of iterable) {
  console.log("Iterable:", i);
}
```

This client code demonstrates several key features of tRPC:

1. **Type inference from the server router**. The client automatically inherits all type definitions from the server through the `AppRouter` type parameter passed to `createTRPCClient<AppRouter>`. This means you get complete type support and compile-time type checking for all your API calls. If you modify a procedure on the server, TypeScript will immediately flag any incompatible client usage.
2. **Making queries and mutations**. The example demonstrates two types of API calls: queries (`list` and `byName`) used for fetching data without side effects, and mutations (`create`) used for operations that modify server-side state. The client automatically knows the input and output types for each procedure, providing type safety throughout the entire request cycle.
3.
**Working with async iterables**. The `examples.iterable` demonstrates tRPC's support for streaming data using async iterables. This feature is particularly useful for real-time updates or processing large datasets in chunks. Now, let's start our server to see it in action. In our `deno.json` config file, let's create a new property `tasks` with the following commands: ```json { "tasks": { "start": "deno -A server/index.ts", "client": "deno -A client/index.ts" } // Other properties in deno.json remain the same. } ``` We can list our available tasks with `deno task`: ```bash deno task Available tasks: - start deno -A server/index.ts - client deno -A client/index.ts ``` Now, we can start the server with `deno task start`. After that's running, we can run the client with `deno task client`. You should see an output like this: ```bash deno task client Dinos: [ { name: "Aardonyx", description: "An early stage in the evolution of sauropods." }, { name: "Abelisaurus", description: "Abel's lizard has been reconstructed from a single skull." }, { name: "Abrictosaurus", description: "An early relative of Heterodontosaurus." }, ... ] Created dino: { name: "Denosaur", description: "A dinosaur that lives in the deno ecosystem. Eats Nodes for breakfast." } Denosaur: { name: "Denosaur", description: "A dinosaur that lives in the deno ecosystem. Eats Nodes for breakfast." } Iterable: 0 Iterable: 1 Iterable: 2 ``` Success! Running the `./client/index.ts` shows how to create a tRPC client and use its JavaScript API to interact with the database. But how can we check if the tRPC client is inferring the right types from the database? Let's modify the code snippet below in `./client/index.ts` to pass a `number` instead of a `string` as the `description`: ```diff // ... const createdDino = await trpc.dino.create.mutate({ name: "Denosaur", description: - "A dinosaur that lives in the deno ecosystem. Eats Nodes for breakfast.", + 100, }); console.log("Created dino:", createdDino); // ... 
``` When we re-run the client: ```bash deno task client ... error: Uncaught (in promise) TRPCClientError: [ { "code": "invalid_type", "expected": "string", "received": "number", "path": [ "description" ], "message": "Expected string, received number" } ] at Function.from (file:///Users/andyjiang/Library/Caches/deno/npm/registry.npmjs.org/@trpc/client/11.0.0-rc.608/dist/TRPCClientError.mjs:35:20) at file:///Users/andyjiang/Library/Caches/deno/npm/registry.npmjs.org/@trpc/client/11.0.0-rc.608/dist/links/httpBatchStreamLink.mjs:118:56 at eventLoopTick (ext:core/01_core.js:175:7) ``` tRPC successfully threw an `invalid_type` error, since it was expecting a `string` instead of a `number`. ## What’s next? Now that you have a basic understanding of how to use tRPC with Deno, you could: 1. Build out an actual frontend using [Next.js](https://trpc.io/docs/client/nextjs) or [React](https://trpc.io/docs/client/react) 2. [Add authentication to your API using tRPC middleware](https://trpc.io/docs/server/middlewares#authorization) 3. Implement real-time features [using tRPC subscriptions](https://trpc.io/docs/server/subscriptions) 4. Add [input validation](https://trpc.io/docs/server/validators) for more complex data structures 5. Integrate with a proper database like [PostgreSQL](https://docs.deno.com/runtime/tutorials/connecting_to_databases/#postgres) or use an ORM like [Drizzle](https://docs.deno.com/examples/drizzle) or [Prisma](https://docs.deno.com/runtime/tutorials/how_to_with_npm/prisma/) 6. Deploy your application to [Deno Deploy](https://deno.com/deploy) or [any public cloud via Docker](https://docs.deno.com/runtime/tutorials/#deploying-deno-projects) 🦕 Happy type safety coding with Deno and tRPC! --- # Build a Vue.js App > A tutorial on building Vue.js applications with Deno. Learn how to set up a Vite project, implement component architecture, add routing, manage state, and create a full-stack TypeScript application. 
URL: https://docs.deno.com/examples/tutorials/vue

[Vue.js](https://vuejs.org/) is a progressive front-end JavaScript framework. It provides tools and features for creating dynamic and interactive user interfaces.

In this tutorial we'll build a simple Vue.js app with Vite and Deno. The app will display a list of dinosaurs. When you click on one, it'll take you to a dinosaur page with more details. You can see the [finished app on GitHub](https://github.com/denoland/tutorial-with-vue).

![The Vue.js app in action](./images/how-to/vue/vue.gif)

## Create a Vue.js app with Vite and Deno

We'll use [Vite](https://vitejs.dev/) to scaffold a basic Vue.js app. In your terminal, run the following command to create a new Vue.js app with Vite:

```shell
deno run -A npm:create-vite
```

When prompted, give your app a name and select `Vue` from the offered frameworks and `TypeScript` as a variant.

Once created, `cd` into your new project and run the following command to install dependencies:

```shell
deno install
```

Then, run the following command to serve your new Vue.js app:

```shell
deno task dev
```

Deno will run the `dev` task from the `package.json` file which will start the Vite server. Click the output link to localhost to see your app in the browser.

## Configure the formatter

`deno fmt` supports Vue files with the [`--unstable-component`](https://docs.deno.com/runtime/reference/cli/fmt/#formatting-options-unstable-component) flag. To use it, run this command:

```sh
deno fmt --unstable-component
```

To configure `deno fmt` to always format your Vue files, add this at the top level of your `deno.json` file:

```json
"unstable": ["fmt-component"]
```

## Add a backend

The next step is to add a backend API. We'll create a very simple API that returns information about dinosaurs.

In the root of your new vite project, create an `api` folder. In that folder, create a `main.ts` file, which will run the server, and a `data.json` file, which is where we'll put the hard-coded data.
Copy and paste [this json file](https://raw.githubusercontent.com/denoland/tutorial-with-vue/refs/heads/main/api/data.json) into `api/data.json`.

We're going to build out a simple API server with routes that return dinosaur information. We'll use the [`oak` middleware framework](https://jsr.io/@oak/oak) and the [`cors` middleware](https://jsr.io/@tajpouria/cors) to enable [CORS](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS).

Use the `deno add` command to add the required dependencies to your project:

```shell
deno add jsr:@oak/oak jsr:@tajpouria/cors
```

Next, update `api/main.ts` to import the required modules and create a new `Router` instance to define some routes:

```ts title="main.ts"
import { Application, Router } from "@oak/oak";
import { oakCors } from "@tajpouria/cors";
import data from "./data.json" with { type: "json" };

const router = new Router();
```

After this, in the same file, we'll define three routes. The first route at `/` will return the string `Welcome to dinosaur API!`, then we'll set up `/dinosaurs` to return all the dinosaurs, and finally `/dinosaurs/:dinosaur` to return a specific dinosaur based on the name in the URL:

```ts title="main.ts"
router
  .get("/", (context) => {
    context.response.body = "Welcome to dinosaur API!";
  })
  .get("/dinosaurs", (context) => {
    context.response.body = data;
  })
  .get("/dinosaurs/:dinosaur", (context) => {
    if (!context?.params?.dinosaur) {
      context.response.body = "No dinosaur name provided.";
      return; // stop here so the lookup below doesn't overwrite the body
    }

    const dinosaur = data.find((item) =>
      item.name.toLowerCase() === context.params.dinosaur.toLowerCase()
    );

    context.response.body = dinosaur ? dinosaur : "No dinosaur found.";
  });
```

Finally, at the bottom of the same file, create a new `Application` instance and attach the routes we just defined to the application using `app.use(router.routes())` and start the server listening on port 8000:

```ts title="main.ts"
const app = new Application();
app.use(oakCors());
app.use(router.routes());
app.use(router.allowedMethods());

await app.listen({ port: 8000 });
```

You can run the API server with `deno run --allow-env --allow-net api/main.ts`. We'll create a task to run this command and update the dev task to run both the Vue.js app and the API server.

In your `package.json` file, update the `scripts` field to include the following:

```jsonc
{
  "scripts": {
    "dev": "deno task dev:api & deno task dev:vite",
    "dev:api": "deno run --allow-env --allow-net api/main.ts",
    "dev:vite": "deno run -A npm:vite"
    // ...
  }
}
```

Now, if you run `deno task dev` and visit `localhost:8000` in your browser, you should see the text `Welcome to dinosaur API!`, and if you visit `localhost:8000/dinosaurs`, you should see a JSON response of all of the dinosaurs.

## Build the frontend

### The entrypoint and routing

In the `src` directory, you'll find a `main.ts` file. This is the entry point for the Vue.js app. Our app will have multiple routes, so we'll need a router to do our client-side routing. We'll use the official [Vue Router](https://router.vuejs.org/) for this.

Update `src/main.ts` to import and use the router:

```ts
import { createApp } from "vue";
import router from "./router/index.ts";

import "./style.css";
import App from "./App.vue";

createApp(App)
  .use(router)
  .mount("#app");
```

Add the Vue Router module to the project with `deno add`:

```shell
deno add npm:vue-router
```

Next, create a `router` directory in the `src` directory.
In it, create an `index.ts` file with the following content:

```ts title="router/index.ts"
import { createRouter, createWebHistory } from "vue-router";
import HomePage from "../components/HomePage.vue";
import Dinosaur from "../components/Dinosaur.vue";

export default createRouter({
  history: createWebHistory("/"),
  routes: [
    {
      path: "/",
      name: "Home",
      component: HomePage,
    },
    {
      path: "/:dinosaur",
      name: "Dinosaur",
      component: Dinosaur,
      props: true,
    },
  ],
});
```

This will set up a router with two routes: `/` and `/:dinosaur`. The `HomePage` component will be rendered at `/` and the `Dinosaur` component will be rendered at `/:dinosaur`.

Finally, you can delete all of the code in the `src/App.vue` file and update it to include only the `<RouterView />` component:

```html title="App.vue"
<template>
  <RouterView />
</template>
```

### The components

Vue.js splits the frontend UI into components. Each component is a reusable piece of code. We'll create three components: one for the home page, one for the list of dinosaurs, and one for an individual dinosaur.

Each component file is split into three parts: the `<script>`, the `<template>`, and the `<style>`. In the `src/components` directory, create a `Dinosaurs.vue` file that fetches the dinosaur list from the API and renders it:

```html title="Dinosaurs.vue"
<!-- Note: the original markup for this component was garbled in this copy of
     the page; this is a minimal reconstruction based on the surrounding text. -->
<script setup lang="ts">
import { RouterLink } from "vue-router";

const res = await fetch("http://localhost:8000/dinosaurs");
const dinosaurs = await res.json() as { name: string; description: string }[];
</script>

<template>
  <span v-for="dinosaur in dinosaurs" :key="dinosaur.name">
    <RouterLink :to="{ name: 'Dinosaur', params: { dinosaur: dinosaur.name } }">
      {{ dinosaur.name }}
    </RouterLink>
  </span>
</template>
```

This code uses the Vue.js [v-for](https://vuejs.org/api/built-in-directives.html#v-for) directive to iterate over the `dinosaurs` array and render each dinosaur as a `RouterLink` component. The `:to` attribute of the `RouterLink` component specifies the route to navigate to when the link is clicked, and the `:key` attribute is used to uniquely identify each dinosaur.

#### The Homepage component

The homepage will contain a heading and then it will render the `Dinosaurs` component. Add the following code to the `HomePage.vue` file:

```html title="HomePage.vue"
<!-- Note: reconstructed markup; the original was garbled in this copy of the page. -->
<script setup lang="ts">
import Dinosaurs from "./Dinosaurs.vue";
</script>

<template>
  <h1>Welcome to the Dinosaur App! 🦕</h1>
  <Suspense>
    <Dinosaurs />
  </Suspense>
</template>
```

Because the `Dinosaurs` component fetches data asynchronously, use the [`Suspense` component](https://vuejs.org/guide/built-ins/suspense.html) to handle the loading state.

#### The Dinosaur component

The `Dinosaur` component will display the name and description of a specific dinosaur and a link to go back to the full list.
First, we'll set up some types for the data we'll be fetching. Create a `types.ts` file in the `src` directory and add the following code (the types are exported so the `Dinosaur` component can import them):

```ts title="types.ts"
export type Dinosaur = {
  name: string;
  description: string;
};

export type ComponentData = {
  dinosaurDetails: null | Dinosaur;
};
```

Then update the `Dinosaur.vue` file:

```html title="Dinosaur.vue"
<!-- Note: the original markup for this component was garbled in this copy of
     the page; this is a minimal reconstruction based on the surrounding text. -->
<script lang="ts">
import type { ComponentData } from "../types.ts";

export default {
  props: ["dinosaur"],
  data(): ComponentData {
    return {
      dinosaurDetails: null,
    };
  },
  async mounted() {
    const res = await fetch(
      `http://localhost:8000/dinosaurs/${this.dinosaur}`,
    );
    this.dinosaurDetails = await res.json();
  },
};
</script>

<template>
  <h1>{{ dinosaurDetails?.name }}</h1>
  <p>{{ dinosaurDetails?.description }}</p>
  <RouterLink to="/">Back to all dinosaurs</RouterLink>
</template>
```

This code uses the `props` option to define a prop named `dinosaur` that will be passed to the component. The `mounted` lifecycle hook is used to fetch the details of the dinosaur based on the `dinosaur` prop and store them in the `dinosaurDetails` data property. This data is then rendered in the template.

## Run the app

Now that we've set up the frontend and backend, we can run the app. In your terminal, run the following command:

```shell
deno task dev
```

Visit the output localhost link in your browser to see the app. Click on a dinosaur to see more details!

![The vue app in action](./images/how-to/vue/vue.gif)

🦕 Now that you can run a Vue app in Deno with Vite you're ready to build real world applications! If you'd like to expand upon this demo you could consider building out a backend server to serve the static app once built, then you'll be able to [deploy your dinosaur app to the cloud](https://docs.deno.com/deploy/manual/).

---

# Testing web apps

> A comprehensive guide to testing web applications with Deno

URL: https://docs.deno.com/examples/tutorials/web_testing

Deno is a JavaScript runtime that operates outside of the browser and, as such, you cannot directly manipulate the Document Object Model in Deno as you would in a browser. However, you can use a library like [deno-dom](https://jsr.io/@b-fuze/deno-dom), [JSDom](https://github.com/jsdom/jsdom) or [LinkeDOM](https://www.npmjs.com/package/linkedom) to work with the DOM. This tutorial will guide you through how to effectively test your web applications using Deno.
## Testing UI components and DOM manipulation

Let's say you have a website that shows a user's profile. You can set up a test function to verify that the DOM element creation works correctly. This code sets up a basic card element, then tests whether the created DOM structure matches what was expected.

```ts
import { assertEquals } from "jsr:@std/assert";
import { DOMParser, Element } from "jsr:@b-fuze/deno-dom";

// Component or function that manipulates the DOM
function createUserCard(user: { name: string; email: string }): Element {
  // Minimal document to create elements from
  const doc = new DOMParser().parseFromString(
    "<!DOCTYPE html><html><body></body></html>",
    "text/html",
  )!;
  const card = doc.createElement("div");
  card.className = "user-card";

  const name = doc.createElement("h2");
  name.textContent = user.name;
  card.appendChild(name);

  const email = doc.createElement("p");
  email.textContent = user.email;
  email.className = "email";
  card.appendChild(email);

  return card;
}

Deno.test("DOM manipulation test", () => {
  // Create a test user
  const testUser = { name: "Test User", email: "test@example.com" };

  // Call the function
  const card = createUserCard(testUser);

  // Assert the DOM structure is correct
  assertEquals(card.className, "user-card");
  assertEquals(card.children.length, 2);
  assertEquals(card.querySelector("h2")?.textContent, "Test User");
  assertEquals(card.querySelector(".email")?.textContent, "test@example.com");
});
```

## Testing Event Handling

Web applications often handle user interactions through events. Here's how to test event handlers. This code sets up a button that tracks its active/inactive state and updates its appearance when clicked. The accompanying test verifies the toggle functionality by creating a button, checking its initial state, simulating clicks, and asserting that the button correctly updates its state after each interaction:

```ts
import { DOMParser } from "jsr:@b-fuze/deno-dom";
import { assertEquals } from "jsr:@std/assert";

// Component with event handling
function createToggleButton(text: string) {
  // Minimal document to create elements from
  const doc = new DOMParser().parseFromString(
    "<!DOCTYPE html><html><body></body></html>",
    "text/html",
  )!;
  const button = doc.createElement("button");
  button.textContent = text;
  button.dataset.active = "false";

  button.addEventListener("click", () => {
    const isActive = button.dataset.active === "true";
    button.dataset.active = isActive ? "false" : "true";
    button.classList.toggle("active", !isActive);
  });

  return button;
}

Deno.test("event handling test", () => {
  // Create button
  const button = createToggleButton("Toggle Me");

  // Initial state
  assertEquals(button.dataset.active, "false");
  assertEquals(button.classList.contains("active"), false);

  // Simulate click event
  button.dispatchEvent(new Event("click"));

  // Test after first click
  assertEquals(button.dataset.active, "true");
  assertEquals(button.classList.contains("active"), true);

  // Simulate another click
  button.dispatchEvent(new Event("click"));

  // Test after second click
  assertEquals(button.dataset.active, "false");
  assertEquals(button.classList.contains("active"), false);
});
```

## Testing Fetch Requests

Testing components that make network requests requires mocking the fetch API. In the below example we will [mock](/examples/mocking_tutorial/) the `fetch` API to test a function that retrieves user data from an external API.
The test creates a spy function that returns predefined responses based on the requested URL, allowing you to test both successful requests and error handling without making actual network calls:

```ts
import { assertSpyCalls, spy } from "jsr:@std/testing/mock";
import { assertEquals } from "jsr:@std/assert";

// Component that fetches data
async function fetchUserData(
  userId: string,
): Promise<{ name: string; email: string }> {
  const response = await fetch(`https://api.example.com/users/${userId}`);
  if (!response.ok) {
    throw new Error(`Failed to fetch user: ${response.status}`);
  }
  return await response.json();
}

Deno.test("fetch request test", async () => {
  // Mock fetch response
  const originalFetch = globalThis.fetch;
  const mockFetch = spy(async (input: RequestInfo | URL): Promise<Response> => {
    const url = input.toString();
    if (url === "https://api.example.com/users/123") {
      return new Response(
        JSON.stringify({ name: "John Doe", email: "john@example.com" }),
        { status: 200, headers: { "Content-Type": "application/json" } },
      );
    }
    return new Response("Not found", { status: 404 });
  });

  // Replace global fetch with mock
  globalThis.fetch = mockFetch;

  try {
    // Call the function with a valid ID
    const userData = await fetchUserData("123");

    // Assert the results
    assertEquals(userData, { name: "John Doe", email: "john@example.com" });
    assertSpyCalls(mockFetch, 1);

    // Test error handling (optional)
    try {
      await fetchUserData("invalid");
      throw new Error("Should have thrown an error for invalid ID");
    } catch (error) {
      assertEquals((error as Error).message, "Failed to fetch user: 404");
    }
    assertSpyCalls(mockFetch, 2);
  } finally {
    // Restore the original fetch
    globalThis.fetch = originalFetch;
  }
});
```

## Using Testing Steps to set up and teardown

For complex tests, you can use steps to organize test logic into discrete sections, making tests more readable and maintainable. Steps also enable better isolation between different parts of your test.
Using step naming you can implement a setup and teardown of the test conditions.

```ts
import { DOMParser } from "jsr:@b-fuze/deno-dom";
import { assertEquals, assertExists } from "jsr:@std/assert";

Deno.test("complex web component test", async (t) => {
  // Minimal document to create elements from
  const doc = new DOMParser().parseFromString(
    "<!DOCTYPE html><html><body></body></html>",
    "text/html",
  )!;
  const body = doc.createElement("body");
  const container = doc.createElement("div");
  body.appendChild(container);

  await t.step("initial rendering", () => {
    container.innerHTML = `<div id="app"></div>`;
    const app = container.querySelector("#app");
    assertExists(app);
    assertEquals(app.children.length, 0);
  });

  await t.step("adding content", () => {
    const app = container.querySelector("#app");
    assertExists(app);

    const header = doc.createElement("header");
    header.textContent = "My App";
    app.appendChild(header);

    assertEquals(app.children.length, 1);
    assertEquals(app.firstElementChild?.tagName.toLowerCase(), "header");
  });

  await t.step("responding to user input", () => {
    const app = container.querySelector("#app");
    assertExists(app);

    const button = doc.createElement("button");
    button.textContent = "Click me";
    button.id = "test-button";
    app.appendChild(button);

    let clickCount = 0;
    button.addEventListener("click", () => clickCount++);

    button.dispatchEvent(new Event("click"));
    button.dispatchEvent(new Event("click"));

    assertEquals(clickCount, 2);
  });

  await t.step("removing content", () => {
    const app = container.querySelector("#app");
    assertExists(app);

    const header = app.querySelector("header");
    assertExists(header);
    header.remove();

    assertEquals(app.children.length, 1); // Only the button should remain
  });
});
```

## Best Practices for Web Testing in Deno

1. Maintain isolation - each test should be self-contained and not depend on other tests.
2. Use names to show intent - descriptive test names make it clear what is being tested and give more readable output in the console.
3. Clean up after your tests - remove any DOM elements created during tests to prevent test pollution.
4. Mock external services (such as APIs) to make tests faster and more reliable.
5. Organize tests into logical steps using `t.step()` for complex components.
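Practices 3 and 4 come down to the same save-replace-restore discipline used in the fetch example above. A minimal, framework-agnostic sketch of that pattern (the `api` object and `withStub` helper here are hypothetical, for illustration only):

```javascript
// Generic save/replace/restore pattern: swap a real dependency for a stub
// for the duration of a test, and always restore it in `finally`.
const api = {
  getUser: (_id) => {
    throw new Error("real network call, not available in tests");
  },
};

function withStub(obj, key, stub, fn) {
  const original = obj[key];
  obj[key] = stub;
  try {
    return fn();
  } finally {
    obj[key] = original; // restored even if the test body throws
  }
}

const result = withStub(
  api,
  "getUser",
  () => ({ name: "Test User" }),
  () => api.getUser("123").name,
);
console.log(result); // "Test User"
```

Because the restore happens in `finally`, a failing assertion inside the test body cannot leak the stub into later tests.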
## Running Your Tests Execute your tests with the Deno test command: ```bash deno test ``` For web tests, you might need additional permissions: ```bash deno test --allow-net --allow-read --allow-env ``` 🦕 By following the patterns in this tutorial, you can write comprehensive tests for your web applications that verify both functionality and user experience. Remember that effective testing leads to more robust applications and helps catch issues before they reach your users. --- # Building a word finder app with Deno > A tutorial on creating a word search application with Deno. Learn how to build a web server, implement pattern matching, handle HTTP requests, and create an interactive web interface using Oak framework. URL: https://docs.deno.com/examples/tutorials/word_finder ## Getting Started In this tutorial we'll create a simple Word Finder web application using Deno. No prior knowledge of Deno is required. ## Introduction Our Word Finder application will take a pattern string provided by the user and return all words in the English dictionary that match the pattern. The pattern can include alphabetical characters as well as `_` and `?`. The `?` can stand for any letter that isn't present in the pattern. `_` can stand for any letter. For example, the pattern `c?t` matches "cat" and "cut". The pattern `go?d` matches the words "goad" and "gold" (but not "good"). ![Word finder UI](./images/word_finder.png) ## Building the View The function below renders the HTML that creates the simple UI displayed above. You can specify a pattern and list of words to customize the HTML content. If a pattern is specified then it will show up in the search text box. If the word list is specified, then a bulleted list of words will be rendered.

```jsx title="render.js"
export function renderHtml(pattern, words) {
  let searchResultsContent = "";
  if (words.length > 0) {
    let wordList = "";
    for (const word of words) {
      wordList += `<li>${word}</li>`;
    }
    searchResultsContent = `
      <p>Words found: ${words.length}</p>
      <ul>
        ${wordList}
      </ul>`;
  }
  return `
    <html>
      <head>
        <title>Deno Word Finder</title>
      </head>
      <body>
        <h1>Deno Word Finder</h1>
        <form action="/api/search">
          <input type="text" name="search-text" value="${pattern}" />
          <input type="submit" value="Search" />
        </form>
        ${searchResultsContent}
        <h2>Instructions</h2>
        <p>
          Enter a word using _ and ? as needed for unknown characters. Using ?
          means to include letters that aren't already used (you can think of it
          as a "Wheel of Fortune" placeholder). Using _ will find words that
          contain any character (whether it's currently "revealed" or not).
        </p>
        <p>For example, d__d would return:</p>
        <ul>
          <li>dand</li>
          <li>daud</li>
          <li>dead</li>
          <li>deed</li>
          <li>dird</li>
          <li>dodd</li>
          <li>dowd</li>
          <li>duad</li>
          <li>dyad</li>
        </ul>
        <p>And go?d would return:</p>
        <ul>
          <li>goad</li>
          <li>gold</li>
        </ul>
      </body>
    </html>
    `; } ``` ## Searching the Dictionary We also need a simple search function which scans the dictionary and returns all words that match the specified pattern. The function below takes a pattern and dictionary and then returns all matched words. ```jsx title="search.js" export function search(pattern, dictionary) { // Create regex pattern that excludes characters already present in word let excludeRegex = ""; for (let i = 0; i < pattern.length; i++) { const c = pattern[i]; if (c != "?" && c != "_") { excludeRegex += "^" + c; } } excludeRegex = "[" + excludeRegex + "]"; // Let question marks only match characters not already present in word let searchPattern = pattern.replace(/\?/g, excludeRegex); // Let underscores match anything searchPattern = "^" + searchPattern.replace(/\_/g, "[a-z]") + "$"; // Find all words in dictionary that match pattern let matches = []; for (let i = 0; i < dictionary.length; i++) { const word = dictionary[i]; if (word.match(new RegExp(searchPattern))) { matches.push(word); } } return matches; } ``` ## Running a Deno Server [Oak](https://jsr.io/@oak/oak) is a framework that lets you easily setup a server in Deno (analogous to JavaScript's Express) and we'll be using it to host our application. Our server will use our search function to populate our HTML template with data and then return the customized HTML back to the viewer. We can conveniently rely on the `/usr/share/dict/words` file as our dictionary which is a standard file present on most Unix-like operating systems. 
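Before wiring up the server, the pattern logic in `search.js` can be sanity-checked on its own. Below is a condensed, self-contained sketch of the same approach; the `patternToRegExp` helper name and the tiny in-memory dictionary are illustrative, not part of the tutorial's code:

```typescript
// Condensed re-implementation of the matching logic in search.js,
// runnable without the /usr/share/dict/words file.
function patternToRegExp(pattern: string): RegExp {
  // Letters already placed in the pattern are excluded from "?" matches.
  let exclude = "";
  for (const c of pattern) {
    if (c !== "?" && c !== "_") exclude += c;
  }
  const excludeClass = "[^" + exclude + "]";
  const body = pattern
    .replace(/\?/g, excludeClass) // "?": any letter not already in the pattern
    .replace(/_/g, "[a-z]"); // "_": any letter at all
  return new RegExp("^" + body + "$");
}

const dictionary = ["goad", "gold", "good", "grid", "cat", "cut"];
const matches = dictionary.filter((word) => patternToRegExp("go?d").test(word));
console.log(matches); // → ["goad", "gold"]
```

The negated character class is what makes `?` skip letters already present in the pattern, which is why `go?d` matches "goad" and "gold" but not "good".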
```jsx title="server.js" import { Application, Router } from "jsr:@oak/oak"; import { search } from "./search.js"; import { renderHtml } from "./render.js"; const dictionary = (await Deno.readTextFile("/usr/share/dict/words")).split( "\n", ); const app = new Application(); const port = 8080; const router = new Router(); router.get("/", async (ctx) => { ctx.response.body = renderHtml("", []); }); router.get("/api/search", async (ctx) => { const pattern = ctx.request.url.searchParams.get("search-text"); ctx.response.body = renderHtml(pattern, search(pattern, dictionary)); }); app.use(router.routes()); app.use(router.allowedMethods()); console.log("Listening at http://localhost:" + port); await app.listen({ port }); ``` We can start our server with the following command. Note we need to explicitly grant access to the file system and network because Deno is secure by default. ```bash deno run --allow-read --allow-net server.js ``` Now if you visit [http://localhost:8080](http://localhost:8080/) you should be able to view the Word Finder app. ## Example Code You can find the entire example code [here](https://github.com/awelm/deno-word-finder). --- # All-in-one tooling > Learn about Deno's built-in developer tools. Watch how to use the integrated formatter, linter, and test runner to improve code quality without additional configuration or third-party dependencies. URL: https://docs.deno.com/examples/videos/all-in-one_tooling ## Video description In Node.js, before we can get started working on our project, we have to go through a configuration step for things like linting, formatting, and testing. Deno saves us a ton of time by including these tools natively. Let's take a look at what's included with these built-in CLI tools. 
## Transcript and code Here we have a function called sing: ```typescript title="sing.ts" function sing(phrase: string, times: number): string { return Array(times).fill(phrase).join(" "); } ``` Now let's run the formatter: ```shell deno fmt ``` The formatter automatically formats your code to follow Deno's rules and conventions. Let's run it to clean up any formatting issues. Deno even formats code snippets in markdown files. So anything that is enclosed in triple backticks will be formatted when you run this command as well. The `deno lint` command is used to analyze your code for potential issues. It's similar to ESLint but built into Deno. ```shell deno lint ``` This will lint all of the JavaScript and TypeScript files in the current directory and in subdirectories. You can also lint specific files by passing their names: ```shell # lint specific files deno lint myfile1.ts myfile2.ts ``` You can run it on specific directories: ```shell deno lint src/ ``` And if you want to skip linting certain files, you can add a comment at the top of those files, and Deno will know to skip them: ```javascript // deno-lint-ignore-file // deno-lint-ignore-file -- reason for ignoring ``` Deno also has some CLI commands for testing. In our directory here we have a test file. Its name combines the name of the function under test with a `_test` suffix. ```typescript title="sing_test.ts" import { sing } from "./sing.ts"; import { assertEquals } from "jsr:@std/assert"; Deno.test("sing repeats a phrase", () => { const result = sing("La", 3); assertEquals(result, "La La La"); }); ``` Now, we'll run our tests using the `deno test` command. Deno automatically discovers and runs test files. ```shell deno test ``` Deno decides which files should be considered test files by their names: it looks for files ending in `_test.ts`, `_test.js`, `_test.tsx`, `_test.jsx`, `.test.ts`, `.test.js`, `.test.tsx`, or `.test.jsx`. You can also run a specific test file directly: ```shell deno test encourage.test.js ``` Or you can pass a specific directory path and Deno will search for test files in there.
```sh deno test ./tests/ ``` You can even check code coverage. By default, when you run `deno test --coverage` a coverage profile will be generated in the `coverage/` directory in the current working directory. ```shell deno test --coverage ``` From there you can run `deno coverage` to print a coverage report to standard output: ```shell deno coverage ``` As you can see, Deno's built-in tools are pretty cool. We don't have to spend a whole day configuring these settings before we can start working on our project. And we can format, lint, and test code without the need for third-party dependencies. --- # Compatibility with Node & npm URL: https://docs.deno.com/examples/videos/backward_compat_with_node_npm ## Video description Explore how to integrate Deno into your existing Node.js projects seamlessly. In this video, we'll use Node.js standard libraries and npm modules with simple prefixes, maintain compatibility with CommonJS projects, and make use of Deno's features like dependency installation, formatting, and linting. Transition your Node.js projects effortlessly without the need for major rewrites. ## Transcript and code Making the choice to use Deno does not mean that we can't take advantage of the Node.js ecosystem. It also doesn't mean that we have to rebuild all of our Node.js projects from scratch. Using the features of the standard library, or the npm ecosystem, is as simple as adding a prefix. If you want to learn more about the Node APIs you can check out [the Node API documentation](/api/node/). Here's an example of using Node's file system module with the promises API: ```typescript title="main.ts" async function readFile() { try { const data = await fs.readFile("example.txt", "utf8"); console.log(data); } catch (error) { console.error("Error reading file", error); } } readFile(); ``` We read the file and we console log the data.
In Node.js, we would import `fs` from `fs/promises`, e.g.: ```typescript import fs from "fs/promises"; ``` In Deno, we just put the `node:` prefix in front of the import, e.g.: ```typescript import fs from "node:fs/promises"; ``` Then we run `deno main.ts`, and Deno prompts us to opt in to read access. If we allow [read access](/runtime/fundamentals/security/), it's going to read from the file. Updating any imports in our apps to use this `node:` specifier will enable any code using Node.js built-ins. Deno even supports CommonJS projects, which feels above and beyond. I think that's pretty cool! What if we wanted to use an npm module, from say, Sentry, in our application? We're going to use the **npm colon specifier** this time: ```typescript title="main.ts" import * as Sentry from "npm:@sentry/node"; Sentry.init({ dsn: "https://example.com" }); function main() { try { throw new Error("This is an error"); } catch (error) { Sentry.captureException(error); console.error("Error caught", error); } } main(); ``` We'll run the command: ```sh deno run main.ts ``` Which will ask for access to our home directory, and other places, and there we go! We are capturing this error as well! This backwards compatibility is pretty amazing. Are you working on an existing Node.js project? Well, with Deno 2 you can do that too. You can use `deno install` to install dependencies, `deno fmt` for formatting, and `deno lint` for linting; we can even run `deno lint --fix` to fix any linting problems automatically. And yes, you can also run Deno directly, so for any of the scripts that are part of a `package.json`, just run `deno task` with the name of the script, e.g.: ```sh deno task dev ``` We can use all of the code that we've written before without having to change it or stretch it too much. Deno just makes it work! --- # Browser APIs in Deno > Explore web standard APIs in Deno's server-side environment.
Learn how to use fetch, streams, text encoders, and other browser-compatible features while building modern applications with familiar web APIs. URL: https://docs.deno.com/examples/videos/browser_apis_in_deno ## Video description Deno wants to give developers the most browser-like programming environment possible. Deno uses web standard APIs, so if you're familiar with building for the web, then you're familiar with Deno. If not, when you learn how to use Deno, you're also learning how to build for the web. If you take a look at the docs, it gives you a good sense of what's available, so we got things like Canvas and internationalization and messaging and storage and streams, Temporal, WebSockets. All of those things that we like to use on the web, we're going to find them built in to Deno. ## Transcript and code Let's take a look at `fetch` first. This works like you might think. We're going to take a response from fetching the API. Then we're going to take that response and convert it to JSON as a new variable and console.log it. Now, if we take a look at this in the terminal, we'll pass `--allow-net` so that we can opt in to network access, running that fetch immediately. ```javascript title="main.ts" const response = await fetch("https://snowtooth-hotel-api.fly.dev"); const data = await response.json(); console.log(data); ``` And we're done here. All the data comes back like we would expect. Next, let's look at streams. We'll use the standard library's streams package, which we can add with: ```shell deno add jsr:@std/streams ``` So let me show you what I mean by this. We're going to keep that fetch. We're going to say if that response body value exists, we're going to create a new variable called transformed stream, and we'll set that equal to response dot body. And here we're going to use the function called pipe through. `pipeThrough` is a method in JavaScript that's going to allow us to take the output of the readable stream and pass it through to modify the stream's data.
The first thing we're going to do is decode the byte stream into a text stream. So we'll say new text, decoder stream. Then we'll chain on another one of these functions pipeThrough. So this time we're going to split the text stream into lines. So we'll have different lines coming back from our data. Now the text line stream is actually coming from a library that we need to include. ```javascript import { TextLineStream } from "@std/streams"; import { toTransformStream } from "@std/streams/to-transform-stream"; const response = await fetch("https://example.com/data.txt"); // Ensure the response body exists if (response.body) { // Create a stream reader that processes the response body line by line const transformedStream = response.body // Decode the byte stream into a text stream .pipeThrough(new TextDecoderStream()) // Split the text stream into lines .pipeThrough(new TextLineStream()) // Get a reader to read the lines //.getReader(); .pipeThrough(toTransformStream(async function* (src) { for await (const chunk of src) { if (chunk.trim().length === 0) { continue; } console.log(chunk); yield chunk; } })); // Create a reader to consume the transformed stream const reader = transformedStream.getReader(); // Read and log each line of text from the stream while (true) { const { value, done } = await reader.read(); if (done) break; console.log(value); // Log each parsed JSON object } } ``` ## Setting Up Configuration So we're going to say `deno add jsr@std/streams`. That will create our `deno.json` configuration file over here. There will be another video to dig into this in a little more depth, but just know for now that this is including any imports that are part of our project. So the transform stream is coming together, but there's a few more steps. ## Using the Transform Stream The next step is we use pipeThrough again. 
Now this time we're going to use another function to transform stream, and this is going to come from standard streams and specifically the function `toTransformStream`. Now this time we're going to pass in here an asynchronous generator. We know that it's a generator because we use that asterisk there and the body of this function is a loop, and here we're going to say const chunk, so the little blob of data that we're dealing with, chunk of source, which is the value that's passed in there. We're going to say `console.log(chunk)`, and we're also going to yield the chunk here. Okay, so what is this `console.log` doing for us? Let's go ahead and run `deno --allow-net main.ts`. This is showing us that this is the top line of our HTML document. So we actually need a way to iterate through this, and we're going to do this by creating a reader to consume this transformed stream. So let's get rid of our console log here. Here we're going to create a value called reader that's going to be set equal to `transformedStream.getReader()`. Now from here, what we can do is create a little while loop here. So while that value is true. We want to destructure `{value, done}` from `await reader.read()`. So again, we can call the `.read()` method on that reader. Then we're going to say if `done` is true, then we want to break out of the loop. Otherwise, we want to `console.log(value)`. Nice. So now we're going to see our HTML here printed line by line in our console.  All right, so that is a quick example of using our text line stream. We can use it in combination with fetch. And if you want to learn more about this API, you can check out the documentation here. Deno offers us a truly browser-like environment for using things like fetch, Web Workers, and much, much more. Deno has made it really smooth to use these web-standard APIs in a way that feels familiar and friendly. 
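The `TextLineStream` used above comes from `@std/streams`, but the idea behind it is plain web-standard streams. As a rough sketch (the `lineSplitter` helper below is an illustrative stand-in, not the real `TextLineStream` implementation), a line splitter is just a `TransformStream` that buffers text and emits complete lines:

```typescript
// A minimal line splitter built only on web-standard stream APIs.
function lineSplitter(): TransformStream<string, string> {
  let buffer = "";
  return new TransformStream<string, string>({
    transform(chunk, controller) {
      buffer += chunk;
      const parts = buffer.split("\n");
      buffer = parts.pop() ?? ""; // keep the trailing partial line buffered
      for (const line of parts) controller.enqueue(line);
    },
    flush(controller) {
      // Emit whatever is left once the source stream ends.
      if (buffer.length > 0) controller.enqueue(buffer);
    },
  });
}

// Simulate a byte stream like `response.body`, then decode and split it.
const bytes = new TextEncoder().encode("alpha\nbeta\ngamma");
const source = new Response(bytes).body!;

const lines: string[] = [];
const reader = source
  .pipeThrough(new TextDecoderStream())
  .pipeThrough(lineSplitter())
  .getReader();

while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  lines.push(value);
}
console.log(lines); // → ["alpha", "beta", "gamma"]
```

Because these are web-standard APIs, the same sketch runs in Deno, modern browsers, and recent Node.js versions.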
--- # Build an API server with TypeScript > A guide to creating a RESTful API server using Hono and TypeScript in Deno. Watch how to implement CRUD operations, handle routing, manage data persistence, and build a production-ready backend service. URL: https://docs.deno.com/examples/videos/build_api_server_ts ## Video description Use the lightweight Hono framework (spiritual successor to Express) to build a RESTful API server that supports CRUD operations with a database. ## Transcript and code If you’ve worked on a Node project in the past, you might have used Express to set up a web server or to host an API. Let’s take a look at how we might do something similar by using Hono, a small, simple framework that we can use with any runtime, but we’re going to use it with Deno. ### Basic Hono Setup We’ll add Hono to our project with [JSR](https://jsr.io): ```shell deno add jsr:@hono/hono ``` That will then be added to the deno.json file. Then in our main file, we’ll create the basic Hono setup. ```ts import { Hono } from "@hono/hono"; const app = new Hono(); app.get("/", (c) => { return c.text("Hello from the Trees!"); }); Deno.serve(app.fetch); ``` Let’s run that `deno run --allow-net main.ts` and we’ll see it in the browser at `localhost:8000`. ## CRUD Operations Now that we’ve set up the simple server with Hono, we can start to build out our database. We’re going to use localStorage for this, but keep in mind that you can use any persistent data storage with Deno: Postgres, SQLite, wherever you like to store your data. Let’s start by creating a container for some data.
We’ll start with an interface that describes a tree type: ```ts interface Tree { id: string; species: string; age: number; location: string; } ``` Then we’ll create some data: ```ts const oak: Tree = { id: "3", species: "oak", age: 3, location: "Jim's Park", }; ``` Then we’re going to create a few helper functions that will help us interact with localStorage: ```ts const setItem = (key: string, value: Tree) => { localStorage.setItem(key, JSON.stringify(value)); }; const getItem = (key: string): Tree | null => { const item = localStorage.getItem(key); return item ? JSON.parse(item) : null; }; ``` Now let’s use them: ```ts setItem(`trees_${oak.id}`, oak); const newTree = getItem(`trees_${oak.id}`); console.log(newTree); ``` ```shell deno --allow-net main.ts ``` - `setItem` is adding the tree - You can also use `setItem` to update the record -- if the key already exists the value will be updated ```ts const oak: Tree = { id: "3", species: "oak", age: 4, location: "Jim's Park", }; localStorage.setItem(`trees_${oak.id}`, JSON.stringify(oak)); ``` OK, so now that we understand how to work with these database methods, let’s use Hono’s routing to create some REST API routes: ```ts app.post("/trees", async (c) => { const { id, species, age, location } = await c.req.json(); const tree: Tree = { id, species, age, location }; setItem(`trees_${id}`, tree); return c.json({ message: `We just added a ${species} tree!`, }); }); ``` To test this out we’ll send a curl request: ```shell curl -X POST http://localhost:8000/trees \ -H "Content-Type: application/json" \ -d '{"id": "2", "species": "Willow", "age": 100, "location": "Juniper Park"}' ``` To prove that we created that tree, let’s get the data by its ID: ```ts app.get("/trees/:id", (c) => { const id = c.req.param("id"); const tree = getItem(`trees_${id}`); if (!tree) { return c.json({ message: "Tree not found" }, 404); } return c.json(tree); }); ``` To test that, let’s run a curl request for the data
```shell curl http://localhost:8000/trees/1 ``` Or you can go to it in the browser: `http://localhost:8000/trees/1` We can update a tree, of course. Kind of like before, but we’ll create a route for that: ```ts app.put("/trees/:id", async (c) => { const id = c.req.param("id"); const { species, age, location } = await c.req.json(); const updatedTree: Tree = { id, species, age, location }; setItem(`trees_${id}`, updatedTree); return c.json({ message: `Tree has relocated to ${location}!`, }); }); ``` And we’ll change the location because we’re going to PUT this tree somewhere else: ```shell curl -X PUT http://localhost:8000/trees/1 \ -H "Content-Type: application/json" \ -d '{"species": "Oak", "age": 8, "location": "Theft Park"}' ``` Finally, if we want to delete a tree, we can do so with Hono’s `delete` routing: ```ts const deleteItem = (key: string) => { localStorage.removeItem(key); }; app.delete("/trees/:id", (c) => { const id = c.req.param("id"); deleteItem(`trees_${id}`); return c.json({ message: `Tree ${id} has been cut down!`, }); }); ``` We’ve used Deno in combination with Hono to build a little REST API for our tree data. If we wanted to deploy this, we could do so with zero config on [Deno Deploy](https://deno.com/deploy). You can also deploy it to any cloud VPS like AWS, GCP, or Digital Ocean with the [official Docker image](https://github.com/denoland/deno_docker). ## Complete code sample ```ts import { Hono } from "@hono/hono"; const app = new Hono(); interface Tree { id: string; species: string; age: number; location: string; } const setItem = (key: string, value: Tree) => { localStorage.setItem(key, JSON.stringify(value)); }; const getItem = (key: string): Tree | null => { const item = localStorage.getItem(key); return item ?
JSON.parse(item) : null; }; const deleteItem = (key: string) => { localStorage.removeItem(key); }; const oak: Tree = { id: "3", species: "oak", age: 3, location: "Jim's Park", }; setItem(`trees_${oak.id}`, oak); const newTree = getItem(`trees_${oak.id}`); console.log(newTree); app.get("/", (c) => { return c.text("Hello from the Trees!"); }); app.post("/trees", async (c) => { const { id, species, age, location } = await c.req.json(); const tree: Tree = { id, species, age, location }; setItem(`trees_${id}`, tree); return c.json({ message: `We just added a ${species} tree!`, }); }); app.get("/trees/:id", (c) => { const id = c.req.param("id"); const tree = getItem(`trees_${id}`); if (!tree) { return c.json({ message: "Tree not found" }, 404); } return c.json(tree); }); app.put("/trees/:id", async (c) => { const id = c.req.param("id"); const { species, age, location } = await c.req.json(); const updatedTree: Tree = { id, species, age, location }; setItem(`trees_${id}`, updatedTree); return c.json({ message: `Tree has relocated to ${location}!`, }); }); app.delete("/trees/:id", (c) => { const id = c.req.param("id"); deleteItem(`trees_${id}`); return c.json({ message: `Tree ${id} has been cut down!`, }); }); Deno.serve(app.fetch); ``` --- # Build a Command Line Utility URL: https://docs.deno.com/examples/videos/command_line_utility ## Video description Learn to build a command line tool using Deno's standard library. You'll explore how to parse arguments, handle flags, and provide helpful messages using utility functions. Follow along as we build a ski resort information app, handle errors gracefully, and compile the script into an executable for multiple platforms, including Windows, macOS, and Linux. By the end of this video, you'll understand how to take full advantage of Deno's features to develop and distribute your own CLI tools.
## Transcript and code ### An introduction to Deno's Standard Library If you want to create a command line tool you can do so with [Deno's Standard Library](https://docs.deno.com/runtime/fundamentals/standard_library/). It contains dozens of stable libraries with helpful utility functions that can cover a lot of the basics when working with JavaScript on the web. The Standard Library also works in multiple runtimes and environments like Node.js and the browser. ### Setting up a command line tool We're going to create a command line tool, and then we're going to compile it so it can be used on a number of different platforms as an executable. Create a new file called `main.ts` and parse these arguments (remember we can always grab them from `Deno.args`), and then we'll console log them: ```typescript title="main.ts" const location = Deno.args[0]; console.log(`Welcome to ${location}`); ``` Now if I run `deno main.ts` and provide the name of a ski resort like Aspen, it's going to plug that into the string, e.g.: ```sh deno main.ts Aspen ## Welcome to Aspen ``` ### Installing and Using Standard Libraries Now let's install one of those standard libraries. In the terminal run: ```sh deno add jsr:@std/cli ``` This is going to install the [cli library](https://jsr.io/@std/cli), from the Deno Standard Library, into our project so we can make use of some of its helpful functions. The helpful function that we'll use here is called `parseArgs`. We can import it with: ```typescript import { parseArgs } from "jsr:@std/cli/parse-args"; ``` Then we can update our code to use this function, passing the arguments and removing the zero. Our `main.ts` file now looks like this: ```typescript title="main.ts" import { parseArgs } from "jsr:@std/cli/parse-args"; const args = parseArgs(Deno.args); console.log(args); ``` Let's go ahead and try this out, in your terminal run: ```sh deno main.ts -h Hello ``` We can see that `Hello` has been added to our args object.
All right, so that's working as expected. ### Building the Ski Resort Information App Now our app is going to be a ski resort information app, so we want to populate our app with a little bit of data to start. We're going to create a value called `resorts`. This is an object with a few different keys so we'll say `elevation`, `snow` and `expectedSnowfall`. Then let's just copy and paste these so that we can move a little more quickly we'll set `Aspen` to `7945` `snow` to `packed powder`, `expectedSnowfall` to `15`. Then let's add one more of these we'll set `Vail` to `8120` and then we'll say `expectedSnowfall` is `25`. ```typescript title="main.ts" const resorts = { Whistler: { elevation: 2214, snow: "Powder", expectedSnowfall: "20", }, Aspen: { elevation: 7945, snow: "packed powder", expectedSnowfall: 15, }, Vail: { elevation: 8120, snow: "packed powder", expectedSnowfall: 25, }, }; ``` We have a few different resorts here. Ultimately we want to be able to run our app with a command line argument that's going to provide the resort name and then have that CLI tool return the information about that resort. ### Handling Command Line Arguments So let's go ahead and pass another object to parse args, here we're going to define an alias - so we're going to say "if I pass the `r` flag we want to have it assume it means `resort`. Then let's also use the default here, we'll set the `default` `resort` to `Whistler`: ```typescript title="main.ts" const args = parseArgs(Deno.args, { alias: { resort: "r", }, default: { resort: "Whistler", }, }); ``` From here we can set up a const called `resortName` and set it to `args.resort`. 
Then get the resort, with `resorts[resortName]` (we'll fix that type error in a second), and update the console log: ```typescript title="main.ts" const resortName = args.resort; const resort = resorts[resortName]; console.log( `Resort: ${resortName} Elevation: ${resort.elevation} feet Snow: ${resort.snow} Expected Snowfall: ${resort.expectedSnowfall}`, ); ``` To test this out we can use: ```sh deno main.ts -r Aspen ``` Which will give us a printout of all of Aspen's details. We can also run this without any arguments, which should give the details for Whistler, because that was set as default: ```sh deno main.ts ``` Same goes for the full flag name, so we could say: ```sh deno main.ts --resort Vail ``` And that should give us those details as well. ### Improving Error Handling Now if I tried to run this with a resort that's not there, let's say `Bachelor`, there's an error, and it's kind of an ugly one. It's hitting this moment where it's trying to parse that out and it can't find it. So we could make this a little nicer by saying if there's no `resort` in our data set that matches the input, let's run a console error saying the resort name was not found, try Whistler, Aspen, or Vail, and then we'll hop out of that process with a `Deno.exit`: ```typescript title="main.ts" if (!resort) { console.error( `Resort ${resortName} not found. Try Whistler, Aspen, or Vail`, ); Deno.exit(1); } ``` ### Fixing the types Okay, this isn't looking so good; we can look at the problems here in TypeScript. It's telling us that this implicitly has an `any` type. You can look up more about this error, but I'll show you how to fix this one. Update the type of `resortName` to be a key of `resorts`: ```typescript title="main.ts" const resortName = args.resort as keyof typeof resorts; ``` What this has done is extract the value of `args.resort` and assert that it is a valid key of the data.
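As a minimal illustration of why the assertion satisfies the type checker (the two-resort object below is hypothetical, trimmed down from the tutorial's data):

```typescript
// A trimmed-down version of the lookup. Indexing this object with a plain
// `string` is a type error, because TypeScript can't prove the key exists.
const resorts = {
  Whistler: { elevation: 2214 },
  Aspen: { elevation: 7945 },
};

// `keyof typeof resorts` is the union type "Whistler" | "Aspen", so the
// assertion tells the checker the value is one of the known keys.
const resortName = "Aspen" as keyof typeof resorts;
console.log(resorts[resortName].elevation); // → 7945
```

The assertion only silences the checker; it does not validate the input at runtime, which is why the tutorial still adds the `if (!resort)` guard.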
### Adding Help and Color Output Let's take this one more step. We're going to say if `args.help`, we will console log a little message to tell our users "hey, this is actually how you use this" if they happen to ask for help at any moment. We'll update the alias here to say `help` is `h`, and finally we'll make sure to call `Deno.exit` so that we jump out of the process as soon as we're done with that: ```typescript title="main.ts" const args = parseArgs(Deno.args, { alias: { resort: "r", help: "h", }, default: { resort: "Whistler", }, }); ... if (args.help) { console.log(` usage: ski-cli --resort -h, --help Show Help -r, --resort Name of the ski resort (default: Whistler) `); Deno.exit(); } ``` You can test your help setup by running the following: ```sh deno main.ts -h ``` Next let's log our results here in color. Deno supports styling console output with CSS using the `%c` syntax. This will take the text and apply the style that we pass in as the second argument to the `console.log()`. Here we could set `color:blue` as the second argument, e.g.: ```typescript title="main.ts" console.log( `%c Resort: ${resortName} Elevation: ${resort.elevation} feet Snow: ${resort.snow} Expected Snowfall: ${resort.expectedSnowfall} `, "color:blue", ); ``` Then run the program again: ```sh deno main.ts -r Vail ``` You should see everything logged in a blue color. How cool is that?! ### Compiling the Tool for Different Platforms I want other people to be able to enjoy the app too. Compiling this tool into an executable is pretty easy with Deno. As you might imagine, the command for this is `deno compile` and then the name of our script. This is going to compile the project's code into an executable: ```sh deno compile main.ts ``` You should see the executable in your project folder called MyDenoProject.
Now you can run this as an executable with `./`, eg: ```sh ./MyDenoProject --resort Aspen ``` So this is really great for me, but what happens if I want to share this to other platforms? All you would need to do is run `deno compile` again, this time passing in a `--target` flag for where you want to compile to. Let's say we wanted to compile it for Windows we'd use: ```sh deno compile --target x86_64-pc-windows-msvc --output ski-cli-windows main.ts ``` or for a Mac: ```sh deno compile --target x86_64-apple-darwin --output ski-cli-macos main.ts ``` or for Linux: ```sh deno compile --target x86_64-unknown-linux-gnu --output ski-cli-linux main.ts ``` You can see all of the [options for compiling your apps](/runtime/reference/cli/compile/) in the Deno documentation. There are a lot of different flags that you can use for your own specific use cases. To recap we always have access to the Deno Standard Library that we can take advantage of with all these different helpful functions. If we wanted to create a command line utility, like we've done here, we always have access to the [`Deno` global namespace](/api/deno/~/Deno) for these arguments. We can parse the arguments using the parse args function from the standard Library CLI package and we can run a compile for all platforms so that our app can be consumed anywhere. --- # Configuration with Deno JSON URL: https://docs.deno.com/examples/videos/configuration_with_deno_json ## Video description In this video, we use the deno.json file to manage dependencies and configurations in your Deno projects. Learn how to create and configure tasks like 'start' and 'format' to streamline your workflow. We'll also explore customizing formatting and linting rules, and understand the concept of import maps for cleaner imports. Then we'll take a look at compatibility between Deno's deno.json and Node's package.json for seamless project integration. 
## Transcript and code ### Introduction to JSR Package Management Every time we’ve installed a package with JSR it’s been placed into this `deno.json` file as an import. ```json title="deno.json" { "imports": { "@eveporcello/sing": "jsr:@eveporcello/sing@^0.1.0" } } ``` ### Creating and Running Tasks So, we can use this file to manage our dependencies, but we can also use it for a bunch of other configuration tasks. Specifically, to get us started, let’s configure some literal tasks. We’re going to create a `"start"` task. This will run `deno --allow-net main.ts`. ```json title="deno.json" { "tasks": { "start": "deno --allow-net main.ts" }, "imports": { "@eveporcello/sing": "jsr:@eveporcello/sing@^0.1.0" } } ``` So, think of this like a shortcut for running a command. We could say ```sh deno task start ``` This is going to run that, same with ```sh deno run start ``` that will work as well. Let’s add another one of these, we’re going to call it `"format"`. This will combine two different things: we’ll say `deno fmt && deno lint`. ```json title="deno.json" { "tasks": { "start": "deno --allow-net main.ts", "format": "deno fmt && deno lint" }, "imports": { "@eveporcello/sing": "jsr:@eveporcello/sing@^0.1.0" } } ``` So let’s run ```sh deno task format ``` and then this will run everything for us. ### Formatting and Linting Configuration You can also use this file to set configurations for these types of commands. So we can say `"fmt"` and then use a couple different rules; the Formatting section of the documentation [here](/runtime/fundamentals/configuration/#formatting) will walk you through it. There are several different options that you can take advantage of. Let’s go ahead and say `"useTabs"`, and we’ll say `true` here, and then we’ll use `"lineWidth": 80`.
```json title="deno.json" { "tasks": { "start": "deno --allow-net main.ts", "format": "deno fmt && deno lint" }, "fmt": { "useTabs": true, "lineWidth": 80 }, "imports": { "@eveporcello/sing": "jsr:@eveporcello/sing@^0.1.0" } } ``` Now if we run ```sh deno task format ``` This will run everything with those rules. Linting, you could set up as well. So we’ll say `"lint"`. This is also in the documentation, right above this, so Linting [here](/runtime/fundamentals/configuration/#linting) will take you on the journey of all the different configuration options depending on your project’s needs, but in this case let’s add a key for `"rules"` here, and you can include them, you can exclude them. ```json title="deno.json" { "tasks": { "start": "deno --allow-net main.ts", "format": "deno fmt && deno lint" }, "lint": { "rules": {} }, "fmt": { "useTabs": true, "lineWidth": 80 }, "imports": { "@eveporcello/sing": "jsr:@eveporcello/sing@^0.1.0" } } ``` Let’s say `// @ts-ignore`, and we won’t add any comments after it. ```typescript title="main.ts" // @ts-ignore import { sing } from "jsr:@eveporcello/sing"; console.log(sing("sun", 3)); ``` If I add this comment to the top of any file, TypeScript will just ignore any type errors in that file, so it doesn’t matter whether the file adheres to the rules. But, if I run ```sh deno task format ``` again, this is going to tell me, “Hey, you can’t do that. You can’t ignore these files without a comment.” This is one of those rules. But, we know where to find a way out of that trap, which, maybe you don’t want to find a way out, but I’ll show you how anyway. We’ll say `"exclude": ["ban-ts-comment"]`.
```json title="deno.json" { "tasks": { "start": "deno --allow-net main.ts", "format": "deno fmt && deno lint" }, "lint": { "rules": { "exclude": ["ban-ts-comment"] } }, "fmt": { "useTabs": true, "lineWidth": 80 }, "imports": { "@eveporcello/sing": "jsr:@eveporcello/sing@^0.1.0" } } ``` Then, we’ll try to run ```sh deno task format ``` again. We should see that it runs appropriately, and we’re getting away with our `// @ts-ignore`. ### Handling Import Maps There’s also a concept in this `deno.json` file of the import map. So, right now we’re using `"@eveporcello/sing"` as the import, but it’s also possible to make this a little bit shorter. We could use just `"sing"` for this. ```json title="deno.json" { "tasks": { "start": "deno --allow-net main.ts", "format": "deno fmt && deno lint" }, "lint": { "rules": { "exclude": ["ban-ts-comment"] } }, "fmt": { "useTabs": true, "lineWidth": 80 }, "imports": { "sing": "jsr:@eveporcello/sing@^0.1.0" } } ``` Now if we replace this whole thing with just `"sing"` ```typescript title="main.ts" // @ts-ignore import { sing } from "sing"; console.log(sing("sun", 3)); ``` and we run ```sh deno main.ts ``` This should work as expected. So `"sing"` here is what’s called a “bare specifier”. The import map maps this particular dependency to this JSR package, so it just allows for a nice, clean import if we’d like to. If you want to learn more about these different options, check out the docs [here](/runtime/fundamentals/configuration/) on configuration. Deno also supports a `package.json` for compatibility with Node.js projects. Now, if a `deno.json` and a `package.json` are both found in the same directory, Deno will understand the dependencies specified in both. So, a lot of options here, but this is going to be extremely useful as you work on your Deno projects. --- # Benchmarking with Deno bench > Learn how to measure code performance using Deno's built-in benchmarking tool.
Discover baseline comparisons, grouped benchmarks, and precise measurement techniques for optimizing your TypeScript and JavaScript code. URL: https://docs.deno.com/examples/videos/deno_bench ## Video description [`deno bench`](/runtime/reference/cli/bench/) is an easy-to-use benchmarking tool that ships with the Deno runtime. Here are three ways to level up how you use `deno bench`. ## Transcript and code What's up everyone, it's Andy from Deno, and today we're going to talk about `deno bench`. This video is a continuation of our **Deno tool chain** series. `deno bench` is a benchmarking tool that makes it easy to measure performance, and if you're coming from Node, `deno bench` saves you time from finding and integrating a third party benchmarking tool. ### Baseline Summaries Today we're going to cover some cool use cases with `deno bench`. Most of the time we'll want to benchmark two or more ways of doing the same thing. Here we're comparing parsing a URL from a string, parsing a URL with a path, and then also parsing a URL with a path and a URL object: ```typescript title="main_bench.ts" Deno.bench("url parsing", () => { new URL("https://deno.land"); }); Deno.bench("url parsing with path", () => { new URL("./welcome.ts", "https://deno.land/"); }); const BASE_URL = new URL("https://deno.land"); Deno.bench("url parsing with a path and a URL object", () => { new URL("./welcome.ts", BASE_URL); }); ``` Then run: ```sh deno bench main_bench.ts ``` The output shows how long each benchmark takes in nanoseconds, as well as how many iterations per second. Not only that, but the output also includes the CPU chip and the runtime. The results indicate that the first approach is the fastest. But what if you want a clearer way to show exactly how much faster it is?
We can pass the `baseline: true` option into the benchmark: ```typescript title="main_bench.ts" Deno.bench("url parsing", { baseline: true }, () => { new URL("https://deno.land"); }); ...etc ``` When we run it, there is now a summary section at the bottom of the output that shows you exactly how much faster the benchmarks are compared to the baseline. If you have multiple benchmarks in the same file, you can organize the output using the `group` option. If we add a fourth benchmark for splitting text and run the file, we'll see all of the results grouped together, which isn't very helpful. Instead we can add a group of `url` to the URL benchmarks and a group of `text` to the text benchmarks: ```typescript title="main_bench.ts" Deno.bench("url parsing", { baseline: true, group: "url" }, () => { new URL("https://deno.land"); }); ...etc const TEXT = "Lorem ipsum dolor sit amet"; Deno.bench("split on whitespace", { group: "text" }, () => { TEXT.split(" "); }); ``` Now you will see our results are organized by group. ### More specific benchmarking with `b.start()` and `b.end()` Did you know that you can be specific about when to start and stop measuring your benchmarks? Here's a new benchmark file where we plan to benchmark parsing the first word of the releases markdown file, which is all the release notes from the Deno runtime project over the past 5 years. It's over 6,000 lines long! ```typescript title="file_bench.ts" const FILENAME = "./Releases.md"; Deno.bench("get first word", () => { const file = Deno.readTextFileSync(FILENAME); const firstWord = file.split(" ")[0]; }); ``` Running `deno bench` shows that this operation takes a long time, but it's mostly because the benchmark requires reading the file into memory. So how do we benchmark just getting the first word? If we use the bench context parameter, we have access to the `start()` and `end()` functions.
```typescript title="file_bench.ts" const FILENAME = "./Releases.md"; Deno.bench("get first word", (b) => { const file = Deno.readTextFileSync(FILENAME); b.start(); const firstWord = file.split(" ")[0]; b.end(); }); ``` Now when we run `deno bench`, you'll notice that this benchmark only measures getting the first word, not reading the file into memory. This was just a glimpse into `deno bench`. If you want to check out the other options available to you, you can use your editor to `ctrl+click` through to the bench definitions, or look at the [`deno bench` documentation](/runtime/reference/cli/bench/). There are some other options that you can pass such as [`only`](/runtime/reference/cli/bench/#bench-definition-filtering) and [`ignore`](/runtime/reference/cli/bench/#options-ignore). --- # Deno coverage > Learn how to measure test coverage in Deno projects. Watch how to generate coverage reports, analyze code coverage metrics, and use the HTML report feature. URL: https://docs.deno.com/examples/videos/deno_coverage ## Description of video We updated `deno coverage` in 1.39 with a better output and HTML generation. ## Transcript and code If you're using `deno test`, have you checked out `deno coverage`? `deno coverage` is a great way to see how much test coverage you have, just add the `--coverage` flag to `deno test`: ```sh deno test --coverage ``` This will save coverage data to `/coverage`. Then run the coverage command: ```sh deno coverage ./coverage ``` to see a coverage report. In Deno 1.39, `deno coverage` was updated in two ways; first, it now outputs a concise summary table, and second, if you add the `--html` flag: ```sh deno coverage ./coverage --html ``` the coverage tool generates static HTML so that you can explore your coverage in a browser. We've got more plans for `deno coverage`, like simplifying the steps into a single command and more. --- # Your Deno Dev Environment > Learn how to set up your Deno development environment.
Watch how to install Deno, configure VS Code, enable type checking and autocomplete, and optimize your TypeScript development workflow. URL: https://docs.deno.com/examples/videos/deno_dev_environment ## Video description How to set up your development environment for Deno ## Transcript and code To install Deno, we'll run curl. So we're going to grab this curl command [from the documentation](https://docs.deno.com/runtime/getting_started/installation/). ```shell curl -fsSL https://deno.land/install.sh | sh ``` We'll go to our terminal, we'll paste that in, hit enter, and this will install the most recent version of Deno in the background. When I do this, it'll ask me if I want to add Deno to the path. We'll go ahead and say yes, and you can add these setup completions here. And now we have installed this to our path. If you're on Windows, there are installation instructions for you here in the documentation. To generate a Deno project from scratch, let's go ahead and type `deno init MyDenoProject`. This is going to create that folder for me. I can then `cd` into that folder. Now if we open this up in VSCode, this has created a `deno.json` file, a `main_test.ts` file, and a `main.ts` file. So this is a quick way of getting started. If you're using VSCode, there are a few configuration options that you'll want to set up. So we'll go up here to Code and Settings. We'll select Extensions. So over here in your extensions, you're going to search for Deno, and then we'll [select the one that has been created by Denoland here](https://marketplace.visualstudio.com/items?itemName=denoland.vscode-deno). ```javascript { "deno.enable": true, } ``` We're going to run install, and this will install our Denoland extension. Next we'll type `command shift P`. This will open up our command palette here, and we can type `deno initialize workspace configuration`. We're going to go ahead and click that. That's going to generate this VSCode folder with settings.
This is going to enable hints and autocomplete and all of that right here in the code editor. So if I start to type anything from `deno serve`, for example, that's going to give me a look at what the expected parameters of that function are. That's very helpful. This is also going to give us hints when importing. So we'll say import star as path from JSR at standard slash path. ```javascript import * as path from "jsr:@std/path"; ``` So all of them are listed there. Pretty cool. And then if we wanted to do something for a remote module, something like OpenAI from [https://deno.land/x/openai@v4.67.1/mod.ts](https://deno.land/x/openai@v4.67.1/mod.ts) (or now, even better, from [JSR](https://jsr.io/@openai/openai)) ```javascript import OpenAI from "jsr:@openai/openai"; ``` This is then going to give us the Standard Library as well as `x` for all of those third-party APIs. So you can actually drill down into OpenAI from here. You just need to select the version, so we'll say OpenAI at v4.67.1. And then you can even drill down into that individual file. If you take a look at [the documentation here, this will guide you through the process of setting up your own unique environment](/runtime/getting_started/setup_your_environment/). There are [shell completions](/runtime/getting_started/setup_your_environment/#shell-completions) that you can add, so depending on which CLI tool you're using, you can set this up over here, whether it's Bash or PowerShell or Zsh or whatever it might be. --- # Formatting with Deno fmt URL: https://docs.deno.com/examples/videos/deno_fmt ## Video description A quick cut of tips and tricks on [Deno's built-in formatter, `deno fmt`](/runtime/reference/cli/fmt/). ## Transcript and code What's up everyone, Andy from Deno here, back for another episode of the **Deno tool chain series** where we dig a little deeper into the deno subcommands.
Today we're going to look at `deno fmt`, our built-in formatter that's customizable, performant, and flexible enough to fit into any workflow. Let's dive right in. ### What is `deno fmt`? `deno fmt` will format these file extensions: - `.js` - `.jsx` - `.ts` - `.tsx` - `.json` - `.jsonc` - `.md` - `.markdown` The simplest way to use `deno fmt` is to run it from the command line: ```sh deno fmt ``` You could even pipe in a string or file: ```sh echo ' console.log( 5 );' | deno fmt - ## console.log(5); ``` You can also use the `--check` flag, which will check whether your code has been formatted by `deno fmt`. If it's not formatted, it will return a nonzero exit code: ```sh echo ' console.log( 5 );' | deno fmt --check - ## Not formatted stdin ``` This is useful in CI where you want to check if the code is formatted properly. ### Editor integration `deno fmt` also works in your editor, like VS Code. Set `deno fmt` as your default formatter in your editor's settings, eg for VS Code: ```json title=".vscode/settings.json" { "editor.defaultFormatter": "denoland.vscode-deno", "editor.formatOnSave": true } ``` Setting `editor.formatOnSave` to `true`, as above, formats files automatically whenever you save. ### Multiple ways to format In some situations, there are multiple ways to format, and Deno lets you decide how you want to format. For example, an object can be formatted horizontally or vertically, depending on where you put your first item. Eg: ```typescript const foo = { bar: "baz", qux: "quux" }; // or const foo = { bar: "baz", qux: "quux", }; ``` Same with an array. You can format it horizontally or vertically depending on where you put your first item. Eg: ```typescript const foo = ["bar", "baz", "qux"]; // or const foo = [ "bar", "baz", "qux", ]; ``` ### Remove escaped quotes `deno fmt` can also reduce the escaped characters in your strings.
For example, if you have a string with escaped quotes, `deno fmt` will remove them: ```typescript console.log("hello \"world\""); ``` will be formatted to: ```typescript console.log('hello "world"'); ``` ### Ignoring lines or files What if you want `deno fmt` to skip a line or a file? You can use the `// deno-fmt-ignore` comment to tell `deno fmt` to skip the following line, eg: ```typescript console.log("This line will be formatted"); // deno-fmt-ignore console.log("This line will not be formatted"); ``` To tell `deno fmt` to skip a file, you can use the `// deno-fmt-ignore-file` comment at the top of the file you want to ignore. Or you can use your `deno.json` config file under the `fmt` field: ```json { "fmt": { "exclude": ["main.ts", "*.json"] } } ``` ### Formatting markdown `deno fmt` also works on markdown files. You can choose how to format prose with the option `"proseWrap"` set to either `always`, `never`, or `preserve`, eg: ```json { "fmt": { "proseWrap": "always" } } ``` `deno fmt` can also format numbered lists if you start a numbered list with two ones, for example: ```markdown title="list.md" 1. First 1. Second 1. Third 1. Fourth 1. Fifth ``` The formatter will automatically format the list to all ones, but when you render it, it will show the numbered list properly! If that's weird, you can also put `1` and then `2` and then run `deno fmt`, which will number the rest of the list correctly for you. `deno fmt` will also format code blocks of JavaScript and TypeScript in your markdown. It can even format markdown in markdown! ### Formatter options Let's take a look at [all the options available in `deno fmt`](/runtime/reference/cli/fmt/#formatting-options). Note that each of these options also has a corresponding flag in the CLI.
```json { "fmt": { "useTabs": true, "lineWidth": 80, "indentWidth": 2, "semiColons": false, "singleQuote": true, "proseWrap": "always", "exclude": ["**/logs.json"] } } ``` - `--use-tabs` - `--line-width <line-width>` - `--indent-width <indent-width>` - `--no-semicolons` - `--single-quote` - `--prose-wrap <prose-wrap>` - `--ignore=<ignore>` ### `deno fmt`'s Performance `deno fmt` is really fast, especially on subsequent runs due to caching, which is enabled by default. Here's the first run that we did on Deno's Standard Library. Let's run it again! The system time shows that the second run is a third faster. If we update a file and run it again, it's still fast since `deno fmt` checks only the changed file. Let's compare this to `Prettier` (a popular Node formatter); we'll run Prettier with a caching flag enabled. Even on a second run, `deno fmt` is almost 20 times faster! --- # Getting started with Deno test URL: https://docs.deno.com/examples/videos/deno_test --- # Deploy Deno to AWS Lambda URL: https://docs.deno.com/examples/videos/deploy_deno_to_aws_lambda ## Video description Show how to deploy Deno applications to AWS Lambda (using a community runtime for Lambda). ## Transcript and code ### Run Deno on AWS Lambda Running Deno on AWS Lambda? Sure, you can do that. With AWS Lambda, serverless pricing can be cheaper than a VPS, and it can be easier to maintain because it can auto-scale behind the scenes. To make that work, we’re going to use the aws-lambda-adapter project to make sure that our `Deno.serve` function runs as we expect it to. This is a popular approach to deploying to AWS Lambda due to control, flexibility, and consistency. There’s a nice article on this on the blog if you want to learn more about these considerations.
Let’s take a look at the Dockerfile that we can use to make this work: ```dockerfile # Set up the base image FROM public.ecr.aws/awsguru/aws-lambda-adapter:0.9.0 AS aws-lambda-adapter FROM denoland/deno:bin-2.0.2 AS deno_bin FROM debian:bookworm-20230703-slim AS deno_runtime COPY --from=aws-lambda-adapter /lambda-adapter /opt/extensions/lambda-adapter COPY --from=deno_bin /deno /usr/local/bin/deno ENV PORT=8000 EXPOSE 8000 RUN mkdir /var/deno_dir ENV DENO_DIR=/var/deno_dir # Copy the function code WORKDIR "/var/task" COPY . /var/task # Warmup caches RUN timeout 10s deno -A main.ts || [ $? -eq 124 ] || exit 1 CMD ["deno", "-A", "main.ts"] ``` Then we’ll build the Docker image. ```shell docker build -t my-deno-project . ``` Now we need to start interfacing with AWS. If this is your first time working with AWS, you can create an account: [https://aws.amazon.com](https://aws.amazon.com) And if you haven’t installed the AWS CLI, you can do that too. You can tell if it’s installed by typing `aws` into your terminal or command prompt. If that returns an error, you can install it with Homebrew or follow the instructions through the website: [https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html) ``` brew install awscli ``` Then you’ll want to make sure that you’re set up with `aws configure`. Everything it’s looking for is in the [Security Credentials section of the AWS Console](https://us-east-1.console.aws.amazon.com/ecr/private-registry/repositories).
### Use the CLI to create an ECR The ECR is a registry service where we can push our Docker container ``` aws ecr create-repository --repository-name my-deno-project --region us-east-1 | grep repositoryUri ``` This outputs a URI for the repo: `"repositoryUri": "<account-id>.dkr.ecr.us-east-1.amazonaws.com/my-deno-project"` Then log in using the URI that comes back ```shell aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <account-id>.dkr.ecr.us-east-1.amazonaws.com/my-deno-project ``` Tag the image ```shell docker tag my-deno-project:latest <account-id>.dkr.ecr.us-east-1.amazonaws.com/my-deno-project:latest ``` Then push the image to ECR ```shell docker push <account-id>.dkr.ecr.us-east-1.amazonaws.com/my-deno-project:latest ``` Now we need to create a function that will host our app: - [https://us-east-1.console.aws.amazon.com/lambda/home?region=us-east-1\#/begin](https://us-east-1.console.aws.amazon.com/lambda/home?region=us-east-1#/begin) - Think of a function as being a place where the app is going to run - Select Create a Function - Select Container Image Radio Button - Call the function `tree-app` - Select the app from the Browse Containers button - Halfway down the page select “Configuration” - Select `Function URL` - Create a URL - Select None so the endpoint is public - Select Save - Check the app in the browser One thing to keep in mind with Lambda functions is cold start performance. Cold starts happen when AWS needs to initialize your function, which can cause slight delays. There’s a pretty cool [blog here that goes through Deno vs. other tools](https://deno.com/blog/aws-lambda-coldstart-benchmarks). Using Deno with AWS Lambda functions is a great way to stand up your app quickly in a familiar environment.
--- # Deploying Deno with Docker URL: https://docs.deno.com/examples/videos/deploying_deno_with_docker ## Video description See how to deploy Deno applications with Docker to a compatible cloud environment. ## Resources - https://github.com/denoland/deno_docker - https://fly.io/ - https://docs.deno.com/runtime/reference/docker/ ## Transcript and code Deno has made a lot of things seem easy: linting, formatting, interoperability with the Node ecosystem, testing, TypeScript, but how about deployment? How easy is it to get Deno running in production? Pretty easy! Let’s start with a look at our app. It’s an app that provides us with some information about trees. On the homepage, we get some text. At the `/trees` route, we get some JSON. At the dynamic route based on the tree’s id, we get information about that single tree. ```ts import { Hono } from "jsr:@hono/hono"; const app = new Hono(); interface Tree { id: string; species: string; age: number; location: string; } const oak: Tree = { id: "1", species: "oak", age: 3, location: "Jim's Park", }; const maple: Tree = { id: "2", species: "maple", age: 5, location: "Betty's Garden", }; const trees: Tree[] = [oak, maple]; app.get("/", (c) => { return c.text("🌲 🌳 The Trees Welcome You! 🌲 🌳"); }); app.get("/trees", (c) => { return c.json(trees); }); app.get("/trees/:id", (c) => { const id = c.req.param("id"); const tree = trees.find((tree) => tree.id === id); if (!tree) return c.json({ message: "That tree isn't here!" }, 404); return c.json(tree); }); Deno.serve(app.fetch); ``` ## Run Locally with Docker Make sure that Docker is installed on your machine. In your terminal or command prompt, you can run `docker`, and if you get a big list of commands, you have it. If not, head over to https://www.docker.com/ and download it based on your operating system.
### Test run docker: ```shell docker ``` Then run the command to get running on `localhost:8000` with Docker ```shell docker run -it -p 8000:8000 -v $PWD:/my-deno-project denoland/deno:2.0.2 run --allow-net /my-deno-project/main.ts ``` Visit the app running at `localhost:8000`. It’s also possible to run this with a Dockerfile. ```dockerfile FROM denoland/deno:2.0.2 # The port that your application listens to. EXPOSE 8000 WORKDIR /app # Prefer not to run as root. USER deno # These steps will be re-run upon each file change in your working directory: COPY . . # Compile the main app so that it doesn't need to be compiled each startup/entry. RUN deno cache main.ts # Warmup caches RUN timeout 10s deno -A main.ts || [ $? -eq 124 ] || exit 1 CMD ["run", "--allow-net", "main.ts"] ``` Then build it ```shell docker build -t my-deno-project . ``` From there, you can deploy the app to your hosting provider of choice. I’m going to use fly.io today. ## Deploy to fly.io If you haven’t worked with Fly before, it’s a cloud platform that allows you to deploy and run fullstack apps. They run in multiple regions throughout the world, which makes them a pretty nice option. https://fly.io/ ### Install Fly Install with curl ```shell curl -L https://fly.io/install.sh | sh ``` ### Log in with Fly via CLI ```shell fly auth login ``` This will open the browser for you to log into your account (or create one if you haven’t already). Then we’ll launch the app with Fly using: ```shell flyctl launch ``` This will generate a fly.toml file for the app, and you can choose different settings if you’d like to. And more importantly it will launch it! We’ll just wait for the process to complete, and we should be able to view our app running at that location. So with Deno, we can use Docker to containerize the app and with Fly we can get the app hosted in production in just a few minutes.
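For reference, the generated `fly.toml` is a small TOML file. A minimal sketch might look like the following — the field names and values here are illustrative only; the file `flyctl launch` writes for you depends on your flyctl version and the answers you give during launch:

```toml
# Illustrative sketch only; your generated fly.toml will differ
app = "my-deno-project"
primary_region = "sjc"

[http_service]
  internal_port = 8000  # should match the port your server listens on
  force_https = true
```

The key detail to check is that `internal_port` matches the port your app listens on; `Deno.serve` defaults to 8000.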
## More information on working with Docker For a closer look at Deno's support of Docker, including best practices, running tests with Docker, using workspaces, and more, please take a look at our [Deno and Docker reference documentation](https://docs.deno.com/runtime/reference/docker/). --- # ECMAScript Modules URL: https://docs.deno.com/examples/videos/esmodules --- # Interoperability with Node.js URL: https://docs.deno.com/examples/videos/interoperability_with_nodejs ## Video description Deno gained lots of interoperability capabilities at its v2.0 release. In this video, we'll look at how to use Node.js built-in APIs, NPM modules, and JSR packages. ## Transcript and examples [Deno 2.0](https://deno.com/blog/v2) is here, and it's good. One of the most amazing features of Deno is its interoperability with other platforms, including Node. For example, we can use the core Node.js built-in APIs. All we have to do is add this Node specifier here. ```ts import fs from "node:fs/promises"; ``` Deno also supports the use of NPM modules. All you need to do is add the NPM specifier with your import and you're good to go. ```ts import * as Sentry from "npm:@sentry/node"; ``` We can also take advantage of [JSR](https://jsr.io), an open source package registry for TypeScript and JavaScript. ```ts import OpenAI from "jsr:@openai/openai"; ``` JSR works with Deno, of course, but also with Node.js, Bun, and Cloudflare Workers. You can even install JSR packages into Vite and Next.js applications. Deno also gives us [import maps](https://docs.deno.com/runtime/fundamentals/modules/#differentiating-between-imports-or-importmap-in-deno.json-and---import-map-option), which help us manage our dependencies. You can install a package from JSR. The import will be added to the `deno.json`, and you can even use a shorthand to describe this to clean up your code even more. Deno 2.0 is focused on a really solid developer experience.
New projects and migrations feel a whole lot easier with Deno. --- # Introduction to Deno APIs URL: https://docs.deno.com/examples/videos/intro_to_deno_apis ## Video description In this video, we explore the powerful APIs provided by Deno in the global namespace. We demonstrate file system operations like creating, reading, writing, and appending to files using Deno's built-in methods. Then we examine how to handle command line arguments and environment variables, and set up a basic server. With these built-in APIs, we can reduce the need for external libraries. ## Transcript and examples In the global namespace, Deno has a ton of APIs that you can take advantage of. Let's take a look at a few of them. ### Creating and writing to files In order to write a file, first we will await `Deno.open` and we'll pass in the name of the file that we want to create. The second argument is going to be an object where we'll set `read`, `write`, and `create` to `true`: ```ts title="main.ts" await Deno.open("thoughts.txt", { read: true, write: true, create: true, }); ``` To run this, we will use: ```sh deno main.ts ``` When run, the console will prompt us to allow read access, so we'll say yes (or `y`). Then it's going to ask us for write access, which is pretty cool (and we'll allow that too with `y`), so we've granted both and now we have created a file called `thoughts.txt`. If we wanted to write some data to this file we could make some adjustments to our `main.ts` file. Let's create a variable for our file (called `file`), then we're going to add `append: true` to the object we pass to the `Deno.open` method (we can also get rid of `create` I suppose, since the file has already been created): ```ts title="main.ts" const file = await Deno.open("thoughts.txt", { read: true, write: true, append: true, }); ``` Next, below this, we'll make a constant called `encoder`, and make it equal a new text encoder. Then we'll make a second constant called `data`, which will call `encode`.
Finally we'll add a string with a newline and some text to `data`: ```ts title="main.ts" const encoder = new TextEncoder(); const data = encoder.encode("\nI think basil is underrated."); ``` Then we'll `await file.write(data)`, which will take that data and write it to the thoughts file, and finally we'll close the file. ```ts title="main.ts" await file.write(data); file.close(); ``` This time we will run the file with the required permissions: ```sh deno --allow-read --allow-write main.ts ``` If we take a look back at our `thoughts.txt` file it will say "I think basil is underrated". The text has been appended to our file. ### Reading and appending to files There are some other options as well, so let's go back to the top of our file. This time, instead of using `Deno.open`, we'll use `Deno.readFile`. Which means we can remove the second argument object, because we're being very specific about what we actually want to do here. Then we'll console log the file. ```ts title="main.ts" const file = await Deno.readFile("thoughts.txt"); console.log(file); ``` If we run this with: ```sh deno --allow-read main.ts ``` The encoded file will be logged to the console, which isn't quite what I want. I actually want the human-readable text. So what I can do here is I can use `Deno.readTextFile` instead of `Deno.readFile`, which will return the text of the file as a string we can log to the console. We can also write to the file with `Deno.writeTextFile`. For example: ```ts title="main.ts" await Deno.writeTextFile( "thoughts.txt", "Fall is a great season", ); ``` Which, if we run with `deno --allow-write main.ts`, will overwrite the contents of the `thoughts.txt` file with the string about fall. We can update that code to use `append: true`: ```ts title="main.ts" await Deno.writeTextFile( "thoughts.txt", "\nWinter is the most fun season!", { append: true }, ); ``` If we run it again, with `deno --allow-write main.ts`, it's going to append the second sentence to the end of the file.
### Exploring command line arguments We also have the option to explore command line arguments, so we could say: ```ts title="main.ts" const name = Deno.args[0]; console.log(name); ``` We can run this with our usual deno command, but this time pass in a command line argument, let's say `Eve`: ```sh deno main.ts Eve ``` The name `Eve` will be logged to the console. If we want to get fancy, we can update the logged template string to print out a message: ```ts title="main.ts" const name = Deno.args[0]; console.log(`How are you today, ${name}?`); ``` ### Using env variables On the Deno global, we also have environment variables. Let's read the `HOME` variable and log our home directory to the console: ```ts title="main.ts" const home = Deno.env.get("HOME"); console.log(`Home directory: ${home}`); ``` When run with `deno main.ts`, Deno will request environment access, which we can allow with `y`. Or we can run the command with the `--allow-env` flag, and our home directory will be logged to the console. ### Setting up a simple HTTP server Finally, let's look at our trusty server API, `Deno.serve`. We can create a handler that returns a response, and then pass that handler to the `Deno.serve` method. ```ts title="main.ts" function handler(): Response { return new Response("It's happening!"); } Deno.serve(handler); ``` When run with ```sh deno --allow-net main.ts ``` We'll see that a server is running and listening on port 8000. We can visit `localhost:8000` in the browser and we should see the text "It's happening!". So there are a ton of these that you can take advantage of, but it's very nice to know that we don't have to include an external library for everything; Deno has us covered when it comes to managing errors, handling servers, and working with the file system.
---

# Connect to Mongoose and MongoDB

URL: https://docs.deno.com/examples/videos/mongoose

---

# Connect to Prisma

URL: https://docs.deno.com/examples/videos/prisma

---

# Publishing Modules with JSR

URL: https://docs.deno.com/examples/videos/publishing_modules_with_jsr

## Transcript and examples

[JSR](https://jsr.io) is a registry specifically designed for modern JavaScript projects. JSR - the JavaScript Registry - has a bunch of cool features. But if you've used npm before, you might be thinking, "why do I need this and why do I need to learn another one of these?"

- Well, first it's optimized for TypeScript.
- JSR only supports ES Modules.
- And finally, npm is the centralized registry for Node projects, but there are other runtimes. Obviously Deno, but you can also use these packages in Bun, Cloudflare Workers, and more.

Think of it like a superset. JSR doesn't replace npm, it builds on top of it.

So here at [jsr.io](https://jsr.io), you can search for whatever you want. I'm looking for this library called Oak that is a middleware framework for handling HTTP requests. I'll search for it here, and this will take me to [the documentation page](https://jsr.io/@oak/oak).

If you want to install a package, all you need to do is add it:

```sh
deno add jsr:@oak/oak
```

Then we can use it inside of our file like this.

```javascript
import { Application } from "jsr:@oak/oak/application";
import { Router } from "jsr:@oak/oak/router";

const router = new Router();
router.get("/", (context) => {
  context.response.body = "HEY!";
});

const app = new Application();
app.use(router.routes());
app.use(router.allowedMethods());

app.listen({ port: 8080 });
```

Pretty cool! But what is it like to publish our own JSR package? It's actually great. JSR packages can depend on other packages from JSR, but also on any npm package. Let's build a small library and publish it to JSR.
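Publishing also needs package metadata: a JSR package declares its name, version, and entry point in `deno.json` (or a standalone `jsr.json`). A minimal sketch, with a hypothetical scope and entry file:

```json
{
  "name": "@eveporcello/sing",
  "version": "0.1.0",
  "exports": "./mod.ts"
}
```

With this in place, `deno publish` knows what to upload and under which scope.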
Remember [our `sing` function from earlier](/examples/all-in-one_tooling/), let's make this a function that can be consumed by other people in the JavaScript community. You're welcome, everyone.

```typescript
export function sing(
  phrase: string,
  times: number,
): string {
  return Array(times).fill(phrase).join(" ");
}

sing("la", 3);
```

Now if we [head over to jsr.io, we can publish it](https://jsr.io/new). The first time I ever try to publish a package, JSR will ask me which scope I want to publish to. I can create that here. Then I'll create the package name and follow the instructions.

Let's try using our new package in a project using Vite. The following command will walk us through setting up a new Vite project.

```shell
deno run --allow-read --allow-write --allow-env npm:create-vite-extra@latest
```

Now we can import our new package by adding it to our project:

```shell
deno add jsr:@eveporcello/sing
```

And then importing it when we need it:

```typescript
import { sing } from "@eveporcello/sing";
```

So if I had to give myself a grade on this, I don't even have to give myself a grade. [JSR will give me a grade](https://jsr.io/@eveporcello/sing/score) of 29%, which, I don't know, is probably not so good. But this has a whole list of improvements that I can make. I need to add a readme to my package. I need to add examples. All of these different things. So I can, on my own time, develop this to ensure that I have 100 percent here, so that my code is well documented and very consumable by other developers.

---

# Build a React app

URL: https://docs.deno.com/examples/videos/react_app_video

---

# Build a Realtime WebSocket Application

URL: https://docs.deno.com/examples/videos/realtime_websocket_app

---

# TypeScript and JSX

URL: https://docs.deno.com/examples/videos/ts_jsx

---

# Build a Vue app

URL: https://docs.deno.com/examples/videos/vue_app_video

---

# What is Deno?
URL: https://docs.deno.com/examples/videos/what_is_deno

## Video description

A short introduction to Deno and its history

## Transcript and code

Deno is an open source runtime for JavaScript, TypeScript, and WebAssembly projects that's built on V8 and Rust. It's modern, it's fast, it's flexible, and it's secure by default.

Deno was created by Ryan Dahl, the creator of Node.js, and in 2018, he gave [a famous talk at JSConf EU](https://www.youtube.com/watch?v=M3BM9TB-8yA) about regrets that he had about Node. And Deno provides solutions to all of them. With the hindsight of someone who's been there, Deno gives us a runtime that's thought a lot about the details.

Details like TypeScript support by default. You can run or import TypeScript without installing anything more than the Deno CLI. Deno has a built-in TypeScript compiler, so it'll just run your TypeScript code without any extra configuration.

Details like linting, formatting, and testing. Deno is an all-in-one toolchain that you can use to get started with your project without having to use all of your finite time on earth having to configure it.

Details like web standards. Deno is built on web standards that you might recognize, like Fetch and WebSockets. You don't have to learn anything new to use them. If you've used them in the browser, you're ready to use them in Deno.

Deno is secure by default. You have to specifically enable permissions for sensitive APIs like the network, the file system, and environment access. Deno has you opt into these permissions like you would opt into geolocation in the browser.

[In this course](https://www.youtube.com/watch?v=KPTOo4k8-GE&list=PLvvLnBDNuTEov9EBIp3MMfHlBxaKGRWTe), we're going to walk through the most important features of Deno with hands-on activities. Whether you've experimented with Deno in the past, or this is all new to you, I think you're going to like it here.
---

# lint/rules/adjacent-overload-signatures.md

URL: https://docs.deno.com/lint/rules/adjacent-overload-signatures

Requires overload signatures to be adjacent to each other. Overloaded signatures which are not next to each other can lead to code which is hard to read and maintain.

**Invalid:** (`bar` is declared in-between `foo` overloads)

```typescript
type FooType = {
  foo(s: string): void;
  foo(n: number): void;
  bar(): void;
  foo(sn: string | number): void;
};
```

```typescript
interface FooInterface {
  foo(s: string): void;
  foo(n: number): void;
  bar(): void;
  foo(sn: string | number): void;
}
```

```typescript
class FooClass {
  foo(s: string): void;
  foo(n: number): void;
  bar(): void {}
  foo(sn: string | number): void {}
}
```

```typescript
export function foo(s: string): void;
export function foo(n: number): void;
export function bar(): void {}
export function foo(sn: string | number): void {}
```

**Valid:** (`bar` is declared after `foo`)

```typescript
type FooType = {
  foo(s: string): void;
  foo(n: number): void;
  foo(sn: string | number): void;
  bar(): void;
};
```

```typescript
interface FooInterface {
  foo(s: string): void;
  foo(n: number): void;
  foo(sn: string | number): void;
  bar(): void;
}
```

```typescript
class FooClass {
  foo(s: string): void;
  foo(n: number): void;
  foo(sn: string | number): void {}
  bar(): void {}
}
```

```typescript
export function foo(s: string): void;
export function foo(n: number): void;
export function foo(sn: string | number): void {}
export function bar(): void {}
```

---

# lint/rules/ban-ts-comment.md

URL: https://docs.deno.com/lint/rules/ban-ts-comment

Disallows the use of TypeScript directives without a comment. TypeScript directives reduce the effectiveness of the compiler, something which should only be done in exceptional circumstances. The reason why should be documented in a comment alongside the directive.
**Invalid:**

```typescript
// @ts-expect-error
let a: number = "I am a string";
```

```typescript
// @ts-ignore
let a: number = "I am a string";
```

```typescript
// @ts-nocheck
let a: number = "I am a string";
```

**Valid:**

```typescript
// @ts-expect-error: Temporary workaround (see ticket #422)
let a: number = "I am a string";
```

```typescript
// @ts-ignore: Temporary workaround (see ticket #422)
let a: number = "I am a string";
```

```typescript
// @ts-nocheck: Temporary workaround (see ticket #422)
let a: number = "I am a string";
```

---

# lint/rules/ban-types.md

URL: https://docs.deno.com/lint/rules/ban-types

Bans the use of primitive wrapper objects (e.g. `String`, the object wrapper of the `string` primitive) in addition to the non-explicit `Function` type and the misunderstood `Object` type. There are very few situations where primitive wrapper objects are desired, and far more often the case of the primitive type was simply mistyped. You also cannot assign a primitive wrapper object to a primitive, leading to type issues down the line. For reference, [the TypeScript handbook] also says we shouldn't ever use these wrapper objects.

[the TypeScript handbook]: https://www.typescriptlang.org/docs/handbook/declaration-files/do-s-and-don-ts.html#number-string-boolean-symbol-and-object

With `Function`, it is better to explicitly define the entire function signature rather than use the non-specific `Function` type, which won't give you type safety with the function.

Finally, `Object` and `{}` mean "any non-nullish value" rather than "any object type". `object` is a good choice for a meaning of "any object type".
**Invalid:**

```typescript
let a: Boolean;
let b: String;
let c: Number;
let d: Symbol;
let e: Function;
let f: Object;
let g: {};
```

**Valid:**

```typescript
let a: boolean;
let b: string;
let c: number;
let d: symbol;
let e: () => number;
let f: object;
let g: Record<string, unknown>;
```

---

# lint/rules/ban-unknown-rule-code.md

URL: https://docs.deno.com/lint/rules/ban-unknown-rule-code

Warns against the usage of unknown rule codes in ignore directives. We sometimes have to suppress and ignore lint errors for various reasons. We can do so using [ignore directives](/go/lint-ignore/) with rule names that should be ignored like so:

```typescript
// deno-lint-ignore no-explicit-any no-unused-vars
const foo: any = 42;
```

This rule checks for the validity of the specified rule names (i.e. whether `deno_lint` provides the rule or not).

**Invalid:**

```typescript
// typo
// deno-lint-ignore eq-eq-e
console.assert(x == 42);

// unknown rule name
// deno-lint-ignore UNKNOWN_RULE_NAME
const b = "b";
```

**Valid:**

```typescript
// deno-lint-ignore eq-eq-eq
console.assert(x == 42);

// deno-lint-ignore no-unused-vars
const b = "b";
```

---

# lint/rules/ban-untagged-ignore.md

URL: https://docs.deno.com/lint/rules/ban-untagged-ignore

Requires `deno-lint-ignore` to be annotated with one or more rule names. Ignoring all rules can mask unexpected or future problems. Therefore you need to explicitly specify which rule(s) are to be ignored.

**Invalid:**

```typescript
// deno-lint-ignore
export function duplicateArgumentsFn(a, b, a) {}
```

**Valid:**

```typescript
// deno-lint-ignore no-dupe-args
export function duplicateArgumentsFn(a, b, a) {}
```

---

# lint/rules/ban-untagged-todo.md

URL: https://docs.deno.com/lint/rules/ban-untagged-todo

Requires TODOs to be annotated with either a user tag (`@user`) or an issue reference (`#issue`). TODOs without reference to a user or an issue become stale with no easy way to get more information.
**Invalid:**

```typescript
// TODO Improve calc engine
export function calcValue(): number {}
```

```typescript
// TODO Improve calc engine (@djones)
export function calcValue(): number {}
```

```typescript
// TODO Improve calc engine (#332)
export function calcValue(): number {}
```

**Valid:**

```typescript
// TODO(djones) Improve calc engine
export function calcValue(): number {}
```

```typescript
// TODO(@djones) Improve calc engine
export function calcValue(): number {}
```

```typescript
// TODO(#332)
export function calcValue(): number {}
```

```typescript
// TODO(#332) Improve calc engine
export function calcValue(): number {}
```

---

# lint/rules/ban-unused-ignore.md

URL: https://docs.deno.com/lint/rules/ban-unused-ignore

Warns about unused ignore directives. We sometimes have to suppress and ignore lint errors for various reasons, and we can do so using [ignore directives](/go/lint-ignore/). In some cases, however, like after refactoring, we may end up having ignore directives that are no longer necessary. Such superfluous ignore directives are likely to confuse future code readers, and to make matters worse, might hide future lint errors unintentionally. To prevent such situations, this rule detects unused, superfluous ignore directives.

**Invalid:**

```typescript
// Actually this line is valid since `export` means "used",
// so this directive is superfluous
// deno-lint-ignore no-unused-vars
export const foo = 42;
```

**Valid:**

```typescript
export const foo = 42;
```

---

# lint/rules/button-has-type.md

URL: https://docs.deno.com/lint/rules/button-has-type

Checks that a `<button>` JSX element has a valid `type` attribute. A button without an explicit `type` defaults to `"submit"`, which is often not the desired behavior.

**Invalid:**

```tsx
const btn = <button>click</button>;
const btn = <button type="unknown">click</button>;
```

**Valid:**

```tsx
const btn = <button type="button">click</button>;
const btn = <button type="submit">click</button>;
const btn = <button type="reset">click</button>;
const btn = <button type={btnType}>click</button>;
```

---

# lint/rules/jsx-curly-braces.md

URL: https://docs.deno.com/lint/rules/jsx-curly-braces

Ensure consistent use of curly braces around JSX expressions.

**Invalid:**

```tsx
const foo = <Foo bar=<div /> />;
const foo = <div foo={"foo"} />;
const foo = <div>
  {"foo"}
</div>;
```

**Valid:**

```tsx
const foo = <Foo bar={<div />} />;
const foo = <div foo="foo" />;
const foo = <div>
  foo
</div>;
```

---

# lint/rules/jsx-key.md

URL: https://docs.deno.com/lint/rules/jsx-key

Ensure the `key` attribute is present when passing iterables into JSX. It allows frameworks to optimize checking the order of elements.

**Invalid:**

```tsx
const foo = [
  <div>foo</div>,
];
const foo = [<>foo</>];
[1, 2, 3].map(() => <div />);
Array.from([1, 2, 3], () => <div />);
```

**Valid:**

```tsx
const foo = [
  <div key="a">foo</div>,
];
const foo = [<Fragment key="b">foo</Fragment>];
[1, 2, 3].map((x) => <div key={x} />);
Array.from([1, 2, 3], (x) => <div key={x} />);
```

---

# lint/rules/jsx-no-children-prop.md

URL: https://docs.deno.com/lint/rules/jsx-no-children-prop

Pass children as JSX children instead of as an attribute.

**Invalid:**

```tsx
<div children="foo" />;
<div children={[<Foo />, <Bar />]} />;
```

**Valid:**

```tsx
<div>foo</div>;
<div>
  <Foo />
  <Bar />
</div>;
```

---

# lint/rules/jsx-no-comment-text-nodes.md

URL: https://docs.deno.com/lint/rules/jsx-no-comment-text-nodes

JavaScript comments inside text nodes are rendered as plain text in JSX. This is often unexpected.

**Invalid:**

```tsx
<div>
  // comment
  /* comment */
</div>;
```

**Valid:**

```tsx
<div>
  {/* comment */}
</div>;
```

---

# lint/rules/jsx-no-duplicate-props.md

URL: https://docs.deno.com/lint/rules/jsx-no-duplicate-props

Disallow duplicated JSX props. Later props will always overwrite earlier props, often leading to unexpected results.

**Invalid:**

```tsx
<div id="1" id="2" />;
<Foo bar baz bar />;
<Foo bar={2} bar={3} />;
```

**Valid:**

```tsx
<div id="1" />;
<Foo bar baz />;
```

---

# lint/rules/jsx-no-unescaped-entities.md

URL: https://docs.deno.com/lint/rules/jsx-no-unescaped-entities

Leaving the `>` or `}` character in JSX is often undesired and difficult to spot. Enforce that these characters must be passed as strings.

**Invalid:**

```tsx
<div>
  >
</div>;
<div>
  }
</div>;
```

**Valid:**

```tsx
<div>
  &gt;
</div>;
<div>
  {">"}
</div>;
<div>
  {"}"}
</div>;
```

---

# lint/rules/jsx-no-useless-fragment.md

URL: https://docs.deno.com/lint/rules/jsx-no-useless-fragment

Fragments are only necessary at the top of a JSX "block" and only when there are multiple children. Fragments are not needed in other scenarios.

**Invalid:**

```tsx
<></>;
<><Foo /></>;
<>
  <Foo />
</>;
<p>foo <>bar</></p>;
```

**Valid:**

```tsx
<>{foo}</>;
<>
  <Foo />
  <Bar />
</>;
<p>foo bar</p>;
```

---

# lint/rules/jsx-props-no-spread-multi.md

URL: https://docs.deno.com/lint/rules/jsx-props-no-spread-multi

Spreading the same expression twice is typically a mistake and causes unnecessary computations.

**Invalid:**

```tsx
<div {...props} {...props} />;
```

**Valid:**

```tsx
<div {...props} />;
```

---

# lint/rules/jsx-void-dom-elements-no-children.md

URL: https://docs.deno.com/lint/rules/jsx-void-dom-elements-no-children

Ensure that void elements in HTML don't have any children as that is not valid HTML. See [`Void element` article on MDN](https://developer.mozilla.org/en-US/docs/Glossary/Void_element) for more information.

**Invalid:**

```tsx
<br>foo</br>;
<img>foo</img>;
```

**Valid:**

```tsx
<br />;
<img src="a.jpg" />;
```

---

# lint/rules/no-array-constructor.md

URL: https://docs.deno.com/lint/rules/no-array-constructor

Enforce conventional usage of array construction. Array construction is conventionally done via literal notation such as `[]` or `[1, 2, 3]`. Using `new Array()` is discouraged, as is `new Array(1, 2, 3)`. There are two reasons for this. The first is that a single supplied argument defines the array length, while multiple arguments instead populate the array of no fixed size. This confusion is avoided when pre-populated arrays are only created using literal notation. The second reason to avoid the `Array` constructor is that the `Array` global may be redefined.

The one exception to this rule is when creating a new array of fixed size, e.g. `new Array(6)`. This is the conventional way to create arrays of fixed length.

**Invalid:**

```typescript
// This is 4 elements, not a size 100 array of 3 elements
const a = new Array(100, 1, 2, 3);

const b = new Array(); // use [] instead
```

**Valid:**

```typescript
const a = new Array(100);
const b = [];
const c = [1, 2, 3];
```

---

# lint/rules/no-async-promise-executor.md

URL: https://docs.deno.com/lint/rules/no-async-promise-executor

Requires that async promise executor functions are not used. Promise constructors take an executor function as an argument with `resolve` and `reject` parameters that can be used to control the state of the created Promise. This function is allowed to be async, but this is generally not a good idea for several reasons:

- If an async executor function throws an error, the error will be lost and won't cause the newly-constructed Promise to reject. This could make it difficult to debug and handle some errors.
- If an async Promise executor function is using `await`, then this is usually a sign that it is not actually necessary to use the `new Promise` constructor, and the code can be restructured to avoid the use of a promise, or the scope of the `new Promise` constructor can be reduced, extracting the async code and changing it to be synchronous.

**Invalid:**

```typescript
new Promise(async function (resolve, reject) {});
new Promise(async (resolve, reject) => {});
```

**Valid:**

```typescript
new Promise(function (resolve, reject) {});
new Promise((resolve, reject) => {});
```

---

# lint/rules/no-await-in-loop.md

URL: https://docs.deno.com/lint/rules/no-await-in-loop

Requires that `await` is not used in a for loop body. `async`/`await` is used in JavaScript to write asynchronous code that can run concurrently. If each element in the for loop is waited upon using `await`, then this negates the benefits of async/await, as no more elements in the loop can be processed until the current element finishes.

A common solution is to refactor the code to run the loop body asynchronously and capture the promises generated. After the loop finishes you can then await all the promises at once.

**Invalid:**

```javascript
async function doSomething(items) {
  const results = [];
  for (const item of items) {
    // Each item in the array blocks on the previous one finishing
    results.push(await someAsyncProcessing(item));
  }
  return processResults(results);
}
```

**Valid:**

```javascript
async function doSomething(items) {
  const results = [];
  for (const item of items) {
    // Kick off all item processing asynchronously...
    results.push(someAsyncProcessing(item));
  }
  // ...and then await their completion after the loop
  return processResults(await Promise.all(results));
}
```

---

# lint/rules/no-await-in-sync-fn.md

URL: https://docs.deno.com/lint/rules/no-await-in-sync-fn

Disallow the `await` keyword inside a non-async function. Using the `await` keyword inside a non-async function is a syntax error.
To be able to use `await` inside a function, the function needs to be marked as async via the `async` keyword.

**Invalid:**

```javascript
function foo() {
  await bar();
}

const fooFn = function foo() {
  await bar();
};

const fooFn = () => {
  await bar();
};
```

**Valid:**

```javascript
async function foo() {
  await bar();
}

const fooFn = async function foo() {
  await bar();
};

const fooFn = async () => {
  await bar();
};
```

---

# lint/rules/no-boolean-literal-for-arguments.md

URL: https://docs.deno.com/lint/rules/no-boolean-literal-for-arguments

Requires all functions called with any amount of `boolean` literals as parameters to use a self-documenting constant instead. It is common to define functions that can take `boolean`s as arguments. However, passing `boolean` literals as parameters can lead to a lack of context regarding the role of the argument inside the function in question. A simple fix for the points mentioned above is the use of self-documenting constants that end up working as "named booleans", which allow for a better understanding of what the parameters mean in the context of the function call.

**Invalid:**

```typescript
function redraw(allViews: boolean, inline: boolean) {
  // redraw logic.
}
redraw(true, true);

function executeCommand(recursive: boolean, executionMode: EXECUTION_MODES) {
  // executeCommand logic.
}
executeCommand(true, EXECUTION_MODES.ONE);

function enableLogs(enable: boolean) {
  // enableLogs logic.
}
enableLogs(true);
```

**Valid:**

```typescript
function redraw(allViews: boolean, inline: boolean) {
  // redraw logic.
}
const ALL_VIEWS = true, INLINE = true;
redraw(ALL_VIEWS, INLINE);

function executeCommand(recursive: boolean, executionMode: EXECUTION_MODES) {
  // executeCommand logic.
}
const RECURSIVE = true;
executeCommand(RECURSIVE, EXECUTION_MODES.ONE);

function enableLogs(enable: boolean) {
  // enableLogs logic.
}
const ENABLE = true;
enableLogs(ENABLE);
```

---

# lint/rules/no-case-declarations.md

URL: https://docs.deno.com/lint/rules/no-case-declarations

Requires lexical declarations (`let`, `const`, `function` and `class`) in switch `case` or `default` clauses to be scoped with brackets. Without brackets in the `case` or `default` block, the lexical declarations are visible to the entire switch block but only get initialized when they are assigned, which only happens if that case/default is reached. This can lead to unexpected errors. The solution is to ensure each `case` or `default` block is wrapped in brackets to limit the scope of its declarations.

**Invalid:**

```typescript
switch (choice) {
  // `let`, `const`, `function` and `class` are scoped to the entire switch statement here
  case 1:
    let a = "choice 1";
    break;
  case 2:
    const b = "choice 2";
    break;
  case 3:
    function f() {
      return "choice 3";
    }
    break;
  default:
    class C {}
}
```

**Valid:**

```typescript
switch (choice) {
  // The following `case` and `default` clauses are wrapped into blocks using brackets
  case 1: {
    let a = "choice 1";
    break;
  }
  case 2: {
    const b = "choice 2";
    break;
  }
  case 3: {
    function f() {
      return "choice 3";
    }
    break;
  }
  default: {
    class C {}
  }
}
```

---

# lint/rules/no-class-assign.md

URL: https://docs.deno.com/lint/rules/no-class-assign

Disallows modifying variables of class declarations. Declaring a class such as `class A {}` creates a variable `A`. Like any variable, this can be modified or reassigned. In most cases this is a mistake and not what was intended.

**Invalid:**

```typescript
class A {}
A = 0; // reassigning the class variable itself
```

**Valid:**

```typescript
class A {}
let c = new A();
c = 0; // reassigning the variable `c`
```

---

# lint/rules/no-compare-neg-zero.md

URL: https://docs.deno.com/lint/rules/no-compare-neg-zero

Disallows comparing against negative zero (`-0`). Comparing a value directly against negative zero may not work as expected, as it will also pass for non-negative zero (i.e.
`0` and `+0`). Explicit comparison with negative zero can be performed using `Object.is`.

**Invalid:**

```typescript
if (x === -0) {}
```

**Valid:**

```typescript
if (x === 0) {}

if (Object.is(x, -0)) {}
```

---

# lint/rules/no-cond-assign.md

URL: https://docs.deno.com/lint/rules/no-cond-assign

Disallows the use of the assignment operator, `=`, in conditional statements. Use of the assignment operator within a conditional statement is often the result of mistyping the equality operator, `==`. If an assignment within a conditional statement is required, then this rule allows it by wrapping the assignment in parentheses.

**Invalid:**

```typescript
let x;
if (x = 0) {
  let b = 1;
}
```

```typescript
function setHeight(someNode) {
  do {
    someNode.height = "100px";
  } while (someNode = someNode.parentNode);
}
```

**Valid:**

```typescript
let x;
if (x === 0) {
  let b = 1;
}
```

```typescript
function setHeight(someNode) {
  do {
    someNode.height = "100px";
  } while ((someNode = someNode.parentNode));
}
```

---

# lint/rules/no-console.md

URL: https://docs.deno.com/lint/rules/no-console

Disallows the use of the `console` global.

Oftentimes, developers accidentally commit `console.log`/`console.error` statements left in after debugging. Moreover, using these in code may leak sensitive information to the output or clutter the console with unnecessary information. This rule helps maintain clean and secure code by disallowing the use of `console`.

This rule is especially useful in libraries where you almost never want to output to the console.

**Invalid:**

```typescript
console.log("Debug message");
console.error("Debug message");
console.debug(obj);
if (debug) console.log("Debugging");

function log() {
  console.log("Log");
}
```

**Valid:** It is recommended to explicitly enable the console via a `deno-lint-ignore` comment for any calls where you actually want to use it.
```typescript
function logWarning(message: string) {
  // deno-lint-ignore no-console
  console.warn(message);
}
```

---

# lint/rules/no-const-assign.md

URL: https://docs.deno.com/lint/rules/no-const-assign

Disallows modifying a variable declared as `const`. Modifying a variable declared as `const` will result in a runtime error.

**Invalid:**

```typescript
const a = 0;
a = 1;
a += 1;
a++;
++a;
```

**Valid:**

```typescript
const a = 0;
const b = a + 1;

// `c` is out of scope on each loop iteration, allowing a new assignment
for (const c in [1, 2, 3]) {}
```

---

# lint/rules/no-constant-condition.md

URL: https://docs.deno.com/lint/rules/no-constant-condition

Disallows the use of a constant expression in a conditional test. Using a constant expression in a conditional test is often either a mistake or a temporary situation introduced during development and is not ready for production.

**Invalid:**

```typescript
if (true) {}
if (2) {}

do {} while (x = 2); // infinite loop
```

**Valid:**

```typescript
if (x) {}
if (x === 0) {}

do {} while (x === 2);
```

---

# lint/rules/no-control-regex.md

URL: https://docs.deno.com/lint/rules/no-control-regex

Disallows the use of ASCII control characters in regular expressions. Control characters are invisible characters in the ASCII range of 0-31. It is uncommon to use these in a regular expression, and more often it is a mistake in the regular expression.

**Invalid:**

```typescript
// Examples using ASCII (13) Carriage Return (hex x0d)
const pattern1 = /\x0d/;
const pattern2 = /\u000d/;
const pattern3 = new RegExp("\\x0d");
const pattern4 = new RegExp("\\u000d");
```

**Valid:**

```typescript
// Examples using ASCII (32) Space (hex x20)
const pattern1 = /\x20/;
const pattern2 = /\u0020/;
const pattern3 = new RegExp("\\x20");
const pattern4 = new RegExp("\\u0020");
```

---

# lint/rules/no-debugger.md

URL: https://docs.deno.com/lint/rules/no-debugger

Disallows the use of the `debugger` statement.
`debugger` is a statement which is meant to stop the JavaScript execution environment and start the debugger at the statement. Modern debuggers and tooling no longer need this statement, and leaving it in can cause the execution of your code to stop in production.

**Invalid:**

```typescript
function isLongString(x: string) {
  debugger;
  return x.length > 100;
}
```

**Valid:**

```typescript
function isLongString(x: string) {
  return x.length > 100; // set breakpoint here instead
}
```

---

# lint/rules/no-delete-var.md

URL: https://docs.deno.com/lint/rules/no-delete-var

Disallows the deletion of variables. `delete` is used to remove a property from an object. Variables declared via `var`, `let` and `const` cannot be deleted (`delete` will return `false`). Setting `strict` mode on will raise a syntax error when attempting to delete a variable.

**Invalid:**

```typescript
const a = 1;
let b = 2;
let c = 3;
delete a; // would return false
delete b; // would return false
delete c; // would return false
```

**Valid:**

```typescript
let obj = {
  a: 1,
};
delete obj.a; // returns true
```

---

# lint/rules/no-deprecated-deno-api.md

URL: https://docs.deno.com/lint/rules/no-deprecated-deno-api

Warns about the usage of deprecated Deno APIs.

The following APIs will be removed from the `Deno.*` namespace but have newer APIs to migrate to. See the [Deno 1.x to 2.x Migration Guide](https://docs.deno.com/runtime/manual/advanced/migrate_deprecations) for migration instructions.
- `Deno.Buffer`
- `Deno.Closer`
- `Deno.close()`
- `Deno.Conn.rid`
- `Deno.copy()`
- `Deno.customInspect`
- `Deno.File`
- `Deno.fstatSync()`
- `Deno.fstat()`
- `Deno.FsWatcher.rid`
- `Deno.ftruncateSync()`
- `Deno.ftruncate()`
- `Deno.futimeSync()`
- `Deno.futime()`
- `Deno.isatty()`
- `Deno.Listener.rid`
- `Deno.ListenTlsOptions.certFile`
- `Deno.ListenTlsOptions.keyFile`
- `Deno.readAllSync()`
- `Deno.readAll()`
- `Deno.Reader`
- `Deno.ReaderSync`
- `Deno.readSync()`
- `Deno.read()`
- `Deno.run()`
- `Deno.seekSync()`
- `Deno.seek()`
- `Deno.serveHttp()`
- `Deno.Server`
- `Deno.shutdown`
- `Deno.stderr.rid`
- `Deno.stdin.rid`
- `Deno.stdout.rid`
- `Deno.TlsConn.rid`
- `Deno.UnixConn.rid`
- `Deno.writeAllSync()`
- `Deno.writeAll()`
- `Deno.Writer`
- `Deno.WriterSync`
- `Deno.writeSync()`
- `Deno.write()`
- `new Deno.FsFile()`

The following APIs will be removed from the `Deno.*` namespace without replacement.

- `Deno.resources()`
- `Deno.metrics()`

---

# lint/rules/no-dupe-args.md

URL: https://docs.deno.com/lint/rules/no-dupe-args

Disallows using an argument name more than once in a function signature. If you supply multiple arguments of the same name to a function, the last instance will shadow the preceding one(s). This is most likely an unintentional typo.

**Invalid:**

```typescript
function withDupes(a, b, a) {
  console.log("I'm the value of the second a:", a);
}
```

**Valid:**

```typescript
function withoutDupes(a, b, c) {
  console.log("I'm the value of the first (and only) a:", a);
}
```

---

# lint/rules/no-dupe-class-members.md

URL: https://docs.deno.com/lint/rules/no-dupe-class-members

Disallows using a class member function name more than once. Declaring a function of the same name twice in a class will cause the previous declaration(s) to be overwritten, causing unexpected behaviors.
**Invalid:**

```typescript
class Foo {
  bar() {}
  bar() {}
}
```

**Valid:**

```typescript
class Foo {
  bar() {}
  fizz() {}
}
```

---

# lint/rules/no-dupe-else-if.md

URL: https://docs.deno.com/lint/rules/no-dupe-else-if

Disallows using the same condition twice in an `if`/`else if` statement. When you reuse a condition in an `if`/`else if` statement, the duplicate condition will never be reached (without unusual side-effects), meaning this is almost always a bug.

**Invalid:**

```typescript
if (a) {}
else if (b) {}
else if (a) {} // duplicate of condition above

if (a === 5) {}
else if (a === 6) {}
else if (a === 5) {} // duplicate of condition above
```

**Valid:**

```typescript
if (a) {}
else if (b) {}
else if (c) {}

if (a === 5) {}
else if (a === 6) {}
else if (a === 7) {}
```

---

# lint/rules/no-dupe-keys.md

URL: https://docs.deno.com/lint/rules/no-dupe-keys

Disallows duplicate keys in object literals. Setting the same key multiple times in an object literal will override other assignments to that key and can cause unexpected behavior.

**Invalid:**

```typescript
const foo = {
  bar: "baz",
  bar: "qux",
};
```

```typescript
const foo = {
  "bar": "baz",
  bar: "qux",
};
```

```typescript
const foo = {
  0x1: "baz",
  1: "qux",
};
```

**Valid:**

```typescript
const foo = {
  bar: "baz",
  quxx: "qux",
};
```

---

# lint/rules/no-duplicate-case.md

URL: https://docs.deno.com/lint/rules/no-duplicate-case

Disallows using the same case clause in a switch statement more than once. When you reuse a case test expression in a `switch` statement, the duplicate case will never be reached, meaning this is almost always a bug.
**Invalid:** ```typescript const someText = "a"; switch (someText) { case "a": // (1) break; case "b": break; case "a": // duplicate of (1) break; default: break; } ``` **Valid:** ```typescript const someText = "a"; switch (someText) { case "a": break; case "b": break; case "c": break; default: break; } ``` --- # lint/rules/no-empty-character-class.md URL: https://docs.deno.com/lint/rules/no-empty-character-class Disallows using the empty character class in a regular expression. Regular expression character classes are a series of characters in brackets, e.g. `[abc]`. If nothing is supplied in the brackets, it will not match anything, which is likely a typo or mistake. **Invalid:** ```typescript /^abc[]/.test("abcdefg"); // false, as `d` does not match an empty character class "abcdefg".match(/^abc[]/); // null ``` **Valid:** ```typescript // Without a character class /^abc/.test("abcdefg"); // true "abcdefg".match(/^abc/); // ["abc"] // With a valid character class /^abc[a-z]/.test("abcdefg"); // true "abcdefg".match(/^abc[a-z]/); // ["abcd"] ``` --- # lint/rules/no-empty-enum.md URL: https://docs.deno.com/lint/rules/no-empty-enum Disallows the declaration of an empty enum. An enum with no members serves no purpose. This rule will capture these situations as either unnecessary code or a mistaken empty implementation. **Invalid:** ```typescript enum Foo {} ``` **Valid:** ```typescript enum Foo { ONE = "ONE", } ``` --- # lint/rules/no-empty-interface.md URL: https://docs.deno.com/lint/rules/no-empty-interface Disallows the declaration of an empty interface. An interface with no members serves no purpose. This rule will capture these situations as either unnecessary code or a mistaken empty implementation. **Invalid:** ```typescript interface Foo {} ``` **Valid:** ```typescript interface Foo { name: string; } interface Bar { age: number; } // Using an empty interface with at least one extension is allowed.
// Using an empty interface to change the identity of Baz from type to interface. type Baz = { profession: string }; interface Foo extends Baz {} // Using an empty interface to extend already existing Foo declaration // with members of the Bar interface interface Foo extends Bar {} // Using an empty interface to combine the members of multiple interfaces interface Baz extends Foo, Bar {} ``` --- # lint/rules/no-empty-pattern.md URL: https://docs.deno.com/lint/rules/no-empty-pattern Disallows the use of empty patterns in destructuring. In destructuring, it is possible to use empty patterns such as `{}` or `[]`, which have no effect and are most likely not what the author intended. **Invalid:** ```typescript // In these examples below, {} and [] are not object literals or empty arrays, // but placeholders for destructured variable names const {} = someObj; const [] = someArray; const {a: {}} = someObj; const {a: []} = someObj; function myFunc({}) {} function myFunc([]) {} ``` **Valid:** ```typescript const { a } = someObj; const [a] = someArray; // Correct way to default destructured variable to object literal const { a = {} } = someObj; // Correct way to default destructured variable to empty array const [a = []] = someArray; function myFunc({ a }) {} function myFunc({ a = {} }) {} function myFunc([a]) {} function myFunc([a = []]) {} ``` --- # lint/rules/no-empty.md URL: https://docs.deno.com/lint/rules/no-empty Disallows the use of empty block statements. Empty block statements are legal but often represent that something was missed and can make code less readable. This rule ignores block statements that only contain comments. This rule also ignores empty constructors and function bodies (including arrow functions).
**Invalid:** ```typescript if (foo) {} while (foo) {} switch (foo) {} try { doSomething(); } catch (e) { } finally { } ``` **Valid:** ```typescript if (foo) { // empty } while (foo) { /* empty */ } try { doSomething(); } catch (e) { // continue regardless of error } try { doSomething(); } finally { /* continue regardless of error */ } ``` --- # lint/rules/no-eval.md URL: https://docs.deno.com/lint/rules/no-eval Disallows the use of `eval`. `eval` is a potentially dangerous function which can open your code to a number of security vulnerabilities. In addition to being slow, `eval` is also often unnecessary with better solutions available. **Invalid:** ```typescript const obj = { x: "foo" }; const key = "x"; const value = eval("obj." + key); ``` **Valid:** ```typescript const obj = { x: "foo" }; const key = "x"; const value = obj[key]; ``` --- # lint/rules/no-ex-assign.md URL: https://docs.deno.com/lint/rules/no-ex-assign Disallows the reassignment of exception parameters. There is generally no good reason to reassign an exception parameter. Once reassigned, the code from that point on has no reference to the error anymore. **Invalid:** ```typescript try { someFunc(); } catch (e) { e = true; // can no longer access the thrown error } ``` **Valid:** ```typescript try { someFunc(); } catch (e) { const anotherVar = true; } ``` --- # lint/rules/no-explicit-any.md URL: https://docs.deno.com/lint/rules/no-explicit-any Disallows use of the `any` type. Use of the `any` type disables the type check system around that variable, defeating the purpose of TypeScript, which is to provide type-safe code. Additionally, the use of `any` hinders code readability, since it is not immediately clear what type of value is being referenced. It is better to be explicit about all types. For a more type-safe alternative to `any`, use `unknown` if you are unable to choose a more specific type.
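To see why `unknown` is the safer escape hatch, here is a small sketch (with hypothetical variable names) contrasting the two:

```typescript
// `any` turns the type checker off: this assignment compiles even though
// `looselyTyped` actually holds a string at runtime.
const looselyTyped: any = "two";
const n: number = looselyTyped; // no compile error, but `n` is really a string
console.log(typeof n); // "string"

// `unknown` forces a runtime check before the value can be used as a number.
const safelyTyped: unknown = "two";
let narrowed = 0;
if (typeof safelyTyped === "number") {
  narrowed = safelyTyped + 1; // only reachable when it truly is a number
}
console.log(narrowed); // 0 -- the string never sneaks through
```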
**Invalid:** ```typescript const someNumber: any = "two"; function foo(): any { return undefined; } ``` **Valid:** ```typescript const someNumber: string = "two"; function foo(): undefined { return undefined; } ``` --- # lint/rules/no-external-import.md URL: https://docs.deno.com/lint/rules/no-external-import Disallows the use of external imports. - What's the motivation of this lint rule? - This rule emits warnings if external modules are imported via URL. "deps.ts" and import maps are exceptions. - Why is linted code considered bad? - Importing external modules works fine, but it will take time and effort to upgrade those modules if they are imported in multiple places in your project. To avoid this, you could use the "deps.ts convention" or [import maps](https://docs.deno.com/runtime/manual/basics/import_maps), where you import all external modules in one place and then re-export them or assign aliases to them. - When should it be used? - If you'd like to follow the "deps.ts convention" or use import maps. **Invalid:** ```typescript import { assertEquals } from "https://deno.land/std@0.126.0/testing/asserts.ts"; ``` **Valid:** ```typescript import { assertEquals } from "./deps.ts"; ``` ```typescript // deps.ts export { assert, assertEquals, assertStringIncludes, } from "https://deno.land/std@0.126.0/testing/asserts.ts"; ``` You can refer to the explanation of this convention at https://docs.deno.com/runtime/manual/basics/modules/#it-seems-unwieldy-to-import-urls-everywhere --- # lint/rules/no-extra-boolean-cast.md URL: https://docs.deno.com/lint/rules/no-extra-boolean-cast Disallows unnecessary boolean casts. In certain contexts, such as `if`, `while` or `for` statements, expressions are automatically coerced into a boolean. Therefore, techniques such as double negation (`!!foo`) or casting (`Boolean(foo)`) are unnecessary and produce the same result as without the negation or casting.
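The equivalence can be checked directly; a minimal sketch (with hypothetical sample values) comparing all three forms inside a condition:

```typescript
// Sketch: in a condition, `!!v` and `Boolean(v)` decide exactly as `v` does.
const samples: unknown[] = [0, 1, "", "deno", null, undefined, NaN];

for (const v of samples) {
  const plain = v ? "truthy" : "falsy";
  const doubleNegated = !!v ? "truthy" : "falsy";
  const casted = Boolean(v) ? "truthy" : "falsy";
  console.assert(plain === doubleNegated && plain === casted);
}
console.log("all three forms agree");
```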
**Invalid:** ```typescript if (!!foo) {} if (Boolean(foo)) {} while (!!foo) {} for (; Boolean(foo);) {} ``` **Valid:** ```typescript if (foo) {} while (foo) {} for (; foo;) {} ``` --- # lint/rules/no-extra-non-null-assertion.md URL: https://docs.deno.com/lint/rules/no-extra-non-null-assertion Disallows unnecessary non-null assertions. Non-null assertions are specified with an `!` saying to the compiler that you know this value is not null. Specifying this operator more than once in a row, or in combination with the optional chaining operator (`?.`), is confusing and unnecessary. **Invalid:** ```typescript const foo: { str: string } | null = null; const bar = foo!!.str; function myFunc(bar: undefined | string) { return bar!!; } function anotherFunc(bar?: { str: string }) { return bar!?.str; } ``` **Valid:** ```typescript const foo: { str: string } | null = null; const bar = foo!.str; function myFunc(bar: undefined | string) { return bar!; } function anotherFunc(bar?: { str: string }) { return bar?.str; } ``` --- # lint/rules/no-fallthrough.md URL: https://docs.deno.com/lint/rules/no-fallthrough Disallows the implicit fallthrough of case statements. Case statements without a `break` will execute their body and then fall through to the next case or default block and execute this block as well. While this is sometimes intentional, many times the developer has forgotten to add a break statement, intending only for a single case statement to be executed. This rule enforces that you either end each case statement with a break statement or an explicit comment that fallthrough was intentional. The fallthrough comment must contain one of `fallthrough`, `falls through` or `fall through`. **Invalid:** ```typescript switch (myVar) { case 1: console.log("1"); case 2: console.log("2"); } // If myVar = 1, outputs both `1` and `2`. Was this intentional?
``` **Valid:** ```typescript switch (myVar) { case 1: console.log("1"); break; case 2: console.log("2"); break; } // If myVar = 1, outputs only `1` switch (myVar) { case 1: console.log("1"); /* falls through */ case 2: console.log("2"); } // If myVar = 1, intentionally outputs both `1` and `2` ``` --- # lint/rules/no-func-assign.md URL: https://docs.deno.com/lint/rules/no-func-assign Disallows the overwriting/reassignment of an existing function. JavaScript allows for the reassignment of a function definition. This is generally a mistake on the developer's part, or poor coding practice, as code readability and maintainability will suffer. **Invalid:** ```typescript function foo() {} foo = bar; const a = function baz() { baz = "now I'm a string"; }; myFunc = existingFunc; function myFunc() {} ``` **Valid:** ```typescript function foo() {} const someVar = foo; const a = function baz() { const someStr = "now I'm a string"; }; const anotherFuncRef = existingFunc; let myFuncVar = function () {}; myFuncVar = bar; // variable reassignment, not function re-declaration ``` --- # lint/rules/no-global-assign.md URL: https://docs.deno.com/lint/rules/no-global-assign Disallows assignment to native JavaScript objects. In JavaScript, `String` and `Object`, for example, are native objects. Like any object, they can be reassigned, but it is almost never wise to do so as this can lead to unexpected results and difficult-to-track-down bugs. **Invalid:** ```typescript Object = null; undefined = true; window = {}; ``` --- # lint/rules/no-implicit-declare-namespace-export.md URL: https://docs.deno.com/lint/rules/no-implicit-declare-namespace-export Disallows the use of implicit exports in ["ambient" namespaces]. TypeScript implicitly exports all members of ["ambient" namespaces], unless a named export is present.
["ambient" namespaces]: https://www.typescriptlang.org/docs/handbook/namespaces.html#ambient-namespaces **Invalid:** ```ts // foo.ts or foo.d.ts declare namespace ns { interface ImplicitlyExported {} export type Exported = true; } ``` **Valid:** ```ts // foo.ts or foo.d.ts declare namespace ns { interface NonExported {} export {}; } declare namespace ns { interface Exported {} export { Exported }; } declare namespace ns { export interface Exported {} } ``` --- # lint/rules/no-import-assertions.md URL: https://docs.deno.com/lint/rules/no-import-assertions Disallows the `assert` keyword for import attributes. ES import attributes (previously called import assertions) have been changed to use the `with` keyword. The old syntax using `assert` is still supported, but deprecated. **Invalid:** ```typescript import obj from "./obj.json" assert { type: "json" }; import("./obj2.json", { assert: { type: "json" } }); ``` **Valid:** ```typescript import obj from "./obj.json" with { type: "json" }; import("./obj2.json", { with: { type: "json" } }); ``` --- # lint/rules/no-import-assign.md URL: https://docs.deno.com/lint/rules/no-import-assign Disallows reassignment of imported module bindings. ES module import bindings should be treated as read-only since modifying them during code execution will likely result in runtime errors. It also makes for poor code readability and difficult maintenance.
**Invalid:** ```typescript import defaultMod, { namedMod } from "./mod.js"; import * as modNameSpace from "./mod2.js"; defaultMod = 0; namedMod = true; modNameSpace.someExportedMember = "hello"; modNameSpace = {}; ``` **Valid:** ```typescript import defaultMod, { namedMod } from "./mod.js"; import * as modNameSpace from "./mod2.js"; // properties of bound imports may be set defaultMod.prop = 1; namedMod.prop = true; modNameSpace.someExportedMember.prop = "hello"; ``` --- # lint/rules/no-inferrable-types.md URL: https://docs.deno.com/lint/rules/no-inferrable-types Disallows easily inferrable types. Variable initializations to JavaScript primitives (and `null`) are obvious in their type. Specifying their type can add additional verbosity to the code. For example, with `const x: number = 5`, specifying `number` is unnecessary as it is obvious that `5` is a number. **Invalid:** ```typescript const a: bigint = 10n; const b: bigint = BigInt(10); const c: boolean = true; const d: boolean = !0; const e: number = 10; const f: number = Number("1"); const g: number = Infinity; const h: number = NaN; const i: null = null; const j: RegExp = /a/; const k: RegExp = RegExp("a"); const l: RegExp = new RegExp("a"); const m: string = "str"; const n: string = `str`; const o: string = String(1); const p: symbol = Symbol("a"); const q: undefined = undefined; const r: undefined = void someValue; class Foo { prop: number = 5; } function fn(s: number = 5, t: boolean = true) {} ``` **Valid:** ```typescript const a = 10n; const b = BigInt(10); const c = true; const d = !0; const e = 10; const f = Number("1"); const g = Infinity; const h = NaN; const i = null; const j = /a/; const k = RegExp("a"); const l = new RegExp("a"); const m = "str"; const n = `str`; const o = String(1); const p = Symbol("a"); const q = undefined; const r = void someValue; class Foo { prop = 5; } function fn(s = 5, t = true) {} ``` --- # lint/rules/no-inner-declarations.md URL: 
https://docs.deno.com/lint/rules/no-inner-declarations Disallows variable or function definitions in nested blocks. Function declarations in nested blocks can lead to less readable code and potentially unexpected results due to compatibility issues in different JavaScript runtimes. This does not apply to named or anonymous function expressions, which are valid in a nested block context. Variables declared with `var` in nested blocks can also lead to less readable code. Because these variables are hoisted to the module root, it is best to declare them there for clarity. Note that variables declared with `let` or `const` are block scoped and therefore this rule does not apply to them. **Invalid:** ```typescript if (someBool) { function doSomething() {} } function someFunc(someVal: number): void { if (someVal > 4) { var a = 10; } } ``` **Valid:** ```typescript function doSomething() {} if (someBool) {} var a = 10; function someFunc(someVal: number): void { var foo = true; if (someVal > 4) { let b = 10; const fn = function doSomethingElse() {}; } } ``` --- # lint/rules/no-invalid-regexp.md URL: https://docs.deno.com/lint/rules/no-invalid-regexp Disallows specifying invalid regular expressions in RegExp constructors. Specifying an invalid regular expression literal will result in a SyntaxError at compile time; however, specifying an invalid regular expression string in the RegExp constructor will only be discovered at runtime. **Invalid:** ```typescript const invalidRegExp = new RegExp(")"); ``` **Valid:** ```typescript const goodRegExp = new RegExp("."); ``` --- # lint/rules/no-invalid-triple-slash-reference.md URL: https://docs.deno.com/lint/rules/no-invalid-triple-slash-reference Warns against incorrect usage of triple-slash reference directives. Deno supports the triple-slash reference directives of `types`, `path`, `lib`, and `no-default-lib`. This lint rule checks if there is an invalid, badly-formed directive, because it is most likely a mistake.
Additionally, note that only the `types` directive is allowed in JavaScript files. This directive is useful for telling the TypeScript compiler the location of a type definition file that corresponds to a certain JavaScript file. However, the Deno manual for versions prior to v1.10 (e.g. [v1.9.2]) contained an incorrect statement saying that one should use the `path` directive in such cases. Actually, the `types` directive should be used. See [the latest manual] for more detail. So this rule also detects the usage of directives other than `types` in JavaScript files and suggests replacing them with the `types` directive. [v1.9.2]: https://deno.land/manual@v1.9.2/typescript/types#using-the-triple-slash-reference-directive [the latest manual]: https://deno.land/manual/typescript/types#using-the-triple-slash-reference-directive **Invalid:** _JavaScript_ ```javascript
/// <reference path="./mod.d.ts" />
/// <reference lib="es2017.string" />
/// <reference no-default-lib="true" />

// ... the rest of the JavaScript ...
``` _TypeScript_ ```typescript
/// <reference foo="bar" />

// ... the rest of the TypeScript ...
``` **Valid:** _JavaScript_ ```javascript
/// <reference types="./mod.d.ts" />
/// <reference types="./other.d.ts" />

// ... the rest of the JavaScript ...
``` _TypeScript_ ```typescript
/// <reference types="./mod.d.ts" />
/// <reference path="./mod.ts" />
/// <reference lib="es2017.string" />
/// <reference no-default-lib="true" />

// ... the rest of the TypeScript ...
``` --- # lint/rules/no-irregular-whitespace.md URL: https://docs.deno.com/lint/rules/no-irregular-whitespace Disallows the use of non-space or non-tab whitespace characters. Non-space or non-tab whitespace characters can be very difficult to spot in your code as editors will often render them invisibly. These invisible characters can cause issues or unexpected behaviors. Sometimes these characters are added inadvertently through copy/paste or incorrect keyboard shortcuts.
The following characters are disallowed: ```
\u000B - Line Tabulation (\v)
\u000C - Form Feed (\f)
\u00A0 - No-Break Space
\u0085 - Next Line
\u1680 - Ogham Space Mark
\u180E - Mongolian Vowel Separator
\ufeff - Zero Width No-Break Space
\u2000 - En Quad
\u2001 - Em Quad
\u2002 - En Space
\u2003 - Em Space
\u2004 - Three-Per-Em Space
\u2005 - Four-Per-Em Space
\u2006 - Six-Per-Em Space
\u2007 - Figure Space
\u2008 - Punctuation Space
\u2009 - Thin Space
\u200A - Hair Space
\u200B - Zero Width Space
\u2028 - Line Separator
\u2029 - Paragraph Separator
\u202F - Narrow No-Break Space
\u205f - Medium Mathematical Space
\u3000 - Ideographic Space
``` To fix this linting issue, replace instances of the above with regular spaces, tabs or new lines. If it's not obvious where the offending character(s) are, try retyping the line from scratch. --- # lint/rules/no-misused-new.md URL: https://docs.deno.com/lint/rules/no-misused-new Disallows defining `constructor`s for interfaces or `new` for classes. Specifying a `constructor` for an interface or defining a `new` method for a class is incorrect and should be avoided. **Invalid:** ```typescript class C { new(): C; } interface I { constructor(): void; } ``` **Valid:** ```typescript class C { constructor() {} } interface I { new (): C; } ``` --- # lint/rules/no-namespace.md URL: https://docs.deno.com/lint/rules/no-namespace Disallows the use of `namespace` and `module` keywords in TypeScript code. `namespace` and `module` are both thought of as outdated keywords for organizing the code. Instead, it is generally preferable to use ES2015 module syntax (e.g. `import`/`export`).
However, this rule still allows the use of these keywords in the following two cases: - they are used for defining ["ambient" namespaces] along with `declare` keywords - they are written in TypeScript's type definition files: `.d.ts` ["ambient" namespaces]: https://www.typescriptlang.org/docs/handbook/namespaces.html#ambient-namespaces **Invalid:** ```typescript // foo.ts module mod {} namespace ns {} ``` ```dts // bar.d.ts // all usage of `module` and `namespace` keywords are allowed in `.d.ts` ``` **Valid:** ```typescript // foo.ts declare global {} declare module mod1 {} declare module "mod2" {} declare namespace ns {} ``` ```dts // bar.d.ts module mod1 {} namespace ns1 {} declare global {} declare module mod2 {} declare module "mod3" {} declare namespace ns2 {} ``` --- # lint/rules/no-new-symbol.md URL: https://docs.deno.com/lint/rules/no-new-symbol Disallows the use of `new` operators with built-in `Symbol`s. `Symbol`s are created by being called as a function, but we sometimes call them with the `new` operator by mistake. This rule detects such misuse of the `new` operator. **Invalid:** ```typescript const foo = new Symbol("foo"); ``` **Valid:** ```typescript const foo = Symbol("foo"); function func(Symbol: typeof SomeClass) { // This `Symbol` is not the built-in one const bar = new Symbol(); } ``` --- # lint/rules/no-node-globals.md URL: https://docs.deno.com/lint/rules/no-node-globals Disallows the use of NodeJS global objects. NodeJS exposes a set of global objects that differs from Deno (and the web), so code should not assume they are available. Instead, import the objects from their defining modules as needed.
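The explicit-import style works in both runtimes; a minimal sketch:

```typescript
// Sketch: import `Buffer` from its defining module instead of assuming a
// Node-only global exists. `node:` specifiers resolve in both Node.js and Deno.
import { Buffer } from "node:buffer";

const buf = Buffer.from("foo", "utf-8");
console.log(buf.toString("base64")); // "Zm9v"
```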
**Invalid:** ```typescript // foo.ts const buf = Buffer.from("foo", "utf-8"); // Buffer is not a global object in Deno ``` **Valid:** ```typescript // foo.ts import { Buffer } from "node:buffer"; const foo = Buffer.from("foo", "utf-8"); ``` --- # lint/rules/no-non-null-asserted-optional-chain.md URL: https://docs.deno.com/lint/rules/no-non-null-asserted-optional-chain Disallows non-null assertions after an optional chain expression. `?.` optional chain expressions return `undefined` if the object is `null` or `undefined`. Using a `!` non-null assertion to assert that the result of a `?.` optional chain expression is non-nullable is likely wrong. **Invalid:** ```typescript foo?.bar!; foo?.bar()!; ``` **Valid:** ```typescript foo?.bar; foo?.bar(); ``` --- # lint/rules/no-non-null-assertion.md URL: https://docs.deno.com/lint/rules/no-non-null-assertion Disallows non-null assertions using the `!` postfix operator. TypeScript's `!` non-null assertion operator asserts to the type system that an expression is non-nullable, as in not `null` or `undefined`. Using assertions to tell the type system new information is often a sign that code is not fully type-safe. It's generally better to structure program logic so that TypeScript understands when values may be nullable. **Invalid:** ```typescript interface Example { property?: string; } declare const example: Example; const includes = example.property!.includes("foo"); ``` **Valid:** ```typescript interface Example { property?: string; } declare const example: Example; const includes = example.property?.includes("foo") ?? false; ``` --- # lint/rules/no-obj-calls.md URL: https://docs.deno.com/lint/rules/no-obj-calls Disallows calling built-in global objects like functions. The following built-in objects should not be invoked like functions, even though they look like constructors: - `Math` - `JSON` - `Reflect` - `Atomics` Calling these as functions would result in runtime errors.
This rule statically detects such misuse. **Invalid:** ```typescript const math = Math(); const newMath = new Math(); const json = JSON(); const newJSON = new JSON(); const reflect = Reflect(); const newReflect = new Reflect(); const atomics = Atomics(); const newAtomics = new Atomics(); ``` **Valid:** ```typescript const area = (radius: number): number => Math.PI * radius * radius; const parsed = JSON.parse('{ "foo": 42 }'); const x = Reflect.get({ x: 1, y: 2 }, "x"); const first = Atomics.load(foo, 0); ``` --- # lint/rules/no-octal.md URL: https://docs.deno.com/lint/rules/no-octal Disallows expressing octal numbers via numeric literals beginning with `0`. Octal numbers can be expressed via numeric literals with a leading `0` like `042`, but this expression often confuses programmers. That's why ECMAScript's strict mode throws a `SyntaxError` for such expressions. Since ES2015, the `0o` prefix is available as an alternative, and this newer form is the encouraged way to write octal numbers in today's code. **Invalid:** ```typescript const a = 042; const b = 7 + 042; ``` **Valid:** ```typescript const a = 0o42; const b = 7 + 0o42; const c = "042"; ``` --- # lint/rules/no-process-global.md URL: https://docs.deno.com/lint/rules/no-process-global Disallows the use of the NodeJS `process` global. NodeJS and Deno expose the `process` global, but it is hard for tools to statically analyze, so code should not assume it is available. Instead, use `import process from "node:process"`. **Invalid:** ```typescript // foo.ts const foo = process.env.FOO; ``` **Valid:** ```typescript // foo.ts import process from "node:process"; const foo = process.env.FOO; ``` --- # lint/rules/no-prototype-builtins.md URL: https://docs.deno.com/lint/rules/no-prototype-builtins Disallows the use of `Object.prototype` builtins directly. If objects are created via `Object.create(null)`, they have no prototype specified.
This can lead to runtime errors when you assume objects have properties from `Object.prototype` and attempt to call the following methods: - `hasOwnProperty` - `isPrototypeOf` - `propertyIsEnumerable` Instead, it's always encouraged to call these methods from `Object.prototype` explicitly. **Invalid:** ```typescript const a = foo.hasOwnProperty("bar"); const b = foo.isPrototypeOf("bar"); const c = foo.propertyIsEnumerable("bar"); ``` **Valid:** ```typescript const a = Object.prototype.hasOwnProperty.call(foo, "bar"); const b = Object.prototype.isPrototypeOf.call(foo, "bar"); const c = Object.prototype.propertyIsEnumerable.call(foo, "bar"); ``` --- # lint/rules/no-redeclare.md URL: https://docs.deno.com/lint/rules/no-redeclare Disallows redeclaration of variables, functions, or parameters with the same name. JavaScript allows us to redeclare variables with the same name using `var`, but redeclaration should not be used since it can make variables hard to trace. In addition, this lint rule disallows redeclaration using `let` or `const` as well, although ESLint allows it. This is useful because we can notice such an error before actually running the code. As for functions and parameters, JavaScript either silently overwrites the earlier declaration or, for duplicate parameters in strict mode, throws a `SyntaxError`; it's also beneficial to detect this sort of error statically. **Invalid:** ```typescript var a = 3; var a = 10; let b = 3; let b = 10; const c = 3; const c = 10; function d() {} function d() {} function e(arg: number) { var arg: number; } function f(arg: number, arg: string) {} ``` **Valid:** ```typescript var a = 3; function f() { var a = 10; } if (foo) { let b = 2; } else { let b = 3; } ``` --- # lint/rules/no-regex-spaces.md URL: https://docs.deno.com/lint/rules/no-regex-spaces Disallows multiple spaces in regular expression literals. Multiple consecutive spaces in regular expression literals are generally hard to read when the regex gets complicated.
Instead, it's better to use only one space character and specify how many times spaces should appear with the `{n}` syntax, for example: ```typescript // With multiple spaces in the regex literal, it is hard to tell how many // spaces are expected to be matched const re = /foo   bar/; // Instead, use the `{n}` syntax for readability const re = /foo {3}bar/; ``` **Invalid:** ```typescript const re1 = /  /; const re2 = /foo  bar/; const re3 = / a b  c d /; const re4 = /foo  {3}bar/; const re5 = new RegExp("  "); const re6 = new RegExp("foo  bar"); const re7 = new RegExp(" a b  c d "); const re8 = new RegExp("foo  {3}bar"); ``` **Valid:** ```typescript const re1 = /foo/; const re2 = / /; const re3 = / {3}/; const re4 = / +/; const re5 = / ?/; const re6 = / */; const re7 = new RegExp("foo"); const re8 = new RegExp(" "); const re9 = new RegExp(" {3}"); const re10 = new RegExp(" +"); const re11 = new RegExp(" ?"); const re12 = new RegExp(" *"); ``` --- # lint/rules/no-self-assign.md URL: https://docs.deno.com/lint/rules/no-self-assign Disallows self assignments. Self assignments like `a = a;` have no effect at all. If there are self assignments in the code, it most likely means that the author is still in the process of refactoring and there's remaining work to do. **Invalid:** ```typescript a = a; [a] = [a]; [a, b] = [a, b]; [a, b] = [a, c]; [a, ...b] = [a, ...b]; a.b = a.b; ``` **Valid:** ```typescript let a = a; a += a; a = [a]; [a, b] = [b, a]; a.b = a.c; ``` --- # lint/rules/no-self-compare.md URL: https://docs.deno.com/lint/rules/no-self-compare Disallows comparisons where both sides are exactly the same. Comparing a variable or value against itself is usually an error, either a typo or a refactoring error. It is confusing to the reader and may potentially introduce a runtime error.
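One historical reason for a deliberate self-comparison is the `NaN` check, since `NaN` is the only value not equal to itself; `Number.isNaN` expresses the same intent without tripping this rule. A minimal sketch:

```typescript
// `NaN` is the only JavaScript value that is not equal to itself.
const measurement = Number("not a number"); // NaN
console.log(measurement === measurement); // false -- the old-style NaN check

// Prefer the explicit, lint-friendly form:
console.log(Number.isNaN(measurement)); // true
console.log(Number.isNaN(42)); // false
```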
**Invalid:** ```typescript if (x === x) { } if ("x" === "x") { } if (a.b === a.b) { } if (a["b"] === a["b"]) { } ``` **Valid:** ```typescript if (x === y) { } if ("x" === "y") { } if (a.b === a.c) { } if (a["b"] === a["c"]) { } ``` --- # lint/rules/no-setter-return.md URL: https://docs.deno.com/lint/rules/no-setter-return Disallows returning values from setters. Setters are supposed to be used for setting some value to the property, which means that returning a value from a setter makes no sense. In fact, returned values are ignored and can never be used, although returning a value from a setter produces no error. This is why a static check for this mistake by the linter is quite beneficial. Note that returning without a value is allowed; this is a useful technique to do an early return from a function. **Invalid:** ```typescript const a = { set foo(x: number) { return "something"; }, }; class B { private set foo(x: number) { return "something"; } } const c = { set foo(x: boolean) { if (x) { return 42; } }, }; ``` **Valid:** ```typescript // return without a value is allowed since it is used to do early-return const a = { set foo(x: number) { if (x % 2 == 0) { return; } }, }; // not a setter, but a getter class B { get foo() { return 42; } } // not a setter const c = { set(x: number) { return "something"; }, }; ``` --- # lint/rules/no-shadow-restricted-names.md URL: https://docs.deno.com/lint/rules/no-shadow-restricted-names Disallows shadowing of restricted names. The following (a) properties of the global object, or (b) identifiers are "restricted" names in JavaScript: - [`NaN`] - [`Infinity`] - [`undefined`] - [`eval`] - [`arguments`] These names are _NOT_ reserved in JavaScript, which means that nothing prevents one from assigning other values into them (i.e. shadowing). In other words, you are allowed to use, say, `undefined` as an identifier or variable name.
(For more details see [MDN]) [`NaN`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/NaN [`Infinity`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Infinity [`undefined`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/undefined [`eval`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/eval [`arguments`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/arguments [MDN]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/undefined#description ```typescript function foo() { const undefined = "bar"; console.log(undefined); // output: "bar" } ``` Of course, shadowing like this most likely confuses other developers and should be avoided. This lint rule detects and warns about such shadowing. **Invalid:** ```typescript const undefined = 42; function NaN() {} function foo(Infinity) {} const arguments = () => {}; try { } catch (eval) {} ``` **Valid:** ```typescript // If not assigned a value, `undefined` may be shadowed let undefined; const Object = 42; function foo(a: number, b: string) {} try { } catch (e) {} ``` --- # lint/rules/no-sloppy-imports.md URL: https://docs.deno.com/lint/rules/no-sloppy-imports Enforces specifying explicit references to paths in module specifiers. Non-explicit specifiers are ambiguous and require probing for the correct file path on every run, which has a performance overhead. Note: This lint rule is only active when using `--unstable-sloppy-imports`. ### Invalid: ```typescript import { add } from "./math/add"; import { ConsoleLogger } from "./loggers"; ``` ### Valid: ```typescript import { add } from "./math/add.ts"; import { ConsoleLogger } from "./loggers/index.ts"; ``` --- # lint/rules/no-slow-types.md URL: https://docs.deno.com/lint/rules/no-slow-types Enforces using types that are explicit or can be simply inferred.
Read more: https://jsr.io/docs/about-slow-types --- # lint/rules/no-sparse-arrays.md URL: https://docs.deno.com/lint/rules/no-sparse-arrays Disallows sparse arrays. Sparse arrays are arrays that contain _empty slots_, which may later be treated as `undefined` values or skipped by array methods, and this can lead to unexpected behavior: ```typescript [1, , 2].join(); // => '1,,2' [1, undefined, 2].join(); // => '1,,2' [1, , 2].flatMap((item) => item); // => [1, 2] [1, undefined, 2].flatMap((item) => item); // => [1, undefined, 2] ``` **Invalid:** ```typescript const items = ["foo", , "bar"]; ``` **Valid:** ```typescript const items = ["foo", "bar"]; ``` --- # lint/rules/no-sync-fn-in-async-fn.md URL: https://docs.deno.com/lint/rules/no-sync-fn-in-async-fn Disallows sync functions inside async functions. Using sync functions like `Deno.readTextFileSync` blocks the Deno event loop, so they are not recommended inside an async function, because they stop the progress of all other async tasks. **Invalid:** ```javascript async function foo() { Deno.readTextFileSync(""); } const fooFn = async function foo() { Deno.readTextFileSync(""); }; const fooFn = async () => { Deno.readTextFileSync(""); }; ``` **Valid:** ```javascript async function foo() { await Deno.readTextFile(""); } function foo() { Deno.readTextFileSync(""); } const fooFn = function foo() { Deno.readTextFileSync(""); }; const fooFn = () => { Deno.readTextFileSync(""); }; ``` --- # lint/rules/no-this-alias.md URL: https://docs.deno.com/lint/rules/no-this-alias Disallows assigning variables to `this`. In most cases, storing a reference to `this` in a variable could be avoided by using arrow functions properly, since they establish `this` based on the scope where the arrow function is defined.
Let's take a look at a concrete example: ```typescript const obj = { count: 0, doSomethingLater() { setTimeout(function () { // this function executes in the global scope; `this` evaluates to `globalThis` this.count++; console.log(this.count); }, 300); }, }; obj.doSomethingLater(); // `NaN` is printed, because the property `count` is not in the global scope. ``` In the above example, `this` in the function passed to `setTimeout` evaluates to `globalThis`, which results in the expected value `1` not being printed. If you wanted to work around it without arrow functions, you would store a reference to `this` in another variable: ```typescript const obj = { count: 0, doSomethingLater() { const self = this; // store a reference to `this` in `self` setTimeout(function () { // use `self` instead of `this` self.count++; console.log(self.count); }, 300); }, }; obj.doSomethingLater(); // `1` is printed as expected ``` This is where arrow functions come in handy: with them, the code becomes much clearer and easier to understand: ```typescript const obj = { count: 0, doSomethingLater() { setTimeout(() => { // pass an arrow function // `this` evaluates to `obj` here this.count++; console.log(this.count); }, 300); }, }; obj.doSomethingLater(); // `1` is printed as expected ``` This example is taken from [MDN](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/Arrow_functions). **Invalid:** ```typescript const self = this; function foo() { const self = this; } const bar = () => { const self = this; }; ``` **Valid:** ```typescript const self = "this"; const [foo] = this; ``` --- # lint/rules/no-this-before-super.md URL: https://docs.deno.com/lint/rules/no-this-before-super Disallows use of `this` or `super` before calling `super()` in constructors. Accessing `this` or `super` before calling `super()` in the constructor of a derived class leads to a [`ReferenceError`].
To prevent this, the lint rule checks if there are accesses to `this` or `super` before calling `super()` in constructors. [`ReferenceError`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ReferenceError **Invalid:** ```typescript class A extends B { constructor() { this.foo = 0; super(); } } class C extends D { constructor() { super.foo(); super(); } } ``` **Valid:** ```typescript class A extends B { constructor() { super(); this.foo = 0; } } class C extends D { constructor() { super(); super.foo(); } } class E { constructor() { this.foo = 0; } } ``` --- # lint/rules/no-throw-literal.md URL: https://docs.deno.com/lint/rules/no-throw-literal Disallow throwing literals as exceptions. It is considered good practice to only `throw` the `Error` object itself, or an object that uses the `Error` object as its base for user-defined exceptions. The fundamental benefit of `Error` objects is that they automatically keep track of where they were built and originated. **Invalid:** ```typescript throw "error"; throw 0; throw undefined; throw null; ``` **Valid:** ```typescript throw new Error("error"); ``` --- # lint/rules/no-top-level-await.md URL: https://docs.deno.com/lint/rules/no-top-level-await Disallows the use of top level await expressions. Top level await cannot be used when distributing CommonJS/UMD via dnt. **Invalid:** ```typescript await foo(); for await (item of items) {} ``` **Valid:** ```typescript async function foo() { await task(); } async function foo() { for await (item of items) {} } ``` --- # lint/rules/no-undef.md URL: https://docs.deno.com/lint/rules/no-undef Disallow the use of undeclared variables. **Invalid:** ```typescript const foo = someFunction(); const bar = a + 1; ``` --- # lint/rules/no-unreachable.md URL: https://docs.deno.com/lint/rules/no-unreachable Disallows unreachable code after control flow statements.
Because the control flow statements (`return`, `throw`, `break` and `continue`) unconditionally exit a block of code, any statements after them cannot be executed. **Invalid:** ```typescript function foo() { return true; console.log("done"); } ``` ```typescript function bar() { throw new Error("Oops!"); console.log("done"); } ``` ```typescript while (value) { break; console.log("done"); } ``` ```typescript throw new Error("Oops!"); console.log("done"); ``` ```typescript function baz() { if (Math.random() < 0.5) { return; } else { throw new Error(); } console.log("done"); } ``` ```typescript for (;;) {} console.log("done"); ``` **Valid:** ```typescript function foo() { return bar(); function bar() { return 1; } } ``` --- # lint/rules/no-unsafe-finally.md URL: https://docs.deno.com/lint/rules/no-unsafe-finally Disallows the use of control flow statements within `finally` blocks. Use of the control flow statements (`return`, `throw`, `break` and `continue`) overrides the usage of any control flow statements that might have been used in the `try` or `catch` blocks, which is usually not the desired behaviour. **Invalid:** ```typescript let foo = function () { try { return 1; } catch (err) { return 2; } finally { return 3; } }; ``` ```typescript let foo = function () { try { return 1; } catch (err) { return 2; } finally { throw new Error(); } }; ``` **Valid:** ```typescript let foo = function () { try { return 1; } catch (err) { return 2; } finally { console.log("hola!"); } }; ``` --- # lint/rules/no-unsafe-negation.md URL: https://docs.deno.com/lint/rules/no-unsafe-negation Disallows the usage of negation operator `!` as the left operand of relational operators. 
`!` operators appearing in the left operand of the following operators will sometimes cause unexpected behavior because of operator precedence: - `in` operator - `instanceof` operator For example, when developers write code like `!key in someObject`, they most likely want it to behave like `!(key in someObject)`, but it actually behaves like `(!key) in someObject`. This lint rule warns about such usage of the `!` operator to make the code less confusing. **Invalid:** ```typescript if (!key in object) {} if (!foo instanceof Foo) {} ``` **Valid:** ```typescript if (!(key in object)) {} if (!(foo instanceof Foo)) {} if ((!key) in object) {} if ((!foo) instanceof Foo) {} ``` --- # lint/rules/no-unused-labels.md URL: https://docs.deno.com/lint/rules/no-unused-labels Disallows unused labels. A label that is declared but never used is most likely a developer's mistake. If that label is meant to be used, write the code so that it is used. Otherwise, remove the label. **Invalid:** ```typescript LABEL1: while (true) { console.log(42); } LABEL2: for (let i = 0; i < 5; i++) { console.log(42); } LABEL3: for (const x of xs) { console.log(x); } ``` **Valid:** ```typescript LABEL1: while (true) { console.log(42); break LABEL1; } LABEL2: for (let i = 0; i < 5; i++) { console.log(42); continue LABEL2; } for (const x of xs) { console.log(x); } ``` --- # lint/rules/no-unused-vars.md URL: https://docs.deno.com/lint/rules/no-unused-vars Enforces all variables are used at least once. If there are variables that are declared but not used anywhere, it's most likely because of incomplete refactoring. This lint rule detects and warns about such unused variables.
Variable `a` is considered to be "used" if any of the following conditions are satisfied: - its value is read out, like `console.log(a)` or `let otherVariable = a;` - it's called or constructed, like `a()` or `new a()` - it's exported, like `export const a = 42;` If a variable is only assigned a value but never read out, then it's considered to be _"not used"_. ```typescript let a; a = 42; // `a` is never read out ``` If you want to declare unused variables intentionally, prefix them with the underscore character `_`, like `_a`. This rule ignores variables that are prefixed with `_`. **Invalid:** ```typescript const a = 0; const b = 0; // this `b` is never used function foo() { const b = 1; // this `b` is used console.log(b); } foo(); let c = 2; c = 3; // recursive calls alone do not count as usage; `d` is only considered used // when it is called from outside its own function body. function d() { d(); } // `x` is never used export function e(x: number): number { return 42; } const f = "unused variable"; ``` **Valid:** ```typescript const a = 0; console.log(a); const b = 0; function foo() { const b = 1; console.log(b); } foo(); console.log(b); let c = 2; c = 3; console.log(c); function d() { d(); } d(); export function e(x: number): number { return x + 42; } export const f = "exported variable"; ``` --- # lint/rules/no-useless-rename.md URL: https://docs.deno.com/lint/rules/no-useless-rename Disallow useless rename operations where both the original and new name are exactly the same. This is often a leftover from a refactoring procedure and can be safely removed.
**Invalid:** ```ts import { foo as foo } from "foo"; const { foo: foo } = obj; export { foo as foo }; ``` **Valid:** ```ts import { foo as bar } from "foo"; const { foo: bar } = obj; export { foo as bar }; ``` --- # lint/rules/no-var.md URL: https://docs.deno.com/lint/rules/no-var Enforces the use of block-scoped variables over more error-prone function-scoped variables. Block-scoped variables are defined using the `const` and `let` keywords, which ensure the variables they define are not accessible outside their block scope. On the other hand, variables defined using the `var` keyword are only limited by their function scope. **Invalid:** ```typescript var foo = "bar"; ``` **Valid:** ```typescript const foo = 1; let bar = 2; ``` --- # lint/rules/no-window-prefix.md URL: https://docs.deno.com/lint/rules/no-window-prefix Disallows the use of Web APIs via the `window` object. In most situations, the global variable `window` works like `globalThis`. For example, you could call the `fetch` API like `window.fetch(..)` instead of `fetch(..)` or `globalThis.fetch(..)`. In Web Workers, however, `window` is not available, but instead `self`, `globalThis`, or no prefix work fine. Therefore, for compatibility between Web Workers and other contexts, it's highly recommended to not access global properties via `window`. Some APIs, including `window.alert`, `window.location` and `window.history`, may be called with `window` because these APIs are not supported or have different meanings in Workers. In other words, this lint rule complains about the use of `window` only if it's completely replaceable with `self`, `globalThis`, or no prefix.
**Invalid:** ```typescript const a = await window.fetch("https://deno.land"); const b = window.Deno.metrics(); ``` **Valid:** ```typescript const a1 = await fetch("https://deno.land"); const a2 = await globalThis.fetch("https://deno.land"); const a3 = await self.fetch("https://deno.land"); const b1 = Deno.metrics(); const b2 = globalThis.Deno.metrics(); const b3 = self.Deno.metrics(); // `alert` may be called with `window` because it's not supported in Workers window.alert("🍣"); // `location` is also allowed window.location.host; ``` --- # lint/rules/no-window.md URL: https://docs.deno.com/lint/rules/no-window Disallows the use of the `window` object. The `window` global is no longer available in Deno. Deno does not have a window and `typeof window === "undefined"` is often used to tell if the code is running in the browser. **Invalid:** ```typescript const a = await window.fetch("https://deno.land"); const b = window.Deno.metrics(); console.log(window); window.addEventListener("load", () => { console.log("Loaded."); }); ``` **Valid:** ```typescript const a1 = await fetch("https://deno.land"); const a2 = await globalThis.fetch("https://deno.land"); const a3 = await self.fetch("https://deno.land"); const b1 = Deno.metrics(); const b2 = globalThis.Deno.metrics(); const b3 = self.Deno.metrics(); console.log(globalThis); addEventListener("load", () => { console.log("Loaded."); }); ``` --- # lint/rules/no-with.md URL: https://docs.deno.com/lint/rules/no-with Disallows the usage of `with` statements. The `with` statement is discouraged as it may be the source of confusing bugs and compatibility issues. For more details, see [with - JavaScript | MDN].
[with - JavaScript | MDN]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/with **Invalid:** ```typescript with (someVar) { console.log("foo"); } ``` --- # lint/rules/prefer-as-const.md URL: https://docs.deno.com/lint/rules/prefer-as-const Recommends using const assertion (`as const`) over explicitly specifying literal types or using type assertion. When declaring a new variable of a primitive literal type, there are three ways: 1. adding an explicit type annotation 2. using normal type assertion (like `as "foo"`, or `<"foo">`) 3. using const assertion (`as const`) This lint rule suggests using const assertion because it will generally lead to safer code. For more details about const assertion, see [the official handbook](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-3-4.html#const-assertions). **Invalid:** ```typescript let a: 2 = 2; // type annotation let b = 2 as 2; // type assertion let c = <2> 2; // type assertion let d = { foo: 1 as 1 }; // type assertion ``` **Valid:** ```typescript let a = 2 as const; let b = 2 as const; let c = 2 as const; let d = { foo: 1 as const }; let x = 2; let y: string = "hello"; let z: number = someVariable; ``` --- # lint/rules/prefer-ascii.md URL: https://docs.deno.com/lint/rules/prefer-ascii Ensures that the code is fully written in ASCII characters. V8, the JavaScript engine Deno relies on, provides a mechanism by which strings can be stored outside V8's heap. In particular, if they are composed of one-byte characters only, V8 can handle them much more efficiently through [`v8::String::ExternalOneByteStringResource`]. To leverage this V8 feature in Deno's internals, this rule checks if all characters in the code are ASCII. [`v8::String::ExternalOneByteStringResource`]: https://v8.github.io/api/head/classv8_1_1String_1_1ExternalOneByteStringResource.html That said, you can also make use of this lint rule for something other than Deno's internal JavaScript code.
If you want to make sure your codebase is made up of ASCII characters only (e.g. you want to disallow non-ASCII identifiers) for some reason, then this rule will be helpful. **Invalid:** ```typescript const π = Math.PI; // string literals are also checked const ninja = "🥷"; function こんにちは(名前: string) { console.log(`こんにちは、${名前}さん`); } // “comments” are also checked // ^ ^ // | U+201D // U+201C ``` **Valid:** ```typescript const pi = Math.PI; const ninja = "ninja"; function hello(name: string) { console.log(`Hello, ${name}`); } // "comments" are also checked ``` --- # lint/rules/prefer-const.md URL: https://docs.deno.com/lint/rules/prefer-const Recommends declaring variables with [`const`] over [`let`]. Since ES2015, JavaScript supports [`let`] and [`const`] for declaring variables. If variables are declared with [`let`], then they become mutable; we can assign other values to them afterwards. Meanwhile, if declared with [`const`], they are immutable; they cannot be reassigned. In general, to make the codebase more robust, maintainable, and readable, it is highly recommended to use [`const`] instead of [`let`] wherever possible. The fewer mutable variables there are, the easier it is to keep track of variable states while reading through the code, and thus the less likely you are to write buggy code. So this lint rule checks if there are [`let`] variables that could potentially be declared with [`const`] instead. Note that this rule does not check for [`var`] variables. Instead, [the `no-var` rule](/lint/rules/no-var) is responsible for detecting and warning about [`var`] variables.
[`let`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/let [`const`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/const [`var`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/var **Invalid:** ```typescript let a = 0; let b = 0; someOperation(b); // `const` could be used instead for (let c in someObject) {} // `const` could be used instead for (let d of someArray) {} // variable that is uninitialized at first and then assigned in the same scope is NOT allowed // because we could simply write it like `const e = 2;` instead let e; e = 2; ``` **Valid:** ```typescript // uninitialized variable is allowed let a; let b = 0; b += 1; let c = 0; c = 1; // variable that is uninitialized at first and then assigned in the same scope _two or more times_ is allowed // because we cannot represent it with `const` let d; d = 2; d = 3; const e = 0; // `f` is mutated through `f++` for (let f = 0; f < someArray.length; f++) {} // variable that is initialized (or assigned) in another scope is allowed let g; function func1() { g = 42; } // conditionally initialized variable is allowed let h; if (trueOrFalse) { h = 0; } ``` --- # lint/rules/prefer-namespace-keyword.md URL: https://docs.deno.com/lint/rules/prefer-namespace-keyword Recommends the use of the `namespace` keyword over the `module` keyword when declaring a TypeScript module. TypeScript supports the `module` keyword for organizing code, but this wording can lead to confusion with ECMAScript modules. Since v1.5, TypeScript has provided the alternative keyword `namespace`, which should always be preferred when writing TypeScript these days. See the [TypeScript v1.5 release note](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-1-5.html#namespace-keyword) for more details.
**Invalid:** ```typescript module modA {} declare module modB {} ``` **Valid:** ```typescript namespace modA {} // "ambient modules" are allowed // https://www.typescriptlang.org/docs/handbook/modules.html#ambient-modules declare module "modB"; declare module "modC" {} ``` --- # lint/rules/prefer-primordials.md URL: https://docs.deno.com/lint/rules/prefer-primordials Suggests using frozen intrinsics from `primordials` rather than the default globals. This lint rule is dedicated to Deno's internal code. Normal users don't have to run this rule for their code. Primordials are a frozen set of all intrinsic objects in the runtime, which we should use in Deno's internals to avoid the risk of prototype pollution. This rule detects the direct use of global intrinsics and suggests replacing them with the corresponding ones from the `primordials` object. One such example is: ```javascript const arr = getSomeArrayOfNumbers(); const evens = arr.filter((val) => val % 2 === 0); ``` The second line of this example should be: ```javascript const evens = primordials.ArrayPrototypeFilter(arr, (val) => val % 2 === 0); ``` **Invalid:** ```javascript const arr = new Array(); const s = JSON.stringify({}); const i = parseInt("42"); const { ownKeys } = Reflect; ``` **Valid:** ```javascript const { Array } = primordials; const arr = new Array(); const { JSONStringify } = primordials; const s = JSONStringify({}); const { NumberParseInt } = primordials; const i = NumberParseInt("42"); const { ReflectOwnKeys } = primordials; ``` --- # lint/rules/react-no-danger-with-children.md URL: https://docs.deno.com/lint/rules/react-no-danger-with-children Using JSX children together with `dangerouslySetInnerHTML` is invalid as they will be ignored. **Invalid:** ```tsx
<div dangerouslySetInnerHTML={{ __html: "<p>hello</p>" }}> <p>this will never be rendered</p> </div>; ``` **Valid:** ```tsx <div dangerouslySetInnerHTML={{ __html: "<p>hello</p>" }} />; ``` --- # lint/rules/react-no-danger.md URL: https://docs.deno.com/lint/rules/react-no-danger Prevent the use of `dangerouslySetInnerHTML`, which can lead to XSS vulnerabilities if used incorrectly. **Invalid:** ```tsx const hello = <div dangerouslySetInnerHTML={{ __html: "Hello World!" }} />; ``` **Valid:** ```tsx const hello = <div>Hello World!</div>; ``` --- # lint/rules/react-rules-of-hooks.md URL: https://docs.deno.com/lint/rules/react-rules-of-hooks Ensure that hooks are called correctly in React/Preact components. They must be called at the top level of a component and not inside a conditional statement or a loop. **Invalid:** ```tsx // WRONG: Called inside condition function MyComponent() { if (condition) { const [count, setCount] = useState(0); } } // WRONG: Called inside for loop function MyComponent() { for (const item of items) { const [count, setCount] = useState(0); } } // WRONG: Called inside while loop function MyComponent() { while (condition) { const [count, setCount] = useState(0); } } // WRONG: Called after condition function MyComponent() { if (condition) { // ... } const [count, setCount] = useState(0); } ``` **Valid:** ```tsx function MyComponent() { const [count, setCount] = useState(0); } ``` --- # lint/rules/require-await.md URL: https://docs.deno.com/lint/rules/require-await Disallows async functions that have no await expression or await using declaration. In general, the primary reason to use async functions is to use await expressions or await using declarations inside. If an async function has neither, it is most likely an unintentional mistake.
**Invalid:** ```typescript async function f1() { doSomething(); } const f2 = async () => { doSomething(); }; const f3 = async () => doSomething(); const obj = { async method() { doSomething(); }, }; class MyClass { async method() { doSomething(); } } ``` **Valid:** ```typescript await asyncFunction(); function normalFunction() { doSomething(); } async function f1() { await asyncFunction(); } const f2 = async () => { await asyncFunction(); }; const f3 = async () => await asyncFunction(); async function f4() { for await (const num of asyncIterable) { console.log(num); } } async function f5() { await using resource = createResource(); } // empty functions are valid async function emptyFunction() {} const emptyArrowFunction = async () => {}; // generators are also valid async function* gen() { console.log(42); } ``` --- # lint/rules/require-yield.md URL: https://docs.deno.com/lint/rules/require-yield Disallows generator functions that have no `yield`. JavaScript provides generator functions expressed as `function*`, where we can pause and later resume function execution at intermediate points. At these points we use the `yield` keyword. In other words, it makes no sense at all to create generator functions that contain no `yield` keyword, since such functions could be written as normal functions. **Invalid:** ```typescript function* f1() { return "f1"; } ``` **Valid:** ```typescript function* f1() { yield "f1"; } // generator function with empty body is allowed function* f2() {} function f3() { return "f3"; } ``` --- # lint/rules/single-var-declarator.md URL: https://docs.deno.com/lint/rules/single-var-declarator Disallows multiple variable definitions in the same declaration statement.
**Invalid:** ```typescript const foo = 1, bar = "2"; ``` **Valid:** ```typescript const foo = 1; const bar = "2"; ``` --- # lint/rules/triple-slash-reference.md URL: https://docs.deno.com/lint/rules/triple-slash-reference Disallow certain triple slash directives in favor of ES6-style import declarations. TypeScript's `///` triple-slash references are a way to indicate that types from another module are available in a file. Use of triple-slash reference type directives is generally discouraged in favor of ECMAScript Module imports. This rule reports on the use of `/// <reference path="..." />`, `/// <reference types="..." />`, or `/// <reference lib="..." />` directives. **Invalid:** ```typescript /// <reference types="foo" /> import * as foo from "foo"; ``` **Valid:** ```typescript import * as foo from "foo"; ``` --- # lint/rules/use-isnan.md URL: https://docs.deno.com/lint/rules/use-isnan Disallows comparisons to `NaN`. Because `NaN` is unique in JavaScript by not being equal to anything, including itself, the results of comparisons to `NaN` are confusing: - `NaN === NaN` or `NaN == NaN` evaluate to `false` - `NaN !== NaN` or `NaN != NaN` evaluate to `true` Therefore, this rule requires the use of `isNaN()` or `Number.isNaN()` to determine whether a value is `NaN`. **Invalid:** ```typescript if (foo == NaN) { // ... } if (foo != NaN) { // ... } switch (NaN) { case foo: // ... } switch (foo) { case NaN: // ... } ``` **Valid:** ```typescript if (isNaN(foo)) { // ... } if (!isNaN(foo)) { // ... } ``` --- # lint/rules/valid-typeof.md URL: https://docs.deno.com/lint/rules/valid-typeof Restricts the use of the `typeof` operator to a specific set of string literals. When used with a value, the `typeof` operator returns one of the following strings: - `"undefined"` - `"object"` - `"boolean"` - `"number"` - `"string"` - `"function"` - `"symbol"` - `"bigint"` This rule disallows comparison with anything other than one of these string literals when using the `typeof` operator, as this likely represents a typing mistake in the string.
The rule also disallows comparing the result of a `typeof` operation with any non-string literal value, such as `undefined`, which can represent an inadvertent use of a keyword instead of a string. This includes comparing against string variables even if they contain one of the above values, as this cannot be guaranteed. An exception to this is comparing the results of two `typeof` operations, as these are both guaranteed to return one of the above strings. **Invalid:** ```typescript // typo typeof foo === "strnig"; typeof foo == "undefimed"; typeof bar != "nunber"; typeof bar !== "fucntion"; // compare with non-string literals typeof foo === undefined; typeof bar == Object; typeof baz === anotherVariable; typeof foo == 5; ``` **Valid:** ```typescript typeof foo === "undefined"; typeof bar == "object"; typeof baz === "string"; typeof bar === typeof qux; ``` --- # lint/rules/verbatim-module-syntax.md URL: https://docs.deno.com/lint/rules/verbatim-module-syntax Enforces type imports to be declared as type imports. This rule ensures that the code works when the `verbatimModuleSyntax` TypeScript compiler option is enabled. This is useful in libraries distributing TypeScript code in order to work in more scenarios. **Invalid:** ```typescript import { Person } from "./person.ts"; const person: Person = { name: "David", }; console.log(person); ``` ```typescript import { output, Person } from "./person.ts"; const person: Person = { name: "David", }; output(person); ``` **Valid:** ```typescript import type { Person } from "./person.ts"; const person: Person = { name: "David", }; console.log(person); ``` ```typescript import { output, type Person } from "./person.ts"; const person: Person = { name: "David", }; output(person); ``` --- # Architecture Overview > Deep dive into Deno's internal architecture, explaining core components like the runtime, compiler, and security sandbox. Learn how Deno processes requests and executes JavaScript/TypeScript code.
URL: https://docs.deno.com/runtime/contributing/architecture ## Deno and Linux analogy | **Linux** | **Deno** | | ------------------------------: | :--------------------------------- | | Processes | Web Workers | | Syscalls | Ops | | File descriptors (fd) | [Resource ids (rid)](#resources) | | Scheduler | Tokio | | Userland: libc++ / glib / boost | https://jsr.io/@std | | /proc/\$\$/stat | [Deno.metrics()](#metrics) | | man pages | deno types / https://docs.deno.com | ### Resources Resources (AKA `rid`) are Deno's version of file descriptors. They are integer values used to refer to open files, sockets, and other concepts. For testing it would be good to be able to query the system for how many open resources there are. ```ts console.log(Deno.resources()); // { 0: "stdin", 1: "stdout", 2: "stderr" } Deno.close(0); console.log(Deno.resources()); // { 1: "stdout", 2: "stderr" } ``` ### Metrics Metrics is Deno's internal counter for various statistics. ```shell > console.table(Deno.metrics()) ┌─────────────────────────┬───────────┐ │ (idx) │ Values │ ├─────────────────────────┼───────────┤ │ opsDispatched │ 9 │ │ opsDispatchedSync │ 0 │ │ opsDispatchedAsync │ 0 │ │ opsDispatchedAsyncUnref │ 0 │ │ opsCompleted │ 9 │ │ opsCompletedSync │ 0 │ │ opsCompletedAsync │ 0 │ │ opsCompletedAsyncUnref │ 0 │ │ bytesSentControl │ 504 │ │ bytesSentData │ 0 │ │ bytesReceived │ 856 │ └─────────────────────────┴───────────┘ ``` ## Conference - Ryan Dahl. (May 27, 2020). [An interesting case with Deno](https://www.youtube.com/watch?v=1b7FoBwxc7E). Deno Israel. - Bartek Iwańczuk. (Oct 6, 2020). [Deno internals - how modern JS/TS runtime is built](https://www.youtube.com/watch?v=AOvg_GbnsbA&t=35m13s). Paris Deno. --- # Documentation guidelines > Guide for contributing to Deno's documentation. Learn our documentation standards, writing style, and how to submit documentation changes. 
URL: https://docs.deno.com/runtime/contributing/docs We welcome and appreciate contributions to the Deno documentation. If you find an issue, or want to add to the docs, each page has an "Edit this page" button at the bottom of the page. Clicking this button will take you to the source file for that page in the [Deno docs repository](https://github.com/denoland/docs/). You can then make your changes and submit a pull request. Some pages in the Deno documentation are generated from source files in the Deno repository. These pages are not directly editable: - The [API reference](/api/deno/) pages are generated from type definitions in the Deno repository. - The [CLI reference](/runtime/reference/cli/) pages for each individual command are generated from source files in the Deno repository. If you find an issue with one of these pages, you can either submit a pull request to the Deno repository or raise an issue in the [Deno docs repository](https://github.com/denoland/docs/issues) and we'll get it fixed. ## Running the docs locally You can fork and clone the entire [Deno docs repository](https://github.com/denoland/docs) to your local machine and run the docs locally. This is useful if you want to see how your changes will look before submitting a pull request. 1. Fork the [Deno docs repository](https://github.com/denoland/docs). 2. Clone your fork to your local machine with `git clone`. 3. Change directory into the `docs` directory you just cloned. 4. Run the docs repo locally with `deno task serve`. 5. Open your browser and navigate to `http://localhost:3000`. 6. Optionally, generate the API documentation with `deno task reference`. To see a more detailed description of available tasks, check out the [Deno docs README](https://github.com/denoland/docs?tab=readme-ov-file#deno-docs). --- # Contributing an example > Learn how to create and contribute meaningful examples to the Deno docs.
URL: https://docs.deno.com/runtime/contributing/examples [Deno by Example](/examples/) is a collection of examples that demonstrate how to use Deno and its APIs. If you contribute an example, we'll send you a free pack of stickers! ![Deno stickers laid out on a table](./images/stickers.jpg) ## Contributing an example If you have a Deno example that you would like to share with the community, you can contribute it to the [Deno docs repository](https://github.com/denoland/docs?tab=readme-ov-file#examples) or make an issue if there's an example you'd like to see. If your example is merged, we'll credit you as the author and send you some awesome special edition Deno stickers so that you can show off your contributor status as a token of our appreciation. ## Getting your stickers If you've contributed an example, drop us an email at [docs@deno.com](mailto:docs@deno.com) and let us know so we can get your stickers out to you! --- # Contributing and support > Guide to contributing to the Deno project and ecosystem. Learn about different Deno repositories, contribution guidelines, and how to submit effective pull requests. URL: https://docs.deno.com/runtime/contributing/ We welcome and appreciate all contributions to Deno. This page serves as a helper to get you started on contributing. ## Projects There are numerous repositories in the [`denoland`](https://github.com/denoland) organization that are part of the Deno ecosystem. Repositories have different scopes, use different programming languages, and have varying difficulty levels when it comes to contributions. To help you decide which repository might be the best place to start contributing (and/or matches your interests), here's a short comparison (**codebases primarily comprise the languages in bold**): ### [deno](https://github.com/denoland/deno) This is the main repository that provides the `deno` CLI.
Languages: **Rust**, **JavaScript**, **TypeScript** ### [deno_std](https://github.com/denoland/deno_std) The standard library for Deno. Languages: **TypeScript**, WebAssembly ### [fresh](https://github.com/denoland/fresh) The next-gen web framework. Languages: **TypeScript**, TSX ### [deno_lint](https://github.com/denoland/deno_lint) Linter that powers the `deno lint` subcommand. Languages: **Rust** ### [deno_doc](https://github.com/denoland/deno_doc) Documentation generator that powers the `deno doc` subcommand and the reference documentation on https://docs.deno.com/api and https://jsr.io. Languages: **Rust** ### [rusty_v8](https://github.com/denoland/rusty_v8) Rust bindings for the V8 JavaScript engine. Very technical and low-level. Languages: **Rust** ### [serde_v8](https://github.com/denoland/deno_core/tree/main/serde_v8) Library that provides a bijection layer between V8 and Rust objects. Based on the [`serde`](https://crates.io/crates/serde) library. Very technical and low-level. Languages: **Rust** ### [deno_docker](https://github.com/denoland/deno_docker) Official Docker images for Deno. ## General remarks - Read the [style guide](/runtime/contributing/style_guide). - Please don't make [the benchmarks](https://deno.land/benchmarks) worse. - Ask for help in the [community chat room](https://discord.gg/deno). - If you are going to work on an issue, say so in the issue's comments _before_ you start working on it. - If you are going to work on a new feature, create an issue and discuss it with other contributors _before_ you start working on the feature; we appreciate all contributions, but not all proposed features will be accepted. We don't want you to spend hours working on code that might not be accepted. - Please be professional in the forums. We follow [Rust's code of conduct](https://www.rust-lang.org/policies/code-of-conduct) (CoC). Have a problem? Email [ry@tinyclouds.org](mailto:ry@tinyclouds.org).
## Submitting a pull request Before submitting a PR to any of the repos, please make sure the following is done: 1. Give the PR a descriptive title. Examples of good PR titles: - fix(std/http): Fix race condition in server - docs(console): Update docstrings - feat(doc): Handle nested re-exports Examples of bad PR titles: - fix #7123 - update docs - fix bugs 2. Ensure there is a related issue and that it is referenced in the PR text. 3. Ensure there are tests that cover the changes. ## Documenting APIs It is important to document all public APIs, and we want to do that inline with the code. This helps ensure that code and documentation are tightly coupled together. ### JavaScript and TypeScript All publicly exposed APIs and types, both via the `deno` module and the global/`window` namespace, should have JSDoc documentation. This documentation is parsed and available to the TypeScript compiler, and is therefore easy to provide further downstream. JSDoc blocks come just prior to the statement they apply to and are denoted by a leading `/**` before terminating with a `*/`. For example: ```ts /** A simple JSDoc comment */ export const FOO = "foo"; ``` Find more at: https://jsdoc.app/ ### Rust Use [this guide](https://doc.rust-lang.org/rustdoc/how-to-write-documentation.html) for writing documentation comments in Rust code. ## Profiling When contributing to performance-sensitive parts of the codebase, it's helpful to profile your changes to ensure they don't negatively impact performance or to verify your optimizations are effective. ### Using Samply [Samply](https://github.com/mstange/samply) is a sampling profiler for macOS and Linux that works well with Deno. It produces flamegraphs that help you visualize where CPU time is being spent.
```sh # Basic usage samply record -r 20000 deno run -A main.js ``` You can analyze the generated flamegraph to identify: - Hot spots where most CPU time is spent - Unexpected function calls - Potential areas for optimization When submitting performance-related contributions, including profiling data can help the team understand and validate your improvements. --- # Release Schedule > Overview of Deno's release cycle and versioning process. Learn about stable releases, canary builds, and how to manage different Deno versions including upgrading to specific builds. URL: https://docs.deno.com/runtime/contributing/release_schedule A new minor release of the `deno` CLI is scheduled every 12 weeks. See [Milestones on Deno's GitHub](https://github.com/denoland/deno/milestones) for the upcoming releases. There are usually several patch releases (done weekly) after a minor release; after that, a merge window for new features opens for the upcoming minor release. Stable releases can be found on the [GitHub releases page](https://github.com/denoland/deno/releases). ## Canary channel In addition to the stable channel described above, canaries are released multiple times daily (for each commit on main). You can upgrade to the latest canary release by running: ```console deno upgrade --canary ``` To update to a specific canary, pass the commit hash in the `--version` option: ```console deno upgrade --canary --version=973af61d8bb03c1709f61e456581d58386ed4952 ``` To switch back to the stable channel, run `deno upgrade`. Canaries can be downloaded from https://dl.deno.land. --- # Deno Style Guide > Comprehensive style guide for contributing to Deno's internal runtime code and standard library. Covers coding conventions, documentation standards, testing requirements, and best practices for TypeScript and Rust development.
URL: https://docs.deno.com/runtime/contributing/style_guide :::note Note that this is the style guide for **internal runtime code** in the Deno runtime, and in the Deno Standard Library. This is not meant as a general style guide for users of Deno. ::: ### Copyright Headers Most modules in the repository should have the following copyright header: ```ts // Copyright 2018-2024 the Deno authors. All rights reserved. MIT license. ``` If the code originates elsewhere, ensure that the file has the proper copyright headers. We only allow MIT, BSD, and Apache licensed code. ### Use underscores, not dashes in filenames Example: Use `file_server.ts` instead of `file-server.ts`. ### Add tests for new features Each module should contain or be accompanied by tests for its public functionality. ### TODO Comments TODO comments should usually include an issue or the author's GitHub username in parentheses. Example: ```ts // TODO(ry): Add tests. // TODO(#123): Support Windows. // FIXME(#349): Sometimes panics. ``` ### Meta-programming is discouraged. Including the use of Proxy Be explicit, even when it means more code. There are some situations where it may make sense to use such techniques, but in the vast majority of cases it does not. ### Inclusive code Please follow the guidelines for inclusive code outlined at https://chromium.googlesource.com/chromium/src/+/HEAD/styleguide/inclusive_code.md. ### Rust Follow Rust conventions and be consistent with existing code. ### TypeScript The TypeScript portion of the code base is the standard library `std`. #### Use TypeScript instead of JavaScript #### Do not use the filename `index.ts`/`index.js` Deno does not treat "index.js" or "index.ts" in a special way. Using these filenames suggests that they can be left out of the module specifier when they cannot, which is confusing. If a directory of code needs a default entry point, use the filename `mod.ts`.
The filename `mod.ts` follows Rust's convention, is shorter than `index.ts`, and doesn't come with any preconceived notions about how it might work. #### Exported functions: max 2 args, put the rest into an options object When designing function interfaces, stick to the following rules. 1. A function that is part of the public API takes 0-2 required arguments, plus (if necessary) an options object (so max 3 total). 2. Optional parameters should generally go into the options object. An optional parameter that's not in an options object might be acceptable if there is only one, and it seems inconceivable that we would add more optional parameters in the future. 3. The 'options' argument is the only argument that is a regular 'Object'. Other arguments can be objects, but they must be distinguishable from a 'plain' Object at runtime, by having either: - a distinguishing prototype (e.g. `Array`, `Map`, `Date`, `class MyThing`). - a well-known symbol property (e.g. an iterable with `Symbol.iterator`). This allows the API to evolve in a backwards compatible way, even when the position of the options object changes. ```ts // BAD: optional parameters not part of options object. (#2) export function resolve( hostname: string, family?: "ipv4" | "ipv6", timeout?: number, ): IPAddress[] {} ``` ```ts // GOOD. export interface ResolveOptions { family?: "ipv4" | "ipv6"; timeout?: number; } export function resolve( hostname: string, options: ResolveOptions = {}, ): IPAddress[] {} ``` ```ts export interface Environment { [key: string]: string; } // BAD: `env` could be a regular Object and is therefore indistinguishable // from an options object. (#3) export function runShellWithEnv(cmdline: string, env: Environment): string {} // GOOD. export interface RunShellOptions { env: Environment; } export function runShellWithEnv( cmdline: string, options: RunShellOptions, ): string {} ``` ```ts // BAD: more than 3 arguments (#1), multiple optional parameters (#2).
export function renameSync( oldname: string, newname: string, replaceExisting?: boolean, followLinks?: boolean, ) {} ``` ```ts // GOOD. interface RenameOptions { replaceExisting?: boolean; followLinks?: boolean; } export function renameSync( oldname: string, newname: string, options: RenameOptions = {}, ) {} ``` ```ts // BAD: too many arguments. (#1) export function pwrite( fd: number, buffer: ArrayBuffer, offset: number, length: number, position: number, ) {} ``` ```ts // BETTER. export interface PWrite { fd: number; buffer: ArrayBuffer; offset: number; length: number; position: number; } export function pwrite(options: PWrite) {} ``` Note: When one of the arguments is a function, you can adjust the order flexibly. See examples like [Deno.serve](https://docs.deno.com/api/deno/~/Deno.serve), [Deno.test](https://docs.deno.com/api/deno/~/Deno.test), [Deno.addSignalListener](https://docs.deno.com/api/deno/~/Deno.addSignalListener). See also [this post](https://twitter.com/jaffathecake/status/1646798390355697664). #### Export all interfaces that are used as parameters to an exported member Whenever you are using interfaces that are included in the parameters or return type of an exported member, you should export the interface that is used. Here is an example: ```ts // my_file.ts export interface Person { name: string; age: number; } export function createPerson(name: string, age: number): Person { return { name, age }; } // mod.ts export { createPerson } from "./my_file.ts"; export type { Person } from "./my_file.ts"; ``` #### Minimize dependencies; do not make circular imports Although `std` has no external dependencies, we must still be careful to keep internal dependencies simple and manageable. In particular, be careful not to introduce circular imports. #### If a filename starts with an underscore: `_foo.ts`, do not link to it There may be situations where an internal module is necessary but its API is not meant to be stable or linked to. 
In this case prefix it with an underscore. By convention, only files in its own directory should import it. #### Use JSDoc for exported symbols We strive for complete documentation. Every exported symbol ideally should have a documentation line. If possible, use a single line for the JSDoc. Example: ```ts /** foo does bar. */ export function foo() { // ... } ``` It is important that documentation is easily human-readable, but there is also a need to provide additional styling information to ensure generated documentation is richer. Therefore JSDoc should generally follow markdown markup to enrich the text. While markdown supports HTML tags, they are forbidden in JSDoc blocks. Code string literals should be braced with the back-tick (\`) instead of quotes. For example: ```ts /** Import something from the `deno` module. */ ``` Do not document function arguments unless their intent is non-obvious (though if their intent is non-obvious, the API should be reconsidered anyway). Therefore `@param` should generally not be used. If `@param` is used, it should not include the `type` as TypeScript is already strongly-typed. ```ts /** * Function with non-obvious param. * @param foo Description of non-obvious parameter. */ ``` Vertical spacing should be minimized whenever possible. Therefore, single-line comments should be written as: ```ts /** This is a good single-line JSDoc. */ ``` And not: ```ts /** * This is a bad single-line JSDoc. */ ``` Code examples should utilize markdown format, like so: ````ts /** A straightforward comment and an example: * ```ts * import { foo } from "deno"; * foo("bar"); * ``` */ ```` Code examples should not contain additional comments and must not be indented. It is already inside a comment. If it needs further comments, it is not a good example. #### Resolve linting problems using directives Currently, the building process uses `dlint` to validate linting problems in the code.
If the task requires code that does not conform to the linter, use a `deno-lint-ignore` directive to suppress the warning. ```typescript // deno-lint-ignore no-explicit-any let x: any; ``` This ensures the continuous integration process doesn't fail due to linting problems, but it should be used sparingly. #### Each module should come with a test module Every module with public functionality `foo.ts` should come with a test module `foo_test.ts`. A test for a `std` module should go in `std/tests` due to their different contexts; otherwise, it should just be a sibling to the tested module. #### Unit Tests should be explicit For a better understanding of the tests, each test should be named after the behavior it verifies, since the name is printed when the tests run. Like: ```console foo() returns bar object ... ok ``` Example of test: ```ts import { assertEquals } from "@std/assert"; import { foo } from "./mod.ts"; Deno.test("foo() returns bar object", function () { assertEquals(foo(), { bar: "bar" }); }); ``` Note: See [tracking issue](https://github.com/denoland/deno_std/issues/3754) for more information. #### Top-level functions should not use arrow syntax Top-level functions should use the `function` keyword. Arrow syntax should be limited to closures. Bad: ```ts export const foo = (): string => { return "bar"; }; ``` Good: ```ts export function foo(): string { return "bar"; } ``` Regular functions and arrow functions have different behavior with respect to hoisting, binding, arguments, and constructability. The `function` keyword clearly indicates the intent to define a function, improving legibility and traceability while debugging. #### Error Messages User-facing error messages raised from JavaScript / TypeScript should be clear, concise, and consistent. Error messages should be in sentence case but should not end with a period. Error messages should be free of grammatical errors and typos and written in American English.
:::note Note that the error message style guide is a work in progress, and not all the error messages have been updated to conform to the current styles. ::: Error message styles that should be followed: 1. Messages should start with an upper-case letter: ```sh Bad: cannot parse input Good: Cannot parse input ``` 2. Messages should not end with a period: ```sh Bad: Cannot parse input. Good: Cannot parse input ``` 3. Messages should use quotes for string values: ```sh Bad: Cannot parse input hello, world Good: Cannot parse input "hello, world" ``` 4. Messages should state the action that led to the error: ```sh Bad: Invalid input x Good: Cannot parse input x ``` 5. Active voice should be used: ```sh Bad: Input x cannot be parsed Good: Cannot parse input x ``` 6. Messages should not use contractions: ```sh Bad: Can't parse input x Good: Cannot parse input x ``` 7. Messages should use a colon when providing additional information. Periods should never be used. Other punctuation may be used as needed: ```sh Bad: Cannot parse input x. value is empty Good: Cannot parse input x: value is empty ``` 8. Additional information should describe the current state; if possible, it should also describe the desired state in an affirmative voice: ```sh Bad: Cannot compute the square root for x: value must not be negative Good: Cannot compute the square root for x: current value is ${x} Better: Cannot compute the square root for x as x must be >= 0: current value is ${x} ``` ### std #### Do not depend on external code. `https://jsr.io/@std` is intended to be baseline functionality that all Deno programs can rely on. We want to guarantee to users that this code does not include potentially unreviewed third-party code. #### Document and maintain browser compatibility. If a module is browser-compatible, include the following in the JSDoc at the top of the module: ```ts // This module is browser-compatible.
``` Maintain browser compatibility for such a module by either not using the global `Deno` namespace or feature-testing for it. Make sure any new dependencies are also browser compatible. #### Prefer # over private keyword We prefer the private fields (`#`) syntax over the `private` keyword of TypeScript in the standard modules codebase. Private fields make the properties and methods private even at runtime. On the other hand, TypeScript's `private` keyword guarantees privacy only at compile time; the fields are publicly accessible at runtime. Good: ```ts class MyClass { #foo = 1; #bar() {} } ``` Bad: ```ts class MyClass { private foo = 1; private bar() {} } ``` #### Naming convention Use `camelCase` for functions, methods, fields, and local variables. Use `PascalCase` for classes, types, interfaces, and enums. Use `UPPER_SNAKE_CASE` for static top-level items, such as `string`, `number`, `bigint`, `boolean`, `RegExp`, arrays of static items, records of static keys and values, etc. Good: ```ts function generateKey() {} let currentValue = 0; class KeyObject {} type SharedKey = {}; enum KeyType { PublicKey, PrivateKey, } const KEY_VERSION = "1.0.0"; const KEY_MAX_LENGTH = 4294967295; const KEY_PATTERN = /^[0-9a-f]+$/; ``` Bad: ```ts function generate_key() {} let current_value = 0; function GenerateKey() {} class keyObject {} type sharedKey = {}; enum keyType { publicKey, privateKey, } const key_version = "1.0.0"; const key_maxLength = 4294967295; const KeyPattern = /^[0-9a-f]+$/; ``` When names are in `camelCase` or `PascalCase`, always follow those rules, even when parts of the name are acronyms. Note: Web APIs use uppercase acronyms (`JSON`, `URL`, `URL.createObjectURL()` etc.). The Deno Standard Library does not follow this convention.
Good: ```ts class HttpObject { } ``` Bad: ```ts class HTTPObject { } ``` Good: ```ts function convertUrl(url: URL) { return url.href; } ``` Bad: ```ts function convertURL(url: URL) { return url.href; } ``` --- # deno.json and package.json > The guide to configuring your Deno projects. Learn about TypeScript settings, tasks, dependencies, formatting, linting, and how to use both deno.json and/or package.json effectively. URL: https://docs.deno.com/runtime/fundamentals/configuration You can configure Deno using a `deno.json` file. This file can be used to configure the TypeScript compiler, linter, formatter, and other Deno tools. The configuration file supports `.json` and [`.jsonc`](https://code.visualstudio.com/docs/languages/json#_json-with-comments) extensions. Deno will automatically detect a `deno.json` or `deno.jsonc` configuration file if it's in your current working directory or parent directories. The `--config` flag can be used to specify a different configuration file. ## package.json support Deno also supports a `package.json` file for compatibility with Node.js projects. If you have a Node.js project, it is not necessary to create a `deno.json` file. Deno will use the `package.json` file to configure the project. If both a `deno.json` and `package.json` file are present in the same directory, Deno will understand dependencies specified in both `deno.json` and `package.json`; and use the `deno.json` file for Deno-specific configurations. Read more about [Node compatibility in Deno](/runtime/fundamentals/node/#node-compatibility). ## Dependencies The `"imports"` field in your `deno.json` allows you to specify dependencies used in your project. You can use it to map bare specifiers to URLs or file paths making it easier to manage dependencies and module resolution in your applications. 
For example, if you want to use the `assert` module from the standard library in your project, you could use this import map: ```json title="deno.json" { "imports": { "@std/assert": "jsr:@std/assert@^1.0.0", "chalk": "npm:chalk@5" } } ``` Then your script can use the bare specifier `@std/assert`: ```js title="script.ts" import { assertEquals } from "@std/assert"; import chalk from "chalk"; assertEquals(1, 2); console.log(chalk.yellow("Hello world")); ``` You can also use a `"dependencies"` field in `package.json`: ```json title="package.json" { "dependencies": { "express": "^4.0.0" } } ``` ```js title="script.ts" import express from "express"; const app = express(); ``` Note that this will require you to run `deno install`. Read more about [module imports and dependencies](/runtime/fundamentals/modules/). ### Custom path mappings The import map in `deno.json` can be used for more general path mapping of specifiers. You can map an exact specifier to a third party module or a file directly, or you can map a part of an import specifier to a directory. ```jsonc title="deno.jsonc" { "imports": { // Map to an exact file "foo": "./some/long/path/foo.ts", // Map to a directory, usage: "bar/file.ts" "bar/": "./some/folder/bar/" } } ``` Usage: ```ts import * as foo from "foo"; import * as bar from "bar/file.ts"; ``` Path mapping of import specifiers is commonly used in larger code bases for brevity. To use your project root for absolute imports: ```json title="deno.json" { "imports": { "/": "./", "./": "./" } } ``` ```ts title="main.ts" import { MyUtil } from "/util.ts"; ``` This causes import specifiers starting with `/` to be resolved relative to the import map's URL or file path. ### Overriding packages The `links` field in `deno.json` allows you to override dependencies with local packages stored on disk. This is similar to `npm link`.
```json title="deno.json" { "links": [ "../some-package" ] } ``` This capability addresses several common development challenges: - Dependency bug fixes - Private local libraries - Compatibility issues The package being referenced doesn't need to be published at all. It just needs to have the proper package name and metadata in `deno.json` or `package.json`, so that Deno knows what package it's dealing with. This provides greater flexibility and modularity, maintaining clean separation between your main code and external packages. ## Tasks The `tasks` field in your `deno.json` file is used to define custom commands that can be executed with the `deno task` command and allows you to tailor commands and permissions to the specific needs of your project. It is similar to the `scripts` field in a `package.json` file, which is also supported. ```json title="deno.json" { "tasks": { "start": "deno run --allow-net --watch=static/,routes/,data/ dev.ts", "test": "deno test --allow-net", "lint": "deno lint" } } ``` ```json title="package.json" { "scripts": { "dev": "vite dev", "build": "vite build" } } ``` To execute a task, use the `deno task` command followed by the task name. For example: ```sh deno task start deno task test deno task lint deno task dev deno task build ``` Read more about [`deno task`](/runtime/reference/cli/task_runner/). ## Linting The `lint` field in the `deno.json` file is used to configure the behavior of Deno’s built-in linter. This allows you to specify which files to include or exclude from linting, as well as customize the linting rules to suit your project’s needs. 
For example: ```json title="deno.json" { "lint": { "include": ["src/"], "exclude": ["src/testdata/", "src/fixtures/**/*.ts"], "rules": { "tags": ["recommended"], "include": ["ban-untagged-todo"], "exclude": ["no-unused-vars"] } } } ``` This configuration will: - only lint files in the `src/` directory, - not lint files in the `src/testdata/` directory or any TypeScript files in the `src/fixtures/` directory. - specify that the recommended linting rules should be applied, - add the `ban-untagged-todo`, and - exclude the `no-unused-vars` rule. You can find a full list of available linting rules in the [List of rules](/lint/) documentation page. Read more about [linting with Deno](/runtime/reference/cli/linter/). ## Formatting The `fmt` field in the `deno.json` file is used to configure the behavior of Deno’s built-in code formatter. This allows you to customize how your code is formatted, ensuring consistency across your project, making it easier to read and collaborate on. Here are the key options you can configure: ```json title="deno.json" { "fmt": { "useTabs": true, "lineWidth": 80, "indentWidth": 4, "semiColons": true, "singleQuote": true, "proseWrap": "preserve", "include": ["src/"], "exclude": ["src/testdata/", "src/fixtures/**/*.ts"] } } ``` This configuration will: - use tabs instead of spaces for indentation, - limit lines to 80 characters, - use an indentation width of 4 spaces, - add semicolons to the end of statements, - use single quotes for strings, - preserve prose wrapping, - format files in the `src/` directory, - exclude files in the `src/testdata/` directory and any TypeScript files in the `src/fixtures/` directory. Read more about [formatting your code with Deno](/runtime/fundamentals/linting_and_formatting/). ## Lockfile The `lock` field in the `deno.json` file is used to specify configuration of the lock file that Deno uses to [ensure the integrity of your dependencies](/runtime/fundamentals/modules/#integrity-checking-and-lock-files). 
A lock file records the exact versions and integrity hashes of the modules your project depends on, ensuring that the same versions are used every time the project is run, even if the dependencies are updated or changed remotely. ```json title="deno.json" { "lock": { "path": "./deno.lock", "frozen": true } } ``` This configuration will: - specify the lockfile location as `./deno.lock` (this is the default and can be omitted) - tell Deno that you want to error out if any dependency changes Deno uses a lockfile by default; you can disable it with the following configuration: ```json title="deno.json" { "lock": false } ``` ## Node modules directory By default Deno uses a local `node_modules` directory if you have a `package.json` file in your project directory. You can control this behavior using the `nodeModulesDir` field in the `deno.json` file. ```json title="deno.json" { "nodeModulesDir": "auto" } ``` You can set this field to the following values: | Value | Behavior | | ---------- | ----------------------------------------------------------------------------------------------------------------------------------- | | `"none"` | Don't use a local `node_modules` directory. Instead use the global cache in `$DENO_DIR` that is automatically kept up to date by Deno. | | `"auto"` | Use a local `node_modules` directory. The directory is automatically created and kept up to date by Deno. | | `"manual"` | Use a local `node_modules` directory. User must keep this directory up to date manually, e.g. using `deno install` or `npm install`. | It is not required to specify this setting; the following defaults are applied: - `"none"` if there is no `package.json` file in your project directory - `"manual"` if there is a `package.json` file in your project directory When using workspaces, this setting can only be used in the workspace root. Specifying it in any of the members will result in warnings.
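For instance, a workspace root `deno.json` can set the directory mode once for every member (a minimal sketch; the member paths are illustrative, not prescribed):

```jsonc
// deno.json (workspace root) — hypothetical layout
{
  "workspace": ["./packages/app", "./packages/lib"],
  // applies to all members; members must not repeat it
  "nodeModulesDir": "auto"
}
```

Members then inherit the root's `nodeModulesDir` setting, and declaring it again in a member triggers the warning mentioned above.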
The `"manual"` setting will only be applied automatically if there's a `package.json` file in the workspace root. ## TypeScript compiler options The `compilerOptions` field in the `deno.json` file is used to configure [TypeScript compiler settings](https://www.typescriptlang.org/tsconfig) for your Deno project. This allows you to customize how TypeScript code is compiled, ensuring it aligns with your project’s requirements and coding standards. :::info Deno recommends the default TypeScript configuration. This will help when sharing code. ::: See also [Configuring TypeScript in Deno](/runtime/reference/ts_config_migration/). ## Unstable features The `unstable` field in a `deno.json` file is used to enable specific unstable features for your Deno project. These features are still in development and not yet part of the stable API. By listing features in the `unstable` array, you can experiment with and use these new capabilities before they are officially released. ```json title="deno.json" { "unstable": ["cron", "kv", "webgpu"] } ``` [Learn more](/runtime/reference/cli/unstable_flags/). ## include and exclude Many configurations (e.g. `lint`, `fmt`) have an `include` and `exclude` property for specifying the files to include. ### include Only the paths or patterns specified here will be included. ```jsonc { "lint": { // only lint the src/ directory "include": ["src/"] } } ``` ### exclude The paths or patterns specified here will be excluded. ```jsonc { "lint": { // don't lint the dist/ folder "exclude": ["dist/"] } } ``` This has HIGHER precedence than `include` and will win over `include` if a path is matched in both `include` and `exclude`. You may wish to exclude a directory, but include a subdirectory.
In Deno 1.41.2+, you may un-exclude a more specific path by specifying a negated glob below the more general exclude: ```jsonc { "fmt": { // don't format the "fixtures" directory, // but do format "fixtures/scripts" "exclude": [ "fixtures", "!fixtures/scripts" ] } } ``` ### Top level exclude If there's a directory you never want Deno to fmt, lint, type check, analyze in the LSP, etc., then specify it in the top level exclude array: ```jsonc { "exclude": [ // exclude the dist folder from all sub-commands and the LSP "dist/" ] } ``` Sometimes you may find that you want to un-exclude a path or pattern that's excluded in the top-level exclude. In Deno 1.41.2+, you may un-exclude a path by specifying a negated glob in a more specific config: ```jsonc { "fmt": { "exclude": [ // format the dist folder even though it's // excluded at the top level "!dist" ] }, "exclude": [ "dist/" ] } ``` ### Publish - Override .gitignore The `.gitignore` is taken into account for the `deno publish` command. In Deno 1.41.2+, you can opt out of excluding files ignored by the _.gitignore_ by using a negated exclude glob: ```title=".gitignore" dist/ .env ``` ```jsonc title="deno.json" { "publish": { "exclude": [ // include the .gitignored dist folder "!dist/" ] } } ``` Alternatively, explicitly specifying the gitignored paths in an `"include"` works as well: ```json { "publish": { "include": [ "dist/", "README.md", "deno.json" ] } } ``` ## An example `deno.json` file ```json { "compilerOptions": { "allowJs": true, "lib": ["deno.window"], "strict": true }, "lint": { "include": ["src/"], "exclude": ["src/testdata/", "src/fixtures/**/*.ts"], "rules": { "tags": ["recommended"], "include": ["ban-untagged-todo"], "exclude": ["no-unused-vars"] } }, "fmt": { "useTabs": true, "lineWidth": 80, "indentWidth": 4, "semiColons": false, "singleQuote": true, "proseWrap": "preserve", "include": ["src/"], "exclude": ["src/testdata/", "src/fixtures/**/*.ts"] }, "lock": false, "nodeModulesDir": "auto",
  "unstable": ["webgpu"],
  "test": {
    "include": ["src/"],
    "exclude": ["src/testdata/", "src/fixtures/**/*.ts"]
  },
  "tasks": {
    "start": "deno run --allow-read main.ts"
  },
  "imports": {
    "oak": "jsr:@oak/oak"
  },
  "exclude": [
    "dist/"
  ]
}
```

This is an example of a `deno.json` file that configures the TypeScript compiler options, linter, formatter, node modules directory, etc. For a full list of available fields and configurations, see the [Deno configuration file schema](#json-schema).

## JSON schema

A JSON schema file is available for editors to provide autocompletion. The file is versioned and available at: [https://github.com/denoland/deno/blob/main/cli/schemas/config-file.v1.json](https://github.com/denoland/deno/blob/main/cli/schemas/config-file.v1.json)

## Proxies

Deno supports proxies for module downloads and the fetch API. Proxy configuration is read from [environment variables](https://docs.deno.com/runtime/reference/env_variables/#special-environment-variables): `HTTP_PROXY`, `HTTPS_PROXY`, and `NO_PROXY`. On Windows, if these environment variables are not found, Deno falls back to reading proxy settings from the registry.

---

# Debugging

> Complete guide to debugging Deno applications. Learn to use Chrome DevTools, VS Code debugger, and other debugging techniques for TypeScript/JavaScript code in Deno.

URL: https://docs.deno.com/runtime/fundamentals/debugging

Deno supports the [V8 Inspector Protocol](https://v8.dev/docs/inspector) used by Chrome, Edge and Node.js. This makes it possible to debug Deno programs using Chrome DevTools or other clients that support the protocol (for example VSCode).

To activate debugging capabilities run Deno with one of the following flags:

- `--inspect`
- `--inspect-wait`
- `--inspect-brk`

## --inspect

Using the `--inspect` flag will start your program with an inspector server which allows client connections from tools that support the V8 Inspector Protocol, for example Chrome DevTools.
Visit `chrome://inspect` in a Chromium-derived browser to connect Deno to the inspector server. This allows you to inspect your code, add breakpoints, and step through your code.

```sh
deno run --inspect your_script.ts
```

:::note

If you use the `--inspect` flag, the code will start executing immediately. If your program is short, you might not have enough time to connect the debugger before the program finishes execution. In such cases, try running with the `--inspect-wait` or `--inspect-brk` flag instead, or add a timeout at the end of your code.

:::

## --inspect-wait

The `--inspect-wait` flag will wait for a debugger to connect before executing your code.

```sh
deno run --inspect-wait your_script.ts
```

## --inspect-brk

The `--inspect-brk` flag will wait for a debugger to connect before executing your code and then put a breakpoint in your program as soon as you connect, allowing you to add additional breakpoints or evaluate expressions before resuming execution. **This is the most commonly used inspect flag**. JetBrains and VSCode IDEs use this flag by default.

```sh
deno run --inspect-brk your_script.ts
```

## Example with Chrome DevTools

Let's try debugging a program using Chrome DevTools. For this, we'll use [@std/http/file-server](https://jsr.io/@std/http#file-server), a static file server.

Use the `--inspect-brk` flag to break execution on the first line:

```sh
$ deno run --inspect-brk -RN jsr:@std/http/file-server
Debugger listening on ws://127.0.0.1:9229/ws/1e82c406-85a9-44ab-86b6-7341583480b1
...
```

In a Chromium-derived browser such as Google Chrome or Microsoft Edge, open `chrome://inspect` and click `Inspect` next to target:

![chrome://inspect](./images/debugger1.png)

It might take a few seconds after opening the DevTools to load all modules.

![DevTools opened](./images/debugger2.jpg)

You might notice that DevTools pauses execution on the first line of `_constants.ts` instead of `file_server.ts`.
This is expected behavior caused by the way ES modules are evaluated in JavaScript (`_constants.ts` is the left-most, bottom-most dependency of `file_server.ts`, so it is evaluated first).

At this point all source code is available in the DevTools, so let's open up `file_server.ts` and add a breakpoint there; go to the "Sources" pane and expand the tree:

![Open file_server.ts](./images/debugger3.jpg)

_Looking closely, you'll find duplicate entries for each file; one written regularly and one in italics. The former is the compiled source file (so in the case of `.ts` files it will be the emitted JavaScript source), while the latter is a source map for the file._

Next, add a breakpoint in the `listenAndServe` method:

![Break in file_server.ts](./images/debugger4.jpg)

As soon as we've added the breakpoint, DevTools automatically opens up the source map file, which allows us to step through the actual source code that includes types.

Now that we have our breakpoints set, we can resume the execution of our script so that we can inspect an incoming request. Hit the "Resume script execution" button to do so. You might even need to hit it twice!

Once our script is running, try sending a request and inspecting it in DevTools:

```sh
curl http://0.0.0.0:4507/
```

![Break in request handling](./images/debugger5.jpg)

At this point we can introspect the contents of the request and go step-by-step to debug the code.

## VSCode

Deno can be debugged using VSCode. This is best done with help from the official `vscode_deno` extension. Documentation for this can be found [here](/runtime/reference/vscode#using-the-debugger).

## JetBrains IDEs

_**Note**: make sure you have [this Deno plugin](https://plugins.jetbrains.com/plugin/14382-deno) installed and enabled in Preferences / Settings | Plugins.
For more information, see [this blog post](https://blog.jetbrains.com/webstorm/2020/06/deno-support-in-jetbrains-ides/)._ You can debug Deno using your JetBrains IDE by right-clicking the file you want to debug and selecting the `Debug 'Deno: '` option. ![Debug file](./images/jb-ide-debug.png) This will create a run/debug configuration with no permission flags set. If you want to configure them, open your run/debug configuration and add the required flags to the `Command` field. ## --log-level=debug If you're having trouble connecting to the inspector, you can use the `--log-level=debug` flag to get more information about what's happening. This will show you information like module resolution, network requests, and other permission checks. ```sh deno run --inspect-brk --log-level=debug your_script.ts ``` ## --strace-ops Deno ops are an [RPC](https://en.wikipedia.org/wiki/Remote_procedure_call) mechanism between JavaScript and Rust. They provide functionality like file I/O, networking, and timers to JavaScript. The `--strace-ops` flag will print out all ops that are being executed by Deno when a program is run along with their timings. ```sh deno run --strace-ops your_script.ts ``` Each op should have a `Dispatch` and a `Complete` event. The time between these two events is the time taken to execute the op. This flag can be useful for performance profiling, debugging hanging programs, or understanding how Deno works under the hood. ## OpenTelemetry integration For production applications or complex systems, OpenTelemetry provides a more comprehensive approach to observability and debugging. 
Deno includes built-in support for OpenTelemetry, allowing you to:

- Trace requests through your application
- Monitor application performance metrics
- Collect structured logs
- Export telemetry data to monitoring systems

To enable OpenTelemetry, run your application with the `--unstable-otel` flag:

```sh
OTEL_DENO=true deno run --unstable-otel your_script.ts
```

This will automatically collect and export runtime observability data, including:

- HTTP request traces
- Runtime metrics
- Console logs and errors

For full details on Deno's OpenTelemetry integration, including custom metrics, traces, and configuration options, see the [OpenTelemetry documentation](/runtime/fundamentals/open_telemetry).

---

# Foreign Function Interface (FFI)

> Learn how to use Deno's Foreign Function Interface (FFI) to call native libraries directly from JavaScript or TypeScript. Includes examples, best practices, and security considerations.

URL: https://docs.deno.com/runtime/fundamentals/ffi

Deno's Foreign Function Interface (FFI) allows JavaScript and TypeScript code to call functions in dynamic libraries written in languages like C, C++, or Rust. This enables you to integrate native code performance and capabilities directly into your Deno applications.

## Introduction to FFI

FFI provides a bridge between Deno's JavaScript runtime and native code. This allows you to:

- Use existing native libraries within your Deno applications
- Implement performance-critical code in languages like Rust or C
- Access operating system APIs and hardware features not directly available in JavaScript

Deno's FFI implementation is based on the `Deno.dlopen` API, which loads dynamic libraries and creates JavaScript bindings to the functions they export.
## Security considerations FFI requires explicit permission using the [`--allow-ffi`](/runtime/fundamentals/security#ffi-foreign-function-interface) flag, as native code runs outside of Deno's security sandbox: ```sh deno run --allow-ffi my_ffi_script.ts ``` Important security warning: Unlike JavaScript code running in the Deno sandbox, native libraries loaded via FFI have the same access level as the Deno process itself. This means they can: - Access the filesystem - Make network connections - Access environment variables - Execute system commands Always ensure you trust the native libraries you're loading through FFI. ## Basic usage The basic pattern for using FFI in Deno involves: 1. Defining the interface for the native functions you want to call 2. Loading the dynamic library using `Deno.dlopen()` 3. Calling the loaded functions Here's a simple example loading a C library: ```ts const dylib = Deno.dlopen("libexample.so", { add: { parameters: ["i32", "i32"], result: "i32" }, }); console.log(dylib.symbols.add(5, 3)); // 8 dylib.close(); ``` ## Supported types Deno's FFI supports a variety of data types for parameters and return values: | FFI Type | Deno | C | Rust | | ---------------------- | -------------------- | ------------------------ | ------------------------- | | `i8` | `number` | `char` / `signed char` | `i8` | | `u8` | `number` | `unsigned char` | `u8` | | `i16` | `number` | `short int` | `i16` | | `u16` | `number` | `unsigned short int` | `u16` | | `i32` | `number` | `int` / `signed int` | `i32` | | `u32` | `number` | `unsigned int` | `u32` | | `i64` | `bigint` | `long long int` | `i64` | | `u64` | `bigint` | `unsigned long long int` | `u64` | | `usize` | `bigint` | `size_t` | `usize` | | `isize` | `bigint` | `size_t` | `isize` | | `f32` | `number` | `float` | `f32` | | `f64` | `number` | `double` | `f64` | | `void`[1] | `undefined` | `void` | `()` | | `pointer` | `{} \| null` | `void *` | `*mut c_void` | | `buffer`[2] | `TypedArray \| null` | 
`uint8_t *` | `*mut u8` |
| `function`[3] | `{} \| null` | `void (*fun)()` | `Option<extern "C" fn()>` |
| `{ struct: [...] }`[4] | `TypedArray` | `struct MyStruct` | `MyStruct` |

As of Deno 1.25, the `pointer` type has been split into a `pointer` and a `buffer` type to ensure users take advantage of optimizations for Typed Arrays, and as of Deno 1.31 the JavaScript representation of `pointer` has become an opaque pointer object or `null` for null pointers.

- [1] `void` type can only be used as a result type.
- [2] `buffer` type accepts TypedArrays as parameters, but it always returns a pointer object or `null` when used as a result type, like the `pointer` type.
- [3] `function` type works exactly the same as the `pointer` type as a parameter and result type.
- [4] `struct` type is for passing and returning C structs by value (copy). The `struct` array must enumerate each of the struct's fields' types in order. The structs are padded automatically: packed structs can be defined by using an appropriate amount of `u8` fields to avoid padding. Only TypedArrays are supported as structs, and structs are always returned as `Uint8Array`s.
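The automatic padding described in footnote [4] can be made concrete with a small plain-TypeScript sketch (no FFI involved). For a C struct `{ uint8_t a; uint32_t b; }` with natural alignment, `a` sits at offset 0, three padding bytes follow, and `b` sits at offset 4, for a total size of 8 bytes:

```typescript
// Byte layout of struct { uint8_t a; uint32_t b; } under natural alignment:
// a at offset 0, 3 padding bytes, b at offset 4, total size 8.
const STRUCT_SIZE = 8;
const bytes = new Uint8Array(STRUCT_SIZE);
const view = new DataView(bytes.buffer);

view.setUint8(0, 7); // field a (u8) at offset 0
view.setUint32(4, 1234, true); // field b (u32) at offset 4, little-endian

// `bytes` is the Uint8Array you would pass (or receive) as the struct value
console.log(bytes.length); // 8
```

This is also why a "packed" layout can be emulated with a run of `u8` fields: you take over the byte positions yourself instead of relying on alignment.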
## Working with structs

You can define and use C structures in your FFI code. Struct types are declared with the `{ struct: [...] }` form from the table above, and struct values are passed as TypedArrays containing the struct's bytes:

```ts
// A Point struct with two f64 fields (x and y)
const pointStruct = { struct: ["f64", "f64"] } as const;

// Define the library interface
const dylib = Deno.dlopen("libexample.so", {
  distance: {
    parameters: [pointStruct, pointStruct],
    result: "f64",
  },
} as const);

// Create struct values as the raw bytes of two f64 fields
const point1 = new Uint8Array(new Float64Array([1.0, 2.0]).buffer);
const point2 = new Uint8Array(new Float64Array([4.0, 6.0]).buffer);

// Call the function with structs passed by value
const dist = dylib.symbols.distance(point1, point2);
```

## Working with callbacks

You can pass JavaScript functions as callbacks to native code:

```ts
const signatures = {
  setCallback: {
    parameters: ["function"],
    result: "void",
  },
  runCallback: {
    parameters: [],
    result: "void",
  },
} as const;

const dylib = Deno.dlopen("libexample.so", signatures);

// Create a callback function
const callback = new Deno.UnsafeCallback(
  { parameters: ["i32"], result: "void" } as const,
  (value) => {
    console.log("Callback received:", value);
  },
);

// Pass the callback to the native library
dylib.symbols.setCallback(callback.pointer);

// Later, this will trigger our JavaScript function
dylib.symbols.runCallback();

// Always clean up when done
callback.close();
```

## Best practices with FFI

1. Always close resources. Close libraries with `dylib.close()` and callbacks with `callback.close()` when done.
2. Prefer TypeScript. Use TypeScript for better type-checking when working with FFI.
3. Wrap FFI calls in try/catch blocks to handle errors gracefully.
4. Be extremely careful when using FFI, as native code can bypass Deno's security sandbox.
5. Keep the FFI interface as small as possible to reduce the attack surface.
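Practices 1 and 3 can be combined in a small helper that always releases the library, even when a call throws. This is an illustrative sketch, not part of Deno's API: the resource is anything with a `close()` method (such as the object returned by `Deno.dlopen`), and the stand-in object below is purely hypothetical so the example stays runnable without a native library:

```typescript
// Run `body` against a closable resource, guaranteeing close() runs
// even if body throws (best practices 1 and 3 combined).
function withLibrary<T extends { close(): void }, R>(
  lib: T,
  body: (lib: T) => R,
): R {
  try {
    return body(lib);
  } finally {
    lib.close(); // released on both success and failure
  }
}

// Usage with a hypothetical stand-in resource; in real code the first
// argument would be the result of Deno.dlopen(...)
const fake = {
  closed: false,
  close() {
    this.closed = true;
  },
};
const answer = withLibrary(fake, () => 42);
console.log(answer, fake.closed); // 42 true
```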
## Examples

### Using a Rust library

Here's an example of creating and using a Rust library with Deno:

First, create a Rust library:

```rust
// lib.rs
#[no_mangle]
pub extern "C" fn fibonacci(n: u32) -> u32 {
    if n <= 1 {
        return n;
    }
    fibonacci(n - 1) + fibonacci(n - 2)
}
```

Compile it as a dynamic library:

```sh
rustc --crate-type cdylib lib.rs
```

Then use it from Deno:

```ts
const libName = {
  windows: "./lib.dll",
  linux: "./liblib.so",
  darwin: "./liblib.dylib",
}[Deno.build.os];

const dylib = Deno.dlopen(
  libName,
  {
    fibonacci: { parameters: ["u32"], result: "u32" },
  } as const,
);

// Calculate the 10th Fibonacci number
const result = dylib.symbols.fibonacci(10);
console.log(`Fibonacci(10) = ${result}`); // 55

dylib.close();
```

### Examples

- [Netsaur](https://github.com/denosaurs/netsaur/blob/c1efc3e2df6e2aaf4a1672590a404143203885a6/packages/core/src/backends/cpu/mod.ts)
- [WebView_deno](https://github.com/webview/webview_deno/blob/main/src/ffi.ts)
- [Deno_sdl2](https://github.com/littledivy/deno_sdl2/blob/main/mod.ts)
- [Deno FFI examples repository](https://github.com/denoffi/denoffi_examples)

These community-maintained repositories include working examples of FFI integrations with various native libraries across different operating systems.

## Related Approaches to Native Code Integration

While Deno's FFI provides a direct way to call native functions, there are other approaches to integrate native code:

### Using Node-API (N-API) with Deno

Deno supports [Node-API (N-API)](https://nodejs.org/api/n-api.html) for compatibility with native Node.js addons. This enables you to use existing native modules written for Node.js.

Directly loading a Node-API addon:

```ts
import process from "node:process";
process.dlopen(module, "./native_module.node", 0);
```

Using an npm package that uses a Node-API addon:

```ts
import someNativeAddon from "npm:some-native-addon";
console.log(someNativeAddon.doSomething());
```

How is this different from FFI?
| **Aspect** | **FFI** | **Node-API Support** | | ----------- | ---------------------- | ------------------------------------------- | | Setup | No build step required | Requires precompiled binaries or build step | | Portability | Tied to library ABI | ABI-stable across versions | | Use Case | Direct library calls | Reuse Node.js addons | Node-API support is ideal for leveraging existing Node.js native modules, whereas FFI is best for direct, lightweight calls to native libraries. ## Alternatives to FFI Before using FFI, consider these alternatives: - [WebAssembly](/runtime/reference/wasm/), for portable native code that runs within Deno's sandbox. - Use `Deno.command` to execute external binaries and subprocesses with controlled permissions. - Check whether [Deno's native APIs](/api/deno) already provide the functionality you need. Deno's FFI capabilities provide powerful integration with native code, enabling performance optimizations and access to system-level functionality. However, this power comes with significant security considerations. Always be cautious when working with FFI and ensure you trust the native libraries you're using. --- # Writing an HTTP Server > A guide to creating HTTP servers in Deno. Learn about the Deno.serve API, request handling, WebSocket support, response streaming, and how to build production-ready HTTP/HTTPS servers with automatic compression. URL: https://docs.deno.com/runtime/fundamentals/http_server HTTP servers are the backbone of the web, allowing you to access websites, download files, and interact with web services. They listen for incoming requests from clients (like web browsers) and send back responses. When you build your own HTTP server, you have complete control over its behavior and can tailor it to your specific needs. You may be using it for local development, to serve your HTML, CSS, and JS files, or building a REST API - having your own server lets you define endpoints, handle requests and manage data. 
## Deno's built-in HTTP server

Deno has a built-in HTTP server API that allows you to write HTTP servers. The [`Deno.serve`](https://docs.deno.com/api/deno/~/Deno.serve) API supports HTTP/1.1 and HTTP/2.

### A "Hello World" server

The `Deno.serve` function takes a handler function that will be called for each incoming request, and is expected to return a response (or a promise resolving to a response).

Here is an example of a server that returns a "Hello, World!" response for each request:

```ts title="server.ts"
Deno.serve((_req) => {
  return new Response("Hello, World!");
});
```

The handler can also return a `Promise`, which means it can be an `async` function.

To run this server, you can use the `deno run` command:

```sh
deno run --allow-net server.ts
```

### Listening on a specific port

By default `Deno.serve` will listen on port `8000`, but this can be changed by passing a port number in the options bag as the first or second argument:

```js title="server.ts"
// To listen on port 4242.
Deno.serve({ port: 4242 }, handler);

// To listen on port 4242 and bind to 0.0.0.0.
Deno.serve({ port: 4242, hostname: "0.0.0.0" }, handler);
```

### Inspecting the incoming request

Most servers will not answer with the same response for every request. Instead they will change their answer depending on various aspects of the request: the HTTP method, the headers, the path, or the body contents. The request is passed in as the first argument to the handler function.
Here is an example showing how to extract various parts of the request: ```ts Deno.serve(async (req) => { console.log("Method:", req.method); const url = new URL(req.url); console.log("Path:", url.pathname); console.log("Query parameters:", url.searchParams); console.log("Headers:", req.headers); if (req.body) { const body = await req.text(); console.log("Body:", body); } return new Response("Hello, World!"); }); ``` :::caution Be aware that the `req.text()` call can fail if the user hangs up the connection before the body is fully received. Make sure to handle this case. Do note this can happen in all methods that read from the request body, such as `req.json()`, `req.formData()`, `req.arrayBuffer()`, `req.body.getReader().read()`, `req.body.pipeTo()`, etc. ::: ### Responding with real data Most servers do not respond with "Hello, World!" to every request. Instead they might respond with different headers, status codes, and body contents (even body streams). Here is an example of returning a response with a 404 status code, a JSON body, and a custom header: ```ts title="server.ts" Deno.serve((req) => { const body = JSON.stringify({ message: "NOT FOUND" }); return new Response(body, { status: 404, headers: { "content-type": "application/json; charset=utf-8", }, }); }); ``` ### Responding with a stream Response bodies can also be streams. Here is an example of a response that returns a stream of "Hello, World!" repeated every second: ```ts title="server.ts" Deno.serve((req) => { let timer: number; const body = new ReadableStream({ async start(controller) { timer = setInterval(() => { controller.enqueue("Hello, World!\n"); }, 1000); }, cancel() { clearInterval(timer); }, }); return new Response(body.pipeThrough(new TextEncoderStream()), { headers: { "content-type": "text/plain; charset=utf-8", }, }); }); ``` :::note Note the `cancel` function above. This is called when the client hangs up the connection. 
It is important to make sure that you handle this case, otherwise the server will keep queuing up messages forever, and eventually run out of memory.

:::

Be aware that the response body stream is "cancelled" when the client hangs up the connection. Make sure to handle this case. This can surface itself as an error in a `write()` call on a `WritableStream` object that is attached to the response body `ReadableStream` object (for example through a `TransformStream`).

### HTTPS support

To use HTTPS, pass two extra arguments in the options: `cert` and `key`. These are the contents of the certificate and key files, respectively.

```js
Deno.serve({
  port: 443,
  cert: Deno.readTextFileSync("./cert.pem"),
  key: Deno.readTextFileSync("./key.pem"),
}, handler);
```

:::note

To use HTTPS, you will need a valid TLS certificate and a private key for your server.

:::

### HTTP/2 support

HTTP/2 support is "automatic" when using the HTTP server APIs with Deno. You just need to create your server, and it will handle HTTP/1 or HTTP/2 requests seamlessly. HTTP/2 is also supported over cleartext with prior knowledge.

### Automatic body compression

The HTTP server has built-in automatic compression of response bodies. When a response is sent to a client, Deno determines if the response body can be safely compressed. This compression happens within the internals of Deno, so it is fast and efficient. Currently Deno supports gzip and brotli compression. A body is automatically compressed if the following conditions are true:

- The request has an [`Accept-Encoding`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Accept-Encoding) header which indicates the requester supports `br` for Brotli or `gzip`. Deno will respect the preference of the [quality value](https://developer.mozilla.org/en-US/docs/Glossary/Quality_values) in the header.
- The response includes a [`Content-Type`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Type) which is considered compressible.
(The list is derived from [`jshttp/mime-db`](https://github.com/jshttp/mime-db/blob/master/db.json) with the actual list [in the code](https://github.com/denoland/deno/blob/v1.21.0/ext/http/compressible.rs).)

- The response body is greater than 64 bytes.

When the response body is compressed, Deno will set the `Content-Encoding` header to reflect the encoding, as well as ensure the `Vary` header is adjusted or added to indicate which request headers affected the response.

In addition to the logic above, there are a few reasons why a response **won’t** be compressed automatically:

- The response contains a `Content-Encoding` header. This indicates your server has done some form of encoding already.
- The response contains a [`Content-Range`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Range) header. This indicates that your server is responding to a range request, where the bytes and ranges are negotiated outside of Deno's control.
- The response has a [`Cache-Control`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control) header which contains a [`no-transform`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control#other) value. This indicates that your server doesn’t want Deno or any downstream proxies to modify the response.

### Serving WebSockets

Deno can upgrade incoming HTTP requests to a WebSocket. This allows you to handle WebSocket endpoints on your HTTP servers.

To upgrade an incoming `Request` to a WebSocket, you use the `Deno.upgradeWebSocket` function. This returns an object consisting of a `Response` and a web standard `WebSocket` object. The returned response should be used to respond to the incoming request.

Because the WebSocket protocol is symmetrical, the `WebSocket` object is identical to the one that can be used for client side communication. Documentation for it can be found [on MDN](https://developer.mozilla.org/en-US/docs/Web/API/WebSocket).
```ts title="server.ts" Deno.serve((req) => { if (req.headers.get("upgrade") != "websocket") { return new Response(null, { status: 501 }); } const { socket, response } = Deno.upgradeWebSocket(req); socket.addEventListener("open", () => { console.log("a client connected!"); }); socket.addEventListener("message", (event) => { if (event.data === "ping") { socket.send("pong"); } }); return response; }); ``` The connection the WebSocket was created on can not be used for HTTP traffic after a WebSocket upgrade has been performed. :::note Note that WebSockets are only supported on HTTP/1.1 for now. ::: ## Default fetch export Another way to create an HTTP server in Deno is by exporting a default `fetch` function. [The fetch API](/api/web/~/fetch) initiates an HTTP request to retrieve data from across a network and is built into the Deno runtime. ```ts title="server.ts" export default { fetch(request) { const userAgent = request.headers.get("user-agent") || "Unknown"; return new Response(`User Agent: ${userAgent}`); }, } satisfies Deno.ServeDefaultExport; ``` You can run this file with the `deno serve` command: ```sh deno serve server.ts ``` The server will start and display a message in the console. Open your browser and navigate to [http://localhost:8000/](http://localhost:8000/) to see the user-agent information. The [`Deno.ServeDefaultExport`](https://docs.deno.com/api/deno/~/Deno.ServeDefaultExport) interface defines the structure for default exports that can be used with the `deno serve` command. To ensure your code is type-checked properly, make sure to add `satisfies Deno.ServeDefaultExport` to the `export default { ... }`. ## Building on these examples You will likely want to expand on these examples to create more complex servers. Deno recommends using [Oak](https://jsr.io/@oak/oak) for building web servers. Oak is a middleware framework for Deno's HTTP server, designed to be expressive and easy to use. 
It provides a simple way to create web servers with middleware support. Check out the [Oak documentation](https://oakserver.github.io/oak/) for examples of how to define routes.

---

# runtime/fundamentals/index.md

> A guide to Deno's fundamental concepts and features. Learn about built-in tooling, TypeScript support, Node.js compatibility, security model, and modern JavaScript features that make Deno powerful and developer-friendly.

URL: https://docs.deno.com/runtime/fundamentals/

Deno is designed with the developer in mind, aiming to provide a smooth and enjoyable development process. Its simplicity and efficiency make it quick and easy to pick up, even for those new to backend development.

## Built in tooling

Deno's inbuilt tooling significantly eases the onboarding process. With a single executable, you can get started without worrying about complex setups or dependencies. This allows you to focus on writing code rather than configuring your environment.

- [Configuring your project](/runtime/fundamentals/configuration/)
- [TypeScript support](/runtime/fundamentals/typescript/)
- [Linting and formatting](/runtime/fundamentals/linting_and_formatting/)
- [Testing](/runtime/fundamentals/testing/)
- [Debugging](/runtime/fundamentals/debugging/)
- [HTTP server](/runtime/fundamentals/http_server/)

## Node and npm Support

Deno supports Node.js and npm packages, enabling you to leverage the vast ecosystem of existing libraries and tools. This compatibility ensures that you can integrate Deno into your projects seamlessly.

- [Node.js compatibility](/runtime/fundamentals/node/)
- [npm compatibility](/runtime/fundamentals/node/#using-npm-packages)

## Standard Library

Deno comes with a comprehensive standard library written in TypeScript. This library includes modules for common tasks such as HTTP servers, file system operations, and more, allowing you to avoid "reinventing the wheel" and focus on your application's features.
- [Standard Library](/runtime/fundamentals/standard_library/)

## Secure by Default

Security is a top priority for Deno. By default, it requires explicit permission for file, network, and environment access, reducing the risk of security vulnerabilities. This secure-by-default approach helps protect your applications from potential threats.

- [Security and permissions](/runtime/fundamentals/security/)
- [Foreign Function Interface (FFI)](/runtime/fundamentals/ffi/)

## Modern Language Features

Deno embraces modern JavaScript features, including ESModules. This means you can use the latest syntax and capabilities of the language, ensuring your code is up-to-date and leveraging the best practices in the industry.

- [Using ESModules](/runtime/fundamentals/modules/)
- [Migrating from CJS to ESM](/runtime/tutorials/cjs_to_esm/)

---

# Linting and formatting

> A guide to Deno's built-in code quality tools. Learn how to use deno lint and deno fmt commands, configure rules, integrate with CI/CD pipelines, and maintain consistent code style across your projects.

URL: https://docs.deno.com/runtime/fundamentals/linting_and_formatting

In an ideal world, your code is always clean, consistent, and free of pesky errors. That's the promise of Deno's built-in linting and formatting tools. By integrating these features directly into the runtime, Deno eliminates the need for external dependencies and complex configurations in your projects.

These inbuilt tools are fast and performant, not only saving time but also ensuring that every line of code adheres to best practices. With `deno fmt` and `deno lint`, you can focus on writing great code, knowing that Deno has your back. It's like having a vigilant assistant who keeps your codebase in top shape, allowing you to concentrate on what truly matters: building amazing applications.

## Linting

Linting is the process of analyzing your code for potential errors, bugs, and stylistic issues.
Deno's built-in linter, [`deno lint`](/runtime/reference/cli/linter/), supports a recommended set of rules from [ESLint](https://eslint.org/) to provide comprehensive feedback on your code. This includes identifying syntax errors, enforcing coding conventions, and highlighting potential issues that could lead to bugs.

To run the linter, use the following command in your terminal:

```bash
deno lint
```

By default, `deno lint` analyzes all TypeScript and JavaScript files in the current directory and its subdirectories. If you want to lint specific files or directories, you can pass them as arguments to the command. For example:

```bash
deno lint src/
```

This command will lint all files in the `src/` directory.

The linter can be configured in a [`deno.json`](/runtime/fundamentals/configuration/#linting) file. You can specify custom rules, plugins, and settings to tailor the linting process to your needs.

### Linting rules

You can view and search the list of available rules and their usage on the [List of rules](/lint/) documentation page.

## Formatting

Formatting is the process of automatically adjusting the layout of your code to adhere to a consistent style. Deno's built-in formatter, `deno fmt`, uses the powerful [dprint](https://dprint.dev/) engine to ensure that your code is always clean, readable, and consistent.

To format your code, simply execute the following command in your terminal:

```bash
deno fmt
```

By default, `deno fmt` formats all TypeScript and JavaScript files in the current directory and its subdirectories. If you want to format specific files or directories, you can pass them as arguments to the command. For example:

```bash
deno fmt src/
```

This command will format all files in the `src/` directory.

### Checking your formatting

The `deno fmt --check` command is used to verify if your code is properly formatted according to Deno's default formatting rules. Instead of modifying the files, it checks them and reports any formatting issues.
This is particularly useful for integrating into continuous integration (CI) pipelines or pre-commit hooks to ensure code consistency across your project. If there are formatting issues, `deno fmt --check` will list the files that need formatting. If all files are correctly formatted, it will simply exit without any output. ### Integration in CI You can add `deno fmt --check` to your CI pipeline to automatically check for formatting issues. For example, in a GitHub Actions workflow: ```yaml jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - uses: denoland/setup-deno@v2 with: deno-version: v2.x - run: deno fmt --check ``` This ensures that any code changes adhere to the project's formatting standards before being merged. ### Available options #### `bracePosition` Define brace position for blocks - **Default:** `sameLine` - **Possible values:** `maintain`, `sameLine`, `nextLine`, `sameLineUnlessHanging` #### `jsx.bracketPosition` Define bracket position for JSX - **Default:** `nextLine` - **Possible values:** `maintain`, `sameLine`, `nextLine` #### `jsx.forceNewLinesSurroundingContent` Forces newlines surrounding the content of JSX elements - **Default:** `false` - **Possible values:** `true`, `false` #### `jsx.multiLineParens` Surrounds the top-most JSX element or fragment in parentheses when it spans multiple lines - **Default:** `prefer` - **Possible values:** `never`, `prefer`, `always` #### `indentWidth` Define indentation width - **Default:** `2` - **Possible values:** `number` #### `lineWidth` Define maximum line width - **Default:** `80` - **Possible values:** `number` #### `newLineKind` The newline character to use - **Default:** `lf` - **Possible values:** `auto`, `crlf`, `lf`, `system` #### `nextControlFlowPosition` Define position of next control flow - **Default:** `sameLine` - **Possible values:** `sameLine`, `nextLine`, `maintain` #### `semiColons` Whether to prefer using semicolons. 
- **Default:** `true` - **Possible values:** `true`, `false` #### `operatorPosition` Where to place the operator for expressions that span multiple lines - **Default:** `sameLine` - **Possible values:** `sameLine`, `nextLine`, `maintain` #### `proseWrap` Define how prose should be wrapped - **Default:** `always` - **Possible values:** `always`, `never`, `preserve` #### `quoteProps` Control quoting of object properties - **Default:** `asNeeded` - **Possible values:** `asNeeded`, `consistent`, `preserve` #### `singleBodyPosition` The position of the body in single body blocks - **Default:** `sameLineUnlessHanging` - **Possible values:** `sameLine`, `nextLine`, `maintain`, `sameLineUnlessHanging` #### `singleQuote` Use single quotes - **Default:** `false` - **Possible values:** `true`, `false` #### `spaceAround` Control spacing around enclosed expressions - **Default:** `false` - **Possible values:** `true`, `false` #### `spaceSurroundingProperties` Control spacing surrounding single line object-like nodes - **Default:** `true` - **Possible values:** `true`, `false` #### `trailingCommas` Control trailing commas in multi-line arrays/objects - **Default:** `always` - **Possible values:** `always`, `never` #### `typeLiteral.separatorKind` Define separator kind for type literals - **Default:** `semiColon` - **Possible values:** `comma`, `semiColon` #### `unstable-component` Enable formatting Svelte, Vue, Astro and Angular files #### `unstable-sql` Enable formatting SQL files #### `useTabs` Use tabs instead of spaces for indentation - **Default:** `false` - **Possible values:** `true`, `false` #### `useBraces` Whether to use braces for if statements, for statements, and while statements - **Default:** `whenNotSingleLine` - **Possible values:** `maintain`, `whenNotSingleLine`, `always`, `preferNone` ### Configuration The formatter can be configured in a [`deno.json`](/runtime/fundamentals/configuration/#formatting) file.
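For example, a `deno.json` sketch that sets a few of the options listed above under the `fmt` key (the option names are as listed; the particular values are chosen purely for illustration):

```json
{
  "fmt": {
    "lineWidth": 100,
    "indentWidth": 2,
    "singleQuote": true,
    "proseWrap": "preserve",
    "useTabs": false
  }
}
```

With this file in place, `deno fmt` picks the settings up automatically; no extra command-line flags are needed.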
You can specify custom settings to tailor the formatting process to your needs. --- # Modules and dependencies > A guide to managing modules and dependencies in Deno. Learn about ECMAScript modules, third-party packages, import maps, dependency management, versioning, and how to publish your own modules. URL: https://docs.deno.com/runtime/fundamentals/modules Deno uses [ECMAScript modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules) as its default module system to align with modern JavaScript standards and to promote a more efficient and consistent development experience. It's the official standard for JavaScript modules, allows for better tree-shaking, improved tooling integration, and native support across different environments. By adopting ECMAScript modules, Deno ensures compatibility with the ever-evolving JavaScript ecosystem. For developers, this means a streamlined and predictable module system that avoids the complexities associated with legacy module formats like CommonJS. ## Importing modules In this example the `add` function is imported from a local `calc.ts` module. ```ts title="calc.ts" export function add(a: number, b: number): number { return a + b; } ``` ```ts title="main.ts" // imports the `calc.ts` module next to this file import { add } from "./calc.ts"; console.log(add(1, 2)); // 3 ``` You can run this example by calling `deno run main.ts` in the directory that contains both `main.ts` and `calc.ts`. With ECMAScript modules, local import specifiers must always include the full file extension. It cannot be omitted. 
```ts title="example.ts" // WRONG: missing file extension import { add } from "./calc"; // CORRECT: includes file extension import { add } from "./calc.ts"; ``` ## Import attributes Deno supports the `with { type: "json" }` import attribute syntax for importing JSON files: ```ts import data from "./data.json" with { type: "json" }; console.log(data.property); // Access JSON data as an object ``` Starting with Deno 2.4 it's possible to import `text` and `bytes` modules too. :::info Support for importing `text` and `bytes` modules is experimental and requires the `--unstable-raw-imports` CLI flag or the `unstable.raw-import` option in `deno.json`. ::: ```ts import text from "./log.txt" with { type: "text" }; console.log(typeof text === "string"); // true console.log(text); // Hello from a text file ``` ```ts import bytes from "./image.png" with { type: "bytes" }; console.log(bytes instanceof Uint8Array); // true console.log(bytes); // Uint8Array(12) [ 72, 101, 108, 108, 111, 44, 32, 68, 101, 110, 111, 33 ] ``` ## WebAssembly modules Deno supports importing Wasm modules directly: ```ts import { add } from "./add.wasm"; console.log(add(1, 2)); ``` To learn more, visit the [WebAssembly section](/runtime/reference/wasm/#wasm-modules). ## Data URL imports Deno supports importing data URLs, which allows you to import content that isn't in a separate file. This is useful for testing, prototyping, or when you need to programmatically generate modules. You can create modules on the fly using the `data:` URL scheme: ```ts // Import a simple JavaScript module from a data URL import * as module from "data:application/javascript;base64,ZXhwb3J0IGNvbnN0IG1lc3NhZ2UgPSAiSGVsbG8gZnJvbSBkYXRhIFVSTCI7"; console.log(module.message); // Outputs: Hello from data URL // You can also use the non-base64 format const plainModule = await import( "data:application/javascript,export function greet() { return 'Hi there!'; }" ); console.log(plainModule.greet()); // Outputs: Hi there!
// A simpler example with text content const textModule = await import( "data:text/plain,export default 'This is plain text'" ); console.log(textModule.default); // Outputs: This is plain text ``` The data URL format follows this pattern: ```sh data:[<mediatype>][;base64],<data> ``` For JavaScript modules, use `application/javascript` as the media type. TypeScript is also supported with `application/typescript`. This feature is particularly useful for testing modules in isolation and creating mock modules during tests. ## Importing third party modules and libraries When working with third-party modules in Deno, use the same `import` syntax as you do for local code. Third party modules are typically imported from a remote registry and start with `jsr:`, `npm:`, or `https://`. ```ts title="main.ts" import { camelCase } from "jsr:@luca/cases@1.0.0"; import { say } from "npm:cowsay@1.6.0"; import { pascalCase } from "https://deno.land/x/case/mod.ts"; ``` Deno recommends [JSR](https://jsr.io), the modern JavaScript registry, for third party modules. There, you'll find plenty of well-documented ES modules for your projects, including the [Deno Standard Library](/runtime/fundamentals/standard_library/). You can [read more about Deno's support for npm packages here](/runtime/fundamentals/node/#using-npm-modules). ## Managing third party modules and libraries Typing out the module name with the full version specifier can become tedious when importing them in multiple files. You can centralize management of remote modules with an `imports` field in your `deno.json` file. We call this `imports` field the **import map**, which is based on the [Import Maps Standard].
[Import Maps Standard]: https://html.spec.whatwg.org/multipage/webappapis.html#import-maps ```json title="deno.json" { "imports": { "@luca/cases": "jsr:@luca/cases@^1.0.0", "cowsay": "npm:cowsay@^1.6.0", "cases": "https://deno.land/x/case/mod.ts" } } ``` With remapped specifiers, the code looks cleaner: ```ts title="main.ts" import { camelCase } from "@luca/cases"; import { say } from "cowsay"; import { pascalCase } from "cases"; ``` The remapped name can be any valid specifier. It's a very powerful feature in Deno that can remap anything. Learn more about what the import map can do [here](/runtime/fundamentals/configuration/#dependencies). ## Differentiating between `imports` or `importMap` in `deno.json` and `--import-map` option The [Import Maps Standard] requires two entries for each module: one for the module specifier and another for the specifier with a trailing `/`. This is because the standard allows only one entry per module specifier, and the trailing `/` indicates that the specifier refers to a directory. For example, when using the `--import-map import_map.json` option, the `import_map.json` file must include both entries for each module (note the use of `jsr:/@std/async` instead of `jsr:@std/async`): ```json title="import_map.json" { "imports": { "@std/async": "jsr:@std/async@^1.0.0", "@std/async/": "jsr:/@std/async@^1.0.0/" } } ``` An `import_map.json` file referenced by the `importMap` field in `deno.json` behaves exactly the same as using the `--import-map` option, with the same requirements for including both entries for each module as shown above. In contrast, `deno.json` extends the import maps standard. When you use the imports field in `deno.json`, you only need to specify the module specifier without the trailing `/`: ```json title="deno.json" { "imports": { "@std/async": "jsr:@std/async@^1.0.0" } } ``` ## Adding dependencies with `deno add` The installation process is made easy with the `deno add` subcommand. 
It will automatically add the latest version of the package you requested to the `imports` section in `deno.json`. ```sh # Add the latest version of the module to deno.json $ deno add jsr:@luca/cases Add @luca/cases - jsr:@luca/cases@1.0.0 ``` ```json title="deno.json" { "imports": { "@luca/cases": "jsr:@luca/cases@^1.0.0" } } ``` You can also specify an exact version: ```sh # Passing an exact version $ deno add jsr:@luca/cases@1.0.0 Add @luca/cases - jsr:@luca/cases@1.0.0 ``` Read more in [`deno add` reference](/runtime/reference/cli/add/). You can also remove dependencies using `deno remove`: ```sh $ deno remove @luca/cases Remove @luca/cases ``` ```json title="deno.json" { "imports": {} } ``` Read more in [`deno remove` reference](/runtime/reference/cli/remove/). ## Package Versions It is possible to specify a version range for the package you are importing. This is done using the `@` symbol followed by a version range specifier, and follows the [semver](https://semver.org/) versioning scheme. For example: ```bash @scopename/mypackage # highest version @scopename/mypackage@16.1.0 # exact version @scopename/mypackage@16 # highest 16.x version >= 16.0.0 @scopename/mypackage@^16.1.0 # highest 16.x version >= 16.1.0 @scopename/mypackage@~16.1.0 # highest 16.1.x version >= 16.1.0 ``` Here is an overview of all the ways you can specify a version or a range: | Symbol | Description | Example | | --------- | ----------- | --------- | | `1.2.3` | An exact version. Only this specific version will be used. | `1.2.3` | | `^1.2.3` | Compatible with version 1.2.3. Allows updates that do not change the leftmost non-zero digit. For example, `1.2.4` and `1.3.0` are allowed, but `2.0.0` is not. | `^1.2.3` | | `~1.2.3` | Approximately equivalent to version 1.2.3. Allows updates to the patch version. For example, `1.2.4` is allowed, but `1.3.0` is not. | `~1.2.3` | | `>=1.2.3` | Greater than or equal to version 1.2.3. Any version `1.2.3` or higher is allowed. | `>=1.2.3` | | `<=1.2.3` | Less than or equal to version 1.2.3. Any version `1.2.3` or lower is allowed. | `<=1.2.3` | | `>1.2.3` | Greater than version 1.2.3. Only versions higher than `1.2.3` are allowed. | `>1.2.3` | | `<1.2.3` | Less than version 1.2.3. Only versions lower than `1.2.3` are allowed. | `<1.2.3` | | `1.2.x` | Any patch version within the minor version 1.2. For example, `1.2.0`, `1.2.1`, etc. | `1.2.x` | | `1.x` | Any minor and patch version within the major version 1. For example, `1.0.0`, `1.1.0`, `1.2.0`, etc. | `1.x` | | `*` | Any version is allowed. | `*` | ## HTTPS imports Deno also supports import statements that reference HTTP/HTTPS URLs, either directly: ```js import { Application } from "https://deno.land/x/oak/mod.ts"; ``` or part of your `deno.json` import map: ```json { "imports": { "oak": "https://deno.land/x/oak/mod.ts" } } ``` Supporting HTTPS imports enables us to support the following JavaScript CDNs, as they provide URL access to JavaScript modules: - [deno.land/x](https://deno.land/x) - [esm.sh](https://esm.sh) - [unpkg.com](https://unpkg.com) HTTPS imports are useful if you have a small, often single-file, Deno project that doesn't require any other configuration. With HTTPS imports, you can avoid having a `deno.json` file at all. It is **not** advised to use this style of import in larger applications however, as you may end up with version conflicts (where different files use different version specifiers). HTTPS imports are not supported by the `deno add`/`deno install` commands. :::info Use HTTPS imports with caution, and only **from trusted sources**. If the server is compromised, it could serve malicious code to your application. They can also cause versioning issues if you import different versions in different files.
HTTPS imports remain supported, **but we recommend using a package registry for the best experience.** ::: ## Overriding dependencies Deno provides mechanisms to override dependencies, enabling developers to use custom or local versions of libraries during development or testing. Note: If you need to cache and modify dependencies locally for use across builds, consider [vendoring remote modules](#vendoring-remote-modules). ### Overriding local JSR packages For developers familiar with `npm link` in Node.js, Deno provides a similar feature for local JSR packages through the `patch` field in `deno.json`. This allows you to override dependencies with local versions during development without needing to publish them. Example: ```json title="deno.json" { "patch": [ "../some-package-or-workspace" ] } ``` Key points: - The `patch` field accepts paths to directories containing JSR packages or workspaces. If you reference a single package within a workspace, the entire workspace will be included. - This feature is only respected in the workspace root. Using `patch` elsewhere will trigger warnings. - Currently, `patch` is limited to JSR packages. Attempting to patch `npm` packages will result in a warning with no effect. Limitations: - `npm` package overrides are not supported yet. This is planned for future updates. - Git-based dependency overrides are unavailable. - The `patch` field requires proper configuration in the workspace root. - This feature is experimental and may change based on user feedback. ### Overriding NPM packages Deno supports patching npm packages with local versions, similar to how JSR packages can be patched. This allows you to use a local copy of an npm package during development without publishing it. 
To use a local npm package, configure the `patch` field in your `deno.json`: ```json { "patch": [ "../path/to/local_npm_package" ], "unstable": ["npm-patch"] } ``` This feature requires a `node_modules` directory and has different behaviors depending on your `nodeModulesDir` setting: - With `"nodeModulesDir": "auto"`: The directory is recreated on each run, which slightly increases startup time but ensures the latest version is always used. - With `"nodeModulesDir": "manual"` (default when using package.json): You must run `deno install` after updating the package to get the changes into the workspace's `node_modules` directory. Limitations: - Specifying a local copy of an npm package or changing its dependencies will purge npm packages from the lockfile, which may cause npm resolution to work differently. - The npm package name must exist in the registry, even if you're using a local copy. - This feature is currently behind the `unstable` flag. ### Overriding HTTPS imports Deno also allows overriding HTTPS imports through the `scopes` field in `deno.json`. This feature is particularly useful when substituting a remote dependency with a local patched version for debugging or temporary fixes. Example: ```json title="deno.json" { "imports": { "example/": "https://deno.land/x/example/" }, "scopes": { "https://deno.land/x/example/": { "https://deno.land/x/my-library@1.0.0/mod.ts": "./patched/mod.ts" } } } ``` Key points: - The `scopes` field in the import map allows you to redirect specific imports to alternative paths. - This is commonly used to override remote dependencies with local files for testing or development purposes. - Scopes apply only to the root of your project. Nested scopes within dependencies are ignored. ## Vendoring remote modules If your project has external dependencies, you may want to store them locally to avoid downloading them from the internet every time you build your project. 
This is especially useful when building your project on a CI server or in a Docker container, or patching or otherwise modifying the remote dependencies. Deno offers this functionality through a setting in your `deno.json` file: ```json { "vendor": true } ``` Add the above snippet to your `deno.json` file and Deno will cache all dependencies locally in a `vendor` directory when the project is run, or you can optionally run the `deno install --entrypoint` command to cache the dependencies immediately: ```bash deno install --entrypoint main.ts ``` You can then run the application as usual with `deno run`: ```bash deno run main.ts ``` After vendoring, you can run `main.ts` without internet access by using the `--cached-only` flag, which forces Deno to use only locally available modules. For more advanced overrides, such as substituting dependencies during development, see [Overriding dependencies](#overriding-dependencies). ## Publishing modules Any Deno program that defines an export can be published as a module. This allows other developers to import and use your code in their own projects. Modules can be published to: - [JSR](https://jsr.io) - recommended, supports TypeScript natively and auto-generates documentation for you - [npm](https://www.npmjs.com/) - use [dnt](https://github.com/denoland/dnt) to create the npm package - [deno.land/x](https://deno.com/add_module) - for HTTPS imports, use JSR instead if possible ## Reloading modules By default, Deno uses a global cache directory (`DENO_DIR`) for downloaded dependencies. This cache is shared across all projects. You can force deno to refetch and recompile modules into the cache using the `--reload` flag. ```bash # Reload everything deno run --reload my_module.ts # Reload a specific module deno run --reload=jsr:@std/fs my_module.ts ``` ## Development only dependencies Sometimes dependencies are only needed during development, for example dependencies of test files or build tools. 
In Deno, the runtime does not require you to distinguish between development and production dependencies, as the [runtime will only load and install dependencies that are actually used in the code that is being executed](#why-does-deno-not-have-a-devimports-field). However, it can be useful to mark dev dependencies to aid people who are reading your package. When using `deno.json`, the convention is to add a `// dev` comment after any "dev only" dependency: ```json title="deno.json" { "imports": { "@std/fs": "jsr:@std/fs@1", "@std/testing": "jsr:@std/testing@1" // dev } } ``` When using a `package.json` file, dev dependencies can be added to the separate `devDependencies` field: ```json title="package.json" { "dependencies": { "pg": "npm:pg@^8.0.0" }, "devDependencies": { "prettier": "^3" } } ``` ### Why does Deno not have a `devImports` field? To understand why Deno does not separate out dev dependencies in the package manifest, it is important to understand what problem dev dependencies are trying to solve. When deploying an application you frequently want to install only the dependencies that are actually used in the code that is being executed. This helps speed up startup time and reduce the size of the deployed application. Historically, this has been done by separating out dev dependencies into a `devDependencies` field in the `package.json`. When deploying an application, the `devDependencies` are not installed; only the `dependencies` are. This approach has proven to be problematic in practice. It is easy to forget to move a dependency from `dependencies` to `devDependencies` when a dependency moves from being a runtime to a dev dependency. Additionally, some packages that are semantically "development time" dependencies, like `@types/*`, are often defined in `dependencies` in `package.json` files, which means they are installed for production even though they are not needed.
Because of this, Deno uses a different approach for installing production only dependencies: when running `deno install`, you can pass a `--entrypoint` flag that causes Deno to install only the dependencies that are actually (transitively) imported by the specified entrypoint file. Because this is automatic, and works based on the actual code that is being executed, there is no need to specify development dependencies in a separate field. ## Using only cached modules To force Deno to only use modules that have previously been cached, use the `--cached-only` flag: ```shell deno run --cached-only mod.ts ``` This will fail if there are any dependencies in the dependency tree for mod.ts which are not yet cached. ## Integrity Checking and Lock Files Imagine your module relies on a remote module located at https://some.url/a.ts. When you compile your module for the first time, `a.ts` is fetched, compiled, and cached. This cached version will be used until you either run your module on a different machine (such as in a production environment) or manually reload the cache (using a command like `deno install --reload`). But what if the content at `https://some.url/a.ts` changes? This could result in your production module running with different dependency code than your local module. To detect this, Deno uses integrity checking and lock files. Deno uses a `deno.lock` file to check external module integrity. To opt into a lock file, either: 1. Create a `deno.json` file in the current or an ancestor directory, which will automatically create an additive lockfile at `deno.lock`. Note that this can be disabled by specifying the following in your deno.json: ```json title="deno.json" { "lock": false } ``` 2. Use the `--lock` flag to enable and specify lock file checking. ### Frozen lockfile By default, Deno uses an additive lockfile, where new dependencies are added to the lockfile instead of erroring. This might not be desired in certain scenarios (ex. 
CI pipelines or production environments) where you'd rather have Deno error when it encounters a dependency it's never seen before. To enable this, you can specify the `--frozen` flag or set the following in a deno.json file: ```json title="deno.json" { "lock": { "frozen": true } } ``` When running a deno command with a frozen lockfile, any attempts to update the lockfile with new contents will cause the command to exit with an error showing the modifications that would have been made. If you wish to update the lockfile, specify `--frozen=false` on the command line to temporarily disable the frozen lockfile. ### Changing lockfile path The lockfile path can be configured by specifying `--lock=deps.lock` or the following in a Deno configuration file: ```json title="deno.json" { "lock": { "path": "deps.lock" } } ``` ## Private repositories :::note If you're looking for private npm registries and `.npmrc` support, visit the [npm support](/runtime/fundamentals/node/#private-registries) page. ::: There may be instances where you want to load a remote module that is located in a _private_ repository, like a private repository on GitHub. Deno supports sending bearer tokens when requesting a remote module. Bearer tokens are the predominant type of access token used with OAuth 2.0, and are broadly supported by hosting services (e.g., GitHub, GitLab, Bitbucket, Cloudsmith, etc.). ### DENO_AUTH_TOKENS The Deno CLI will look for an environment variable named `DENO_AUTH_TOKENS` to determine what authentication tokens it should consider using when requesting remote modules. 
The value of the environment variable is in the format of _n_ number of tokens delimited by a semi-colon (`;`) where each token is either: - a bearer token in the format of `{token}@{hostname[:port]}` or - basic auth data in the format of `{username}:{password}@{hostname[:port]}` For example, a single token for `deno.land` would look something like this: ```sh DENO_AUTH_TOKENS=a1b2c3d4e5f6@deno.land ``` or: ```sh DENO_AUTH_TOKENS=username:password@deno.land ``` And multiple tokens would look like this: ```sh DENO_AUTH_TOKENS=a1b2c3d4e5f6@deno.land;f1e2d3c4b5a6@example.com:8080;username:password@deno.land ``` When Deno goes to fetch a remote module, where the hostname matches the hostname of the remote module, Deno will set the `Authorization` header of the request to the value of `Bearer {token}` or `Basic {base64EncodedData}`. This allows the remote server to recognize that the request is an authorized request tied to a specific authenticated user, and provide access to the appropriate resources and modules on the server. ### GitHub To access private repositories on GitHub, you would need to issue yourself a _personal access token_. You do this by logging into GitHub and going under _Settings -> Developer settings -> Personal access tokens_: ![Personal access tokens settings on GitHub](./images/private-pat.png) You would then choose to _Generate new token_ and give your token a description and appropriate access to the `repo` scope. 
The `repo` scope will enable reading file contents (more on [scopes in the GitHub docs](https://docs.github.com/en/apps/oauth-apps/building-oauth-apps/scopes-for-oauth-apps#available-scopes)): ![Creating a new personal access token on GitHub](./images/private-github-new-token.png) And once created GitHub will display the new token a single time, the value of which you would want to use in the environment variable: ![Display of newly created token on GitHub](./images/private-github-token-display.png) In order to access modules that are contained in a private repository on GitHub, you would want to use the generated token in the `DENO_AUTH_TOKENS` environment variable scoped to the `raw.githubusercontent.com` hostname. For example: ```sh DENO_AUTH_TOKENS=a1b2c3d4e5f6@raw.githubusercontent.com ``` This should allow Deno to access any modules that the user who the token was issued for has access to. When the token is incorrect, or the user does not have access to the module, GitHub will issue a `404 Not Found` status, instead of an unauthorized status. So if you are getting errors that the modules you are trying to access are not found on the command line, check the environment variable settings and the personal access token settings. In addition, `deno run -L debug` should print out a debug message about the number of tokens that are parsed out of the environment variable. It will print an error message if it feels any of the tokens are malformed. It won't print any details about the tokens for security purposes. --- # Node and npm Compatibility > Guide to using Node.js modules and npm packages in Deno. Learn about compatibility features, importing npm packages, and differences between Node.js and Deno environments. URL: https://docs.deno.com/runtime/fundamentals/node - **Deno is Node-compatible**. Most Node projects will run in Deno with little or no change! - **Deno supports npm packages**. 
Just use the `npm:` specifier in the import, and Deno takes care of the rest. For example, here's how you'd import Hono from npm in a Deno project: ```ts import { Hono } from "npm:hono"; ``` That's all you really need to know to get started! However, there are some key differences between the two runtimes that you can take advantage of to make your code simpler and smaller when migrating your Node.js projects to Deno. ## Using Node's built-in modules Deno provides a compatibility layer that allows the use of Node.js built-in APIs within Deno programs. However, in order to use them, you will need to add the `node:` specifier to any import statements that use them: ```js title=main.mjs import * as os from "node:os"; console.log(os.cpus()); ``` And run it with `deno run main.mjs` - you will notice you get the same output as running the program in Node.js. Updating any imports in your application to use `node:` specifiers should enable any code using Node built-ins to function as it did in Node.js. To make updating existing code easier, Deno will provide helpful hints for imports that don't use `node:` prefix: ```js title="main.mjs" import * as os from "os"; console.log(os.cpus()); ``` ```sh $ deno run main.mjs error: Relative import path "os" not prefixed with / or ./ or ../ hint: If you want to use a built-in Node module, add a "node:" prefix (ex. "node:os"). at file:///main.mjs:1:21 ``` The same hints and additional quick-fixes are provided by the Deno LSP in your editor. Explore built-in Node APIs ## Using npm packages Deno has native support for importing npm packages by using `npm:` specifiers. For example: ```ts title="main.js" import * as emoji from "npm:node-emoji"; console.log(emoji.emojify(`:sauropod: :heart: npm`)); ``` Can be run with: ```sh $ deno run main.js 🦕 ❤️ npm ``` No `npm install` is necessary before the `deno run` command and no `node_modules` folder is created. 
These packages are also subject to the same [permissions](/runtime/fundamentals/security/) as other code in Deno. npm specifiers have the following format: ```console npm:<package-name>[@<version-requirement>][/<sub-path>] ``` This also allows functionality that may be familiar from the `npx` command. ```console # npx allows remote execution of a package from npm or a URL $ npx create-next-app@latest # deno run allows remote execution of a package from various locations, # and can be scoped to npm via the `npm:` specifier. $ deno run -A npm:create-next-app@latest ``` For examples with popular libraries, please refer to the [tutorial section](/runtime/tutorials). ## CommonJS support CommonJS is a module system that predates [ES modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules). While we firmly believe that ES modules are the future of JavaScript, there are millions of npm libraries that are written in CommonJS and Deno offers full support for them. Deno will automatically determine if a package is using CommonJS and make it work seamlessly when imported: ```js title="main.js" import react from "npm:react"; console.log(react.version); ``` ```shell $ deno run -E main.js 18.3.1 ``` _`npm:react` is a CommonJS package. Deno allows you to import it as if it were an ES module._ Deno strongly encourages the use of ES modules in your code but offers CommonJS support with the following restrictions: **Deno's permission system is still in effect when using CommonJS modules.** It may be necessary to provide at least `--allow-read` permission as Deno will probe the file system for `package.json` files and the `node_modules` directory to properly resolve CommonJS modules. ### Use .cjs extension If the file extension is `.cjs`, Deno will treat this module as CommonJS. ```js title="main.cjs" const express = require("express"); ``` Deno does not look for `package.json` files or the `type` option to determine if the file is CommonJS or ESM.
When using CommonJS, Deno expects that dependencies will be installed manually and a `node_modules` directory will be present. It's best to set `"nodeModulesDir": "auto"` in your `deno.json` to ensure that. ```shell $ cat deno.json { "nodeModulesDir": "auto" } $ deno install npm:express Add npm:express@5.0.0 $ deno run -R -E main.cjs [Function: createApplication] { application: { init: [Function: init], defaultConfiguration: [Function: defaultConfiguration], ... } } ``` The `-R` and `-E` flags grant permission to read files and environment variables. ### package.json type option Deno will attempt to load `.js`, `.jsx`, `.ts`, and `.tsx` files as CommonJS if there's a `package.json` file with the `"type": "commonjs"` option next to the file, or up in the directory tree when in a project with a `package.json` file. ```json title="package.json" { "type": "commonjs" } ``` ```js title="main.js" const express = require("express"); ``` Tools like Next.js's bundler and others will generate a `package.json` file like that automatically. If you have an existing project that uses CommonJS modules, you can make it work with both Node.js and Deno by adding the `"type": "commonjs"` option to the `package.json` file. ### Always detecting if a file might be CommonJS Telling Deno to analyze modules as possibly being CommonJS is possible by running with the `--unstable-detect-cjs` flag in Deno >= 2.1.2. This will take effect, except when there's a _package.json_ file with `{ "type": "module" }`. Looking for `package.json` files on the file system and analyzing a module to detect if it's CommonJS takes longer than not doing it. For this reason, and to discourage the use of CommonJS, Deno does not enable this behavior by default.
### Create require() manually An alternative option is to create an instance of the `require()` function manually: ```js title="main.js" import { createRequire } from "node:module"; const require = createRequire(import.meta.url); const express = require("express"); ``` In this scenario, the same requirements apply as when running `.cjs` files: dependencies need to be installed manually and appropriate permission flags given. ### require(ESM) Deno's `require()` implementation supports requiring ES modules. This works the same as in Node.js, where you can only `require()` ES modules that don't have top-level await in their module graph - or in other words, you can only `require()` ES modules that are "synchronous". ```js title="greet.js" export function greet(name) { return `Hello ${name}`; } ``` ```js title="esm.js" import { greet } from "./greet.js"; export { greet }; ``` ```js title="main.cjs" const esm = require("./esm"); console.log(esm); console.log(esm.greet("Deno")); ``` ```shell $ deno run -R main.cjs [Module: null prototype] { greet: [Function: greet] } Hello Deno ``` ### Import CommonJS modules You can also import CommonJS files in ES modules. ```js title="greet.cjs" module.exports = { hello: "world", }; ``` ```js title="main.js" import greet from "./greet.cjs"; console.log(greet); ``` ```shell $ deno run main.js { "hello": "world" } ``` **Hints and suggestions** Deno will provide useful hints and suggestions to guide you towards working code when working with CommonJS modules.
As an example, if you try to run a CommonJS module that doesn't have a `.cjs` extension or doesn't have a `package.json` with `{ "type": "commonjs" }`, you might see this: ```js title="main.js" module.exports = { hello: "world", }; ``` ```shell $ deno run main.js error: Uncaught (in promise) ReferenceError: module is not defined module.exports = { ^ at file:///main.js:1:1 info: Deno supports CommonJS modules in .cjs files, or when the closest package.json has a "type": "commonjs" option. hint: Rewrite this module to ESM, or change the file extension to .cjs, or add package.json next to the file with "type": "commonjs" option, or pass --unstable-detect-cjs flag to detect CommonJS when loading. docs: https://docs.deno.com/go/commonjs ``` ## Conditional exports Package exports can be [conditioned](https://nodejs.org/api/packages.html#conditional-exports) on the resolution mode. The conditions satisfied by an import from a Deno ESM module are as follows: ```json ["deno", "node", "import", "default"] ``` This means that the first condition listed in a package export whose key equals any of these strings will be matched. You can expand this list using the `--unstable-node-conditions` CLI flag: ```shell deno run --unstable-node-conditions development,react-server main.ts ``` ```json ["development", "react-server", "deno", "node", "import", "default"] ``` ## Importing types Many npm packages ship with types; you can import these and use them directly: ```ts import chalk from "npm:chalk@5"; ``` Some packages do not ship with types but you can specify their types with the [`@ts-types`](/runtime/fundamentals/typescript) directive.
For example, using a [`@types`](https://www.typescriptlang.org/docs/handbook/2/type-declarations.html#definitelytyped--types) package: ```ts // @ts-types="npm:@types/express@^4.17" import express from "npm:express@^4.17"; ``` **Module resolution** The official TypeScript compiler `tsc` supports different [moduleResolution](https://www.typescriptlang.org/tsconfig#moduleResolution) settings. Deno only supports the modern `node16` resolution. Unfortunately, many npm packages fail to correctly provide types under node16 module resolution, which can result in `deno check` reporting type errors that `tsc` does not report. If a default export from an `npm:` import appears to have a wrong type (with the right type seemingly being available under the `.default` property), it's most likely that the package provides wrong types under node16 module resolution for imports from ESM. You can verify this by checking if the error also occurs with `tsc --module node16` and `"type": "module"` in `package.json`, or by consulting the [Are the types wrong?](https://arethetypeswrong.github.io/) website (particularly the "node16 from ESM" row). If you want to use a package that doesn't support TypeScript's node16 module resolution, you can: 1. Open an issue at the issue tracker of the package about the problem. (And perhaps contribute a fix :) (Although, unfortunately, there is a lack of tooling for packages to support both ESM and CJS, since default exports require different syntaxes. See also [microsoft/TypeScript#54593](https://github.com/microsoft/TypeScript/issues/54593)) 2. Use a [CDN](/runtime/fundamentals/modules/#url_imports) that rebuilds the packages for Deno support, instead of an `npm:` specifier. 3. Ignore the type errors you get in your code base with `// @ts-expect-error` or `// @ts-ignore`. ## Including Node types Node ships with many built-in types like `Buffer` that might be referenced in an npm package's types.
To load these you must add a types reference directive to the `@types/node` package: ```ts /// <reference types="npm:@types/node" /> ``` Note that it is fine to not specify a version for this in most cases because Deno will try to keep it in sync with its internal Node code, but you can always override the version used if necessary. ## Executable npm scripts npm packages with `bin` entries can be executed from the command line without an `npm install` using a specifier in the following format: ```console npm:<package-name>[@<version-requirement>][/<binary-name>] ``` For example: ```sh $ deno run --allow-read npm:cowsay@1.5.0 "Hello there!" ______________ < Hello there! > -------------- \ ^__^ \ (oo)\_______ (__)\ )\/\ ||----w | || || $ deno run --allow-read npm:cowsay@1.5.0/cowthink "What to eat?" ______________ ( What to eat? ) -------------- o ^__^ o (oo)\_______ (__)\ )\/\ ||----w | || || ``` ## node_modules When you run `npm install`, npm creates a `node_modules` directory in your project which houses the dependencies as specified in the `package.json` file. Deno uses [npm specifiers](/runtime/fundamentals/node/#using-npm-packages) to resolve npm packages to a central global npm cache, instead of using a `node_modules` folder in your projects. This is ideal since it uses less space and keeps your project directory clean. There may however be cases where you need a local `node_modules` directory in your Deno project, even if you don’t have a `package.json` (e.g. when using frameworks like Next.js or Svelte, or when depending on npm packages that use Node-API). #### Default Deno dependencies behavior By default, Deno will not create a `node_modules` directory when you use the `deno run` command; dependencies will be installed into the global cache. This is the recommended setup for new Deno projects.
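This default corresponds to the `none` setting of `nodeModulesDir`; it can also be spelled out explicitly in the config file, for instance to override a setting inherited from elsewhere:

```json title="deno.json"
{
  "nodeModulesDir": "none"
}
```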
#### Automatic node_modules creation If you need a `node_modules` directory in your project, you can use the `--node-modules-dir` flag or the `"nodeModulesDir": "auto"` option in the config file to tell Deno to create a `node_modules` directory in the current working directory: ```sh deno run --node-modules-dir=auto main.ts ``` or with a configuration file: ```json title="deno.json" { "nodeModulesDir": "auto" } ``` The auto mode automatically installs dependencies into the global cache and creates a local `node_modules` directory in the project root. This is recommended for projects that have npm dependencies that rely on the `node_modules` directory - mostly projects using bundlers, or ones that have npm dependencies with postinstall scripts. #### Manual node_modules creation If your project has a `package.json` file, you can use the manual mode, which requires an installation step to create your `node_modules` directory: ```sh deno install deno run --node-modules-dir=manual main.ts ``` or with a configuration file: ```json title="deno.json" { "nodeModulesDir": "manual" } ``` You would then run `deno install`, `npm install`, `pnpm install`, or any other package manager to create the `node_modules` directory. Manual mode is the default mode for projects using a `package.json`. You may recognize this workflow from Node.js projects. It is recommended for projects using frameworks like Next.js, Remix, Svelte, Qwik, etc., or tools like Vite, Parcel, or Rollup. :::note We recommend that you use the default `none` mode, and fall back to `auto` or `manual` mode if you get errors about missing packages inside the `node_modules` directory. ::: #### node_modules with Deno 1.X Use the `--node-modules-dir` flag.
For example, given `main.ts`: ```ts import chalk from "npm:chalk@5"; console.log(chalk.green("Hello")); ``` ```sh deno run --node-modules-dir main.ts ``` Running the above command, with a `--node-modules-dir` flag, will create a `node_modules` folder in the current directory with a similar folder structure to npm. ## Node.js global objects In Node.js, there are a number of [global objects](https://nodejs.org/api/globals.html) available in the scope of all programs that are specific to Node.js, e.g. the `process` object. Here are a few globals that you might encounter in the wild and how to use them in Deno: - `process` - Deno provides the `process` global, which is by far the most widely used Node.js global in npm packages. It is available to all code. However, Deno will guide you towards importing it explicitly from the `node:process` module by providing lint warnings and quick-fixes: ```js title="process.js" console.log(process.versions.deno); ``` ```shell $ deno run process.js 2.0.0 $ deno lint process.js error[no-process-global]: NodeJS process global is discouraged in Deno --> /process.js:1:13 | 1 | console.log(process.versions.deno); | ^^^^^^^ = hint: Add `import process from "node:process";` docs: https://docs.deno.com/lint/rules/no-process-global Found 1 problem (1 fixable via --fix) Checked 1 file ``` - `require()` - see [CommonJS support](#commonjs-support) - `Buffer` - to use the `Buffer` API it needs to be explicitly imported from the `node:buffer` module: ```js title="buffer.js" import { Buffer } from "node:buffer"; const buf = new Buffer(5, "0"); ``` For TypeScript users needing Node.js-specific types like `BufferEncoding`, these are available through the `NodeJS` namespace when using `@types/node`: ```ts title="buffer-types.ts" /// <reference types="npm:@types/node" /> // Now you can use NodeJS namespace types function writeToBuffer(data: string, encoding: NodeJS.BufferEncoding): Buffer { return Buffer.from(data, encoding); } ``` Prefer using
[`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array) or other [`TypedArray`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray) subclasses instead. - `__filename` - use `import.meta.filename` instead. - `__dirname` - use `import.meta.dirname` instead. ## Node-API addons Deno supports [Node-API addons](https://nodejs.org/api/n-api.html) that are used by popular npm packages like [`esbuild`](https://www.npmjs.com/package/esbuild), [`npm:sqlite3`](https://www.npmjs.com/package/sqlite3) or [`npm:duckdb`](https://www.npmjs.com/package/duckdb). You can expect all packages that use public and documented Node-APIs to work. :::info Most packages using Node-API addons rely on npm "lifecycle scripts", like `postinstall`. While Deno supports them, they are not run by default due to security considerations. Read more in [`deno install` docs](/runtime/reference/cli/install/). ::: As of Deno 2.0, npm packages using Node-API addons **are only supported when a `node_modules/` directory is present**. Add the `"nodeModulesDir": "auto"` or `"nodeModulesDir": "manual"` setting to your `deno.json` file, or run with the `--node-modules-dir=auto|manual` flag, to ensure these packages work correctly. In case of misconfiguration, Deno will provide hints on how the situation can be resolved. ## Migrating from Node to Deno Running your Node.js project with Deno is a straightforward process. In most cases you can expect little to no changes to be required, if your project is written using ES modules. Main points to be aware of include: 1. Importing Node.js built-in modules requires the `node:` specifier: ```js // ❌ import * as fs from "fs"; import * as http from "http"; // ✅ import * as fs from "node:fs"; import * as http from "node:http"; ``` :::tip It is recommended to change these import specifiers in your existing project anyway. This is the recommended way to import them in Node.js too. ::: 2.
Some [globals available in Node.js](#nodejs-global-objects) need to be explicitly imported, e.g. `Buffer`: ```js import { Buffer } from "node:buffer"; ``` 3. `require()` is only available in files with a `.cjs` extension; in other files, an instance of `require()` [needs to be created manually](#nodejs-global-objects). npm dependencies can use `require()` regardless of file extension. ### Running scripts Deno supports running npm scripts natively with the [`deno task`](/runtime/reference/cli/task_runner/) subcommand (if you're migrating from Node.js, this is similar to the `npm run` command). Consider the following Node.js project with a script called `start` inside its `package.json`: ```json title="package.json" { "name": "my-project", "scripts": { "start": "eslint" } } ``` You can execute this script with Deno by running: ```sh deno task start ``` ### Optional improvements One of Deno's core strengths is a unified toolchain that comes with support for TypeScript out of the box, and tools like a linter, formatter and a test runner. Switching to Deno allows you to simplify your toolchain and reduces the number of moving components in your project. **Configuration** Deno has its own config file, `deno.json` or `deno.jsonc`, which can be used to [configure your project](/runtime/fundamentals/configuration/). You can use it to [define dependencies](/runtime/fundamentals/configuration/) using the `imports` option - you can migrate your dependencies one-by-one from `package.json`, or elect to not define them in the config file at all and use `npm:` specifiers inline in your code. In addition to specifying dependencies, you can use `deno.json` to define tasks, lint and format options, path mappings, and other runtime configurations. **Linting** Deno ships with a built-in linter that is written with performance in mind. It's similar to ESLint, though with a limited number of rules.
If you don't rely on ESLint plugins, you can drop the `eslint` dependency from the `devDependencies` section of `package.json` and use `deno lint` instead. Deno can lint large projects in just a few milliseconds. You can try it out on your project by running: ```sh deno lint ``` This will lint all files in your project. When the linter detects a problem, it will show the line in your editor and in the terminal output. An example of what that might look like: ```sh error[no-constant-condition]: Use of a constant expressions as conditions is not allowed. --> /my-project/bar.ts:1:5 | 1 | if (true) { | ^^^^ = hint: Remove the constant expression docs: https://docs.deno.com/lint/rules/no-constant-condition Found 1 problem Checked 4 files ``` Many linting issues can be fixed automatically by passing the `--fix` flag: ```sh deno lint --fix ``` A full list of all supported linting rules can be found on [https://docs.deno.com/lint/](https://docs.deno.com/lint/). To learn more about how to configure the linter, check out the [`deno lint` subcommand](/runtime/reference/cli/linter/). **Formatting** Deno ships with a [built-in formatter](/runtime/reference/cli/formatter/) that can optionally format your code according to the Deno style guide. Instead of adding `prettier` to your `devDependencies`, you can use Deno's built-in zero-config code formatter, `deno fmt`. You can run the formatter on your project by running: ```sh deno fmt ``` If using `deno fmt` in CI, you can pass the `--check` argument to make the formatter exit with an error when it detects improperly formatted code. ```sh deno fmt --check ``` The formatting rules can be configured in your `deno.json` file. To learn more about how to configure the formatter, check out the [`deno fmt` subcommand](/runtime/reference/cli/formatter/). **Testing** Deno encourages writing tests for your code, and provides a built-in test runner to make it easy to write and run tests.
The test runner is tightly integrated into Deno, so that you don't have to do any additional configuration to make TypeScript or other features work. ```ts title="my_test.ts" Deno.test("my test", () => { // Your test code here }); ``` ```sh deno test ``` When passing the `--watch` flag, the test runner will automatically reload when any of the imported modules change. To learn more about the test runner and how to configure it, check out the [`deno test` subcommand](/runtime/reference/cli/test/) documentation. ## Private registries :::caution Not to be confused with [private repositories and modules](/runtime/fundamentals/modules/#private-repositories). ::: Deno supports private registries, which allow you to host and share your own modules. This is useful for organizations that want to keep their code private, or for individuals who want to share their code with a select group of people. ### What are private registries? Large organizations often host their own private npm registries to manage internal packages securely. These private registries serve as repositories where organizations can publish and store their proprietary or custom packages. Unlike public npm registries, private registries are accessible only to authorized users within the organization. ### How to use private registries with Deno First, configure your [`.npmrc`](https://docs.npmjs.com/cli/v10/configuring-npm/npmrc) file to point to your private registry. The `.npmrc` file must be in the project root or `$HOME` directory. Add the following to your `.npmrc` file: ```sh @mycompany:registry=http://mycompany.com:8111/ //mycompany.com:8111/:_auth=secretToken ``` Replace `http://mycompany.com:8111/` with the actual URL of your private registry and `secretToken` with your authentication token. Then update your `deno.json` or `package.json` to specify the import path for your private package.
For example: ```json title="deno.json" { "imports": { "@mycompany/package": "npm:@mycompany/package@1.0.0" } } ``` or if you're using a `package.json`: ```json title="package.json" { "dependencies": { "@mycompany/package": "1.0.0" } } ``` Now you can import your private package in your Deno code: ```typescript title="main.ts" import { hello } from "@mycompany/package"; console.log(hello()); ``` and run it using the `deno run` command: ```sh deno run main.ts ``` ## Node to Deno Cheatsheet | Node.js | Deno | | -------------------------------------- | ----------------------------- | | `node file.js` | `deno file.js` | | `ts-node file.ts` | `deno file.ts` | | `nodemon` | `deno run --watch` | | `node -e` | `deno eval` | | `npm i` / `npm install` | `deno install` | | `npm install -g` | `deno install -g` | | `npm run` | `deno task` | | `eslint` | `deno lint` | | `prettier` | `deno fmt` | | `package.json` | `deno.json` or `package.json` | | `tsc` | `deno check` ¹ | | `typedoc` | `deno doc` | | `jest` / `ava` / `mocha` / `tap` / etc | `deno test` | | `nexe` / `pkg` | `deno compile` | | `npm explain` | `deno info` | | `nvm` / `n` / `fnm` | `deno upgrade` | | `tsserver` | `deno lsp` | | `nyc` / `c8` / `istanbul` | `deno coverage` | | benchmarks | `deno bench` | ¹ Type checking happens automatically; the TypeScript compiler is built into the `deno` binary. --- # runtime/fundamentals/open_telemetry.md > Learn how to implement observability in Deno applications using OpenTelemetry. Covers tracing, metrics collection, and integration with monitoring systems. URL: https://docs.deno.com/runtime/fundamentals/open_telemetry :::caution The OpenTelemetry integration for Deno is still in development and may change. To use it, you must pass the `--unstable-otel` flag to Deno. ::: Deno has built-in support for [OpenTelemetry](https://opentelemetry.io/). > OpenTelemetry is a collection of APIs, SDKs, and tools.
Use it to instrument, > generate, collect, and export telemetry data (metrics, logs, and traces) to > help you analyze your software’s performance and behavior. > > - https://opentelemetry.io/ This integration enables you to monitor your Deno applications using OpenTelemetry observability tooling with instruments like logs, metrics, and traces. Deno provides the following features: - Exporting of collected metrics, traces, and logs to a server using the OpenTelemetry protocol. - [Automatic instrumentation](#auto-instrumentation) of the Deno runtime with OpenTelemetry metrics, traces, and logs. - [Collection of user defined metrics, traces, and logs](#user-metrics) created with the `npm:@opentelemetry/api` package. ## Quick start To enable the OpenTelemetry integration, run your Deno script with the `--unstable-otel` flag and set the environment variable `OTEL_DENO=true`: ```sh OTEL_DENO=true deno run --unstable-otel my_script.ts ``` This will automatically collect and export runtime observability data to an OpenTelemetry endpoint at `localhost:4318` using Protobuf over HTTP (`http/protobuf`). :::tip If you do not have an OpenTelemetry collector set up yet, you can get started with a [local LGTM stack in Docker](https://github.com/grafana/docker-otel-lgtm/tree/main?tab=readme-ov-file) (Loki (logs), Grafana (dashboard), Tempo (traces), and Prometheus (metrics)) by running the following command: ```sh docker run --name lgtm -p 3000:3000 -p 4317:4317 -p 4318:4318 --rm -ti \ -v "$PWD"/lgtm/grafana:/data/grafana \ -v "$PWD"/lgtm/prometheus:/data/prometheus \ -v "$PWD"/lgtm/loki:/data/loki \ -e GF_PATHS_DATA=/data/grafana \ docker.io/grafana/otel-lgtm:0.8.1 ``` You can then access the Grafana dashboard at `http://localhost:3000` with the username `admin` and password `admin`. ::: This will automatically collect and export runtime observability data like `console.log`, traces for HTTP requests, and metrics for the Deno runtime. 
[Learn more about auto instrumentation](#auto-instrumentation). You can also create your own metrics, traces, and logs using the `npm:@opentelemetry/api` package. [Learn more about user defined metrics](#user-metrics). ## Auto instrumentation Deno automatically collects and exports some observability data to the OTLP endpoint. This data is exported in the built-in instrumentation scope of the Deno runtime. This scope has the name `deno`. The version of the Deno runtime is the version of the `deno` instrumentation scope. (e.g. `deno:2.1.4`). ### Traces Deno automatically creates spans for various operations, such as: - Incoming HTTP requests served with `Deno.serve`. - Outgoing HTTP requests made with `fetch`. #### `Deno.serve` When you use `Deno.serve` to create an HTTP server, a span is created for each incoming request. The span automatically ends when response headers are sent (not when the response body is done sending). The name of the created span is `${method}`. The span kind is `server`. The following attributes are automatically added to the span on creation: - `http.request.method`: The HTTP method of the request. - `url.full`: The full URL of the request (as would be reported by `req.url`). - `url.scheme`: The scheme of the request URL (e.g. `http` or `https`). - `url.path`: The path of the request URL. - `url.query`: The query string of the request URL. After the request is handled, the following attributes are added: - `http.response.status_code`: The status code of the response. Deno does not automatically add a `http.route` attribute to the span as the route is not known by the runtime, and instead is determined by the routing logic in a user's handler function. If you want to add a `http.route` attribute to the span, you can do so in your handler function using `npm:@opentelemetry/api`. In this case you should also update the span name to include the route. 
```ts import { trace } from "npm:@opentelemetry/api@1"; const INDEX_ROUTE = new URLPattern({ pathname: "/" }); const BOOK_ROUTE = new URLPattern({ pathname: "/book/:id" }); Deno.serve(async (req) => { const span = trace.getActiveSpan(); if (INDEX_ROUTE.test(req.url)) { span.setAttribute("http.route", "/"); span.updateName(`${req.method} /`); // handle index route } else if (BOOK_ROUTE.test(req.url)) { span.setAttribute("http.route", "/book/:id"); span.updateName(`${req.method} /book/:id`); // handle book route } else { return new Response("Not found", { status: 404 }); } }); ``` #### `fetch` When you use `fetch` to make an HTTP request, a span is created for the request. The span automatically ends when the response headers are received. The name of the created span is `${method}`. The span kind is `client`. The following attributes are automatically added to the span on creation: - `http.request.method`: The HTTP method of the request. - `url.full`: The full URL of the request. - `url.scheme`: The scheme of the request URL. - `url.path`: The path of the request URL. - `url.query`: The query string of the request URL. After the response is received, the following attributes are added: - `http.status_code`: The status code of the response. ### Metrics The following metrics are automatically collected and exported: #### `Deno.serve` / `Deno.serveHttp` ##### `http.server.request.duration` A histogram of the duration of incoming HTTP requests served with `Deno.serve` or `Deno.serveHttp`. The time that is measured is from when the request is received to when the response headers are sent. This does not include the time to send the response body. The unit of this metric is seconds. The histogram buckets are `[0.005, 0.01, 0.025, 0.05, 0.075, 0.1, 0.25, 0.5, 0.75, 1.0, 2.5, 5.0, 7.5, 10.0]`. This metric is recorded with the following attributes: - `http.request.method`: The HTTP method of the request. - `url.scheme`: The scheme of the request URL. 
- `network.protocol.version`: The version of the HTTP protocol used for the request (e.g. `1.1` or `2`). - `server.address`: The address that the server is listening on. - `server.port`: The port that the server is listening on. - `http.response.status_code`: The status code of the response (if the request has been handled without a fatal error). - `error.type`: The type of error that occurred (if the request handling was subject to an error). ##### `http.server.active_requests` A gauge of the number of active requests being handled by `Deno.serve` or `Deno.serveHttp` at any given time. This is the number of requests that have been received but not yet responded to (where the response headers have not yet been sent). This metric is recorded with the following attributes: - `http.request.method`: The HTTP method of the request. - `url.scheme`: The scheme of the request URL. - `server.address`: The address that the server is listening on. - `server.port`: The port that the server is listening on. ##### `http.server.request.body.size` A histogram of the size of the request body of incoming HTTP requests served with `Deno.serve` or `Deno.serveHttp`. The unit of this metric is bytes. The histogram buckets are `[0, 100, 1000, 10000, 100000, 1000000, 10000000, 100000000, 1000000000]`. This metric is recorded with the following attributes: - `http.request.method`: The HTTP method of the request. - `url.scheme`: The scheme of the request URL. - `network.protocol.version`: The version of the HTTP protocol used for the request (e.g. `1.1` or `2`). - `server.address`: The address that the server is listening on. - `server.port`: The port that the server is listening on. - `http.response.status_code`: The status code of the response (if the request has been handled without a fatal error). - `error.type`: The type of error that occurred (if the request handling was subject to an error). 
##### `http.server.response.body.size` A histogram of the size of the response body of incoming HTTP requests served with `Deno.serve` or `Deno.serveHttp`. The unit of this metric is bytes. The histogram buckets are `[0, 100, 1000, 10000, 100000, 1000000, 10000000, 100000000, 1000000000]`. This metric is recorded with the following attributes: - `http.request.method`: The HTTP method of the request. - `url.scheme`: The scheme of the request URL. - `network.protocol.version`: The version of the HTTP protocol used for the request (e.g. `1.1` or `2`). - `server.address`: The address that the server is listening on. - `server.port`: The port that the server is listening on. - `http.response.status_code`: The status code of the response (if the request has been handled without a fatal error). - `error.type`: The type of error that occurred (if the request handling was subject to an error). ### Logs The following logs are automatically collected and exported: - Any logs created with `console.*` methods such as `console.log` and `console.error`. - Any logs created by the Deno runtime, such as debug logs, `Downloading` logs, and similar. - Any errors that cause the Deno runtime to exit (both from user code, and from the runtime itself). Logs raised from JavaScript code will be exported with the relevant span context, if the log occurred inside of an active span. `console` auto instrumentation can be configured using the `OTEL_DENO_CONSOLE` environment variable: - `capture`: Logs are emitted to stdout/stderr and are also exported with OpenTelemetry. (default) - `replace`: Logs are only exported with OpenTelemetry, and not emitted to stdout/stderr. - `ignore`: Logs are emitted only to stdout/stderr, and will not be exported with OpenTelemetry. ## User metrics In addition to the automatically collected telemetry data, you can also create your own metrics and traces using the `npm:@opentelemetry/api` package. 
You do not need to configure the `npm:@opentelemetry/api` package to use it with Deno. Deno sets up the `npm:@opentelemetry/api` package automatically when the `--unstable-otel` flag is passed. There is no need to call `metrics.setGlobalMeterProvider()`, `trace.setGlobalTracerProvider()`, or `context.setGlobalContextManager()`. All configuration of resources, exporter settings, etc. is done via environment variables. Deno works with version `1.x` of the `npm:@opentelemetry/api` package. You can either import directly from `npm:@opentelemetry/api@1`, or you can install the package locally with `deno add` and import from `@opentelemetry/api`. ```sh deno add npm:@opentelemetry/api@1 ``` For both traces and metrics, you need to define names for the tracer and meter respectively. If you are instrumenting a library, you should name the tracer or meter after the library (such as `my-awesome-lib`). If you are instrumenting an application, you should name the tracer or meter after the application (such as `my-app`). The version of the tracer or meter should be set to the version of the library or application. ### Traces To create a new span, first import the `trace` object and the `SpanStatusCode` enum from `npm:@opentelemetry/api` and create a new tracer: ```ts import { SpanStatusCode, trace } from "npm:@opentelemetry/api@1"; const tracer = trace.getTracer("my-app", "1.0.0"); ``` Then, create a new span using the `tracer.startActiveSpan` method and pass a callback function to it. You have to manually end the span by calling the `end` method on the span object returned by `startActiveSpan`. ```ts function myFunction() { return tracer.startActiveSpan("myFunction", (span) => { try { // do myFunction's work } catch (error) { span.recordException(error as Error); span.setStatus({ code: SpanStatusCode.ERROR, message: (error as Error).message, }); throw error; } finally { span.end(); } }); } ``` `span.end()` should be called in a `finally` block to ensure that the span is ended even if an error occurs.
`span.recordException` and `span.setStatus` should also be called in a `catch` block, to record any errors that occur. Inside of the callback function, the created span is the "active span". You can get the active span using `trace.getActiveSpan()`. The "active span" will be used as the parent span for any spans created (manually, or automatically by the runtime) inside of the callback function (or any functions that are called from the callback function). [Learn more about context propagation](#context-propagation). The `startActiveSpan` method returns the return value of the callback function. Spans can have attributes added to them during their lifetime. Attributes are key-value pairs that represent structured metadata about the span. Attributes can be added using the `setAttribute` and `setAttributes` methods on the span object. ```ts span.setAttribute("key", "value"); span.setAttributes({ success: true, "bar.count": 42n, "foo.duration": 123.45 }); ``` Values for attributes can be strings, numbers (floats), bigints (clamped to u64), booleans, or arrays of any of these types. If an attribute value is not one of these types, it will be ignored. The name of a span can be updated using the `updateName` method on the span object. ```ts span.updateName("new name"); ``` The status of a span can be set using the `setStatus` method on the span object. The `recordException` method can be used to record an exception that occurred during the span's lifetime. `recordException` creates an event with the exception stack trace and name and attaches it to the span.
**`recordException` does not set the span status to `ERROR`; you must do that manually.** ```ts import { SpanStatusCode } from "npm:@opentelemetry/api@1"; span.setStatus({ code: SpanStatusCode.ERROR, message: "An error occurred", }); span.recordException(new Error("An error occurred")); // or span.setStatus({ code: SpanStatusCode.OK, }); ``` Spans can also have [events](https://open-telemetry.github.io/opentelemetry-js/interfaces/_opentelemetry_api.Span.html#addEvent) and [links](https://open-telemetry.github.io/opentelemetry-js/interfaces/_opentelemetry_api.Span.html#addLink) added to them. Events are points in time that are associated with the span. Links are references to other spans. ```ts // Add an event to the span span.addEvent("button_clicked", { id: "submit-button", action: "submit", }); // Add an event with a timestamp span.addEvent("process_completed", { status: "success" }, Date.now()); ``` Events can include optional attributes similar to spans. They are useful for marking significant moments within the span's lifetime without creating separate spans. Spans can also be created manually with `tracer.startSpan` which returns a span object. This method does not set the created span as the active span, so it will not automatically be used as the parent span for any spans created later, or any `console.log` calls. A span can manually be set as the active span for a callback, by using the [context propagation API](#context-propagation). Both `tracer.startActiveSpan` and `tracer.startSpan` can take an optional options bag containing any of the following properties: - `kind`: The kind of the span. Can be `SpanKind.CLIENT`, `SpanKind.SERVER`, `SpanKind.PRODUCER`, `SpanKind.CONSUMER`, or `SpanKind.INTERNAL`. Defaults to `SpanKind.INTERNAL`. - `startTime`: A `Date` object representing the start time of the span, or a number representing the start time in milliseconds since the Unix epoch. If not provided, the current time will be used.
- `attributes`: An object containing attributes to add to the span. - `links`: An array of links to add to the span. - `root`: A boolean indicating whether the span should be a root span. If `true`, the span will not have a parent span (even if there is an active span). After the options bag, both `tracer.startActiveSpan` and `tracer.startSpan` can also take a `context` object from the [context propagation API](#context-propagation). Learn more about the full tracing API in the [OpenTelemetry JS API docs](https://open-telemetry.github.io/opentelemetry-js/classes/_opentelemetry_api.TraceAPI.html). ### Metrics To create a metric, first import the `metrics` object from `npm:@opentelemetry/api` and create a new meter: ```ts import { metrics } from "npm:@opentelemetry/api@1"; const meter = metrics.getMeter("my-app", "1.0.0"); ``` Then, an instrument can be created from the meter, and used to record values: ```ts const counter = meter.createCounter("my_counter", { description: "A simple counter", unit: "1", }); counter.add(1); counter.add(2); ``` Each recording can also have associated attributes: ```ts counter.add(1, { color: "red" }); counter.add(2, { color: "blue" }); ``` :::tip In OpenTelemetry, metric attributes should generally have low cardinality. This means that there should not be too many unique combinations of attribute values. For example, it is probably fine to have an attribute for which continent a user is on, but it would be too high cardinality to have an attribute for the exact latitude and longitude of the user. High cardinality attributes can cause problems with metric storage and exporting, and should be avoided. Use spans and logs for high cardinality data. ::: There are several types of instruments that can be created with a meter: - **Counter**: A counter is a monotonically increasing value. Counters can only be positive. They can be used for values that are always increasing, such as the number of requests handled. 
- **UpDownCounter**: An up-down counter is a value that can both increase and decrease. Up-down counters can be used for values that can increase and decrease, such as the number of active connections or requests in progress. - **Gauge**: A gauge is a value that can be set to any value. They are used for values that do not "accumulate" over time, but rather have a specific value at any given time, such as the current temperature. - **Histogram**: A histogram is a value that is recorded as a distribution of values. Histograms can be used for values that are not just a single number, but a distribution of numbers, such as the response time of a request in milliseconds. Histograms can be used to calculate percentiles, averages, and other statistics. They have a predefined set of boundaries that define the buckets that the values are placed into. By default, the boundaries are `[0.0, 5.0, 10.0, 25.0, 50.0, 75.0, 100.0, 250.0, 500.0, 750.0, 1000.0, 2500.0, 5000.0, 7500.0, 10000.0]`. There are also several types of observable instruments. These instruments do not have a synchronous recording method, but instead return a callback that can be called to record a value. The callback will be called when the OpenTelemetry SDK is ready to record a value, for example just before exporting. ```ts const counter = meter.createObservableCounter("my_counter", { description: "A simple counter", unit: "1", }); counter.addCallback((res) => { res.observe(1); // or res.observe(1, { color: "red" }); }); ``` There are three types of observable instruments: - **ObservableCounter**: An observable counter is a counter that can be observed asynchronously. It can be used for values that are always increasing, such as the number of requests handled. - **ObservableUpDownCounter**: An observable up-down counter is a value that can both increase and decrease, and can be observed asynchronously. 
Up-down counters can be used for values that can increase and decrease, such as the number of active connections or requests in progress. - **ObservableGauge**: An observable gauge is a value that can be set to any value, and can be observed asynchronously. They are used for values that do not "accumulate" over time, but rather have a specific value at any given time, such as the current temperature. Learn more about the full metrics API in the [OpenTelemetry JS API docs](https://open-telemetry.github.io/opentelemetry-js/classes/_opentelemetry_api.MetricsAPI.html). ### Practical Examples For practical examples of implementing OpenTelemetry in Deno applications, see our tutorials: - [Basic OpenTelemetry Tutorial](/examples/basic_opentelemetry_tutorial/) - A simple HTTP server with custom metrics and traces - [Distributed Tracing Tutorial](/examples/otel_span_propagation_tutorial/) - Advanced techniques for tracing across service boundaries ## Context propagation In OpenTelemetry, context propagation is the process of passing some context information (such as the current span) from one part of an application to another, without having to pass it explicitly as an argument to every function. In Deno, context propagation is done using the rules of `AsyncContext`, the TC39 proposal for async context propagation. The `AsyncContext` API is not yet exposed to users in Deno, but it is used internally to propagate the active span and other context information across asynchronous boundaries. A quick overview of how AsyncContext propagation works: - When a new asynchronous task is started (such as a promise, or a timer), the current context is saved. - Then some other code can execute concurrently with the asynchronous task, in a different context. - When the asynchronous task completes, the saved context is restored.
This means that async context propagation essentially behaves like a global variable that is scoped to the current asynchronous task, and is automatically copied to any new asynchronous tasks that are started from this current task. The `context` API from `npm:@opentelemetry/api@1` exposes this functionality to users. It works as follows: ```ts import { context } from "npm:@opentelemetry/api@1"; // Get the currently active context const currentContext = context.active(); // You can create a new context with a value added to it const newContext = currentContext.setValue("id", 1); // The current context is not changed by calling setValue console.log(currentContext.getValue("id")); // undefined // You can run a function inside a new context context.with(newContext, () => { // Any code in this block will run with the new context console.log(context.active().getValue("id")); // 1 // The context is also available in any functions called from this block function myFunction() { return context.active().getValue("id"); } console.log(myFunction()); // 1 // And it is also available in any asynchronous callbacks scheduled from here setTimeout(() => { console.log(context.active().getValue("id")); // 1 }, 10); }); // Outside, the context is still the same console.log(context.active().getValue("id")); // undefined ``` The context API integrates with spans too. For example, to run a function in the context of a specific span, the span can be added to a context, and then the function can be run in that context: ```ts import { context, trace } from "npm:@opentelemetry/api@1"; const tracer = trace.getTracer("my-app", "1.0.0"); const span = tracer.startSpan("myFunction"); const contextWithSpan = trace.setSpan(context.active(), span); context.with(contextWithSpan, () => { const activeSpan = trace.getActiveSpan(); console.log(activeSpan === span); // true }); // Don't forget to end the span! 
span.end(); ``` Learn more about the full context API in the [OpenTelemetry JS API docs](https://open-telemetry.github.io/opentelemetry-js/classes/_opentelemetry_api.ContextAPI.html). ## Configuration The OpenTelemetry integration can be enabled by setting the `OTEL_DENO=true` environment variable. The endpoint and protocol for the OTLP exporter can be configured using the `OTEL_EXPORTER_OTLP_ENDPOINT` and `OTEL_EXPORTER_OTLP_PROTOCOL` environment variables. If the endpoint requires authentication, headers can be configured using the `OTEL_EXPORTER_OTLP_HEADERS` environment variable. Endpoints can be overridden individually for metrics, traces, and logs by using signal-specific environment variables, such as: - `OTEL_EXPORTER_OTLP_METRICS_ENDPOINT` - `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT` - `OTEL_EXPORTER_OTLP_LOGS_ENDPOINT` For more information on headers that can be used to configure the OTLP exporter, [see the OpenTelemetry website](https://opentelemetry.io/docs/specs/otel/protocol/exporter/#configuration-options). The resource that is associated with the telemetry data can be configured using the `OTEL_SERVICE_NAME` and `OTEL_RESOURCE_ATTRIBUTES` environment variables. In addition to attributes set via the `OTEL_RESOURCE_ATTRIBUTES` environment variable, the following attributes are automatically set: - `service.name`: If `OTEL_SERVICE_NAME` is not set, the value is set to ``. - `process.runtime.name`: `deno` - `process.runtime.version`: The version of the Deno runtime. - `telemetry.sdk.name`: `deno-opentelemetry` - `telemetry.sdk.language`: `deno-rust` - `telemetry.sdk.version`: The version of the Deno runtime, plus the version of the `opentelemetry` Rust crate being used by Deno, separated by a `-`. Propagators can be configured using the `OTEL_PROPAGATORS` environment variable. The default value is `tracecontext,baggage`. Multiple propagators can be specified by separating them with commas.
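For example, enabling only the W3C Trace Context propagator (the script name is a placeholder) might look like:

```sh
# Use only the tracecontext propagator, disabling baggage
OTEL_DENO=true OTEL_PROPAGATORS=tracecontext deno run --unstable-otel main.ts
```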
Currently supported propagators are: - `tracecontext`: W3C Trace Context propagation format - `baggage`: W3C Baggage propagation format Metric collection frequency can be configured using the `OTEL_METRIC_EXPORT_INTERVAL` environment variable. The default value is `60000` milliseconds (60 seconds). Span exporter batching can be configured using the batch span processor environment variables described in the [OpenTelemetry specification](https://opentelemetry.io/docs/specs/otel/configuration/sdk-environment-variables/#batch-span-processor). Log exporter batching can be configured using the batch log record processor environment variables described in the [OpenTelemetry specification](https://opentelemetry.io/docs/specs/otel/configuration/sdk-environment-variables/#batch-log-record-processor). ## Propagators Deno supports context propagators which enable automatic propagation of trace context across process boundaries for distributed tracing, allowing you to track requests as they flow through different services. Propagators are responsible for encoding and decoding context information (like trace and span IDs) into and from carrier formats (like HTTP headers). This enables the trace context to be maintained across service boundaries. By default, Deno supports the following propagators: - `tracecontext`: The W3C Trace Context propagation format, which is the standard way to propagate trace context via HTTP headers. - `baggage`: The W3C Baggage propagation format, which allows passing key-value pairs across service boundaries. :::note These propagators automatically work with Deno's `fetch` API and `Deno.serve`, enabling end-to-end tracing across HTTP requests without manual context management. 
::: You can access the propagation API through the `@opentelemetry/api` package: ```ts import { context, propagation, trace } from "npm:@opentelemetry/api@1"; // Extract context from incoming headers function extractContextFromHeaders(headers: Headers) { const ctx = context.active(); return propagation.extract(ctx, headers); } // Inject context into outgoing headers function injectContextIntoHeaders(headers: Headers) { const ctx = context.active(); propagation.inject(ctx, headers); return headers; } // Example: Making a fetch request that propagates trace context async function tracedFetch(url: string) { const headers = new Headers(); injectContextIntoHeaders(headers); return await fetch(url, { headers }); } ``` ## Limitations While the OpenTelemetry integration for Deno is in development, there are some limitations to be aware of: - Traces are always sampled (i.e. `OTEL_TRACE_SAMPLER=parentbased_always_on`). - Traces only support links with no attributes. - Metric exemplars are not supported. - Custom log streams (e.g. logs other than `console.log` and `console.error`) are not supported. - The only supported exporter is OTLP - other exporters are not supported. - Only `http/protobuf` and `http/json` protocols are supported for OTLP. Other protocols such as `grpc` are not supported. - Metrics from observable (asynchronous) meters are not collected on process exit/crash, so the last value of metrics may not be exported. Synchronous metrics are exported on process exit/crash. - The limits specified in the `OTEL_ATTRIBUTE_VALUE_LENGTH_LIMIT`, `OTEL_ATTRIBUTE_COUNT_LIMIT`, `OTEL_SPAN_EVENT_COUNT_LIMIT`, `OTEL_SPAN_LINK_COUNT_LIMIT`, `OTEL_EVENT_ATTRIBUTE_COUNT_LIMIT`, and `OTEL_LINK_ATTRIBUTE_COUNT_LIMIT` environment variables are not respected for trace spans. - The `OTEL_METRIC_EXPORT_TIMEOUT` environment variable is not respected.
- HTTP methods that are not known are not normalized to `_OTHER` in the `http.request.method` span attribute as per the OpenTelemetry semantic conventions. - The HTTP server span for `Deno.serve` does not have an OpenTelemetry status set, and if the handler throws (i.e. `onError` is invoked), the span will not have an error status set and the error will not be attached to the span via an event. - There is no mechanism to add an `http.route` attribute to the HTTP client span for `fetch`, or to update the span name to include the route. --- # Security and permissions > A guide to Deno's security model and permissions system. Learn about secure defaults, permission flags, runtime prompts, and how to safely execute code with granular access controls. URL: https://docs.deno.com/runtime/fundamentals/security Deno is secure by default. Unless you specifically enable it, a program run with Deno has no access to sensitive APIs, such as file system access, network connectivity, or environment access. You must explicitly grant access to these resources with command line flags or with a runtime permission prompt. This is a major difference from Node, where dependencies are automatically granted full access to all system I/O, potentially introducing hidden vulnerabilities into your project. Before using Deno to run completely untrusted code, read the [section on executing untrusted code](#executing-untrusted-code) below. ## Key Principles Before diving into the specifics of permissions, it's important to understand the key principles of Deno's security model: - **No access to I/O by default**: Code executing in a Deno runtime has no access to read or write arbitrary files on the file system, to make network requests or open network listeners, to access environment variables, or to spawn subprocesses.
- **No limits on the execution of code at the same privilege level**: Deno allows the execution of any code (JS/TS/Wasm) via multiple means, including `eval`, `new Function`, dynamic imports and web workers at the same privilege level with little restriction as to where the code originates (network, npm, JSR, etc). - **Multiple invocations of the same application can share data**: Deno provides a mechanism for multiple invocations of the same application to share data, through built-in caching and KV storage APIs. Different applications can not see each other's data. - **All code executing on the same thread shares the same privilege level**: All code executing on the same thread shares the same privilege level. It is not possible for different modules to have different privilege levels within the same thread. - **Code can not escalate its privileges without user consent**: Code executing in a Deno runtime can not escalate its privileges without the user agreeing explicitly to an escalation via an interactive prompt or an invocation-time flag. - **The initial static module graph can import local files without restrictions**: All files that are imported in the initial static module graph can be imported without restrictions, even if an explicit read permission has not been granted for those files. This does not apply to any dynamic module imports. These key principles are designed to provide an environment where a user can execute code with minimal risk of harm to the host machine or network. The security model is designed to be simple to understand and to provide a clear separation of concerns between the runtime and the code executing within it. The security model is enforced by the Deno runtime, and is not dependent on the underlying operating system. ## Permissions By default, access to most system I/O is denied. There are some I/O operations that are allowed in a limited capacity, even by default. These are described below.
To enable these operations, the user must explicitly grant permission to the Deno runtime. This is done by passing the `--allow-read`, `--allow-write`, `--allow-net`, `--allow-env`, and `--allow-run` flags to the `deno` command. During execution of a script, a user can also explicitly grant permission to specific files, directories, network addresses, environment variables, and subprocesses when prompted by the runtime. Prompts are not shown if stdout/stderr are not a TTY, or when the `--no-prompt` flag is passed to the `deno` command. Users can also explicitly disallow access to specific resources by using the `--deny-read`, `--deny-write`, `--deny-net`, `--deny-env`, and `--deny-run` flags. These flags take precedence over the allow flags. For example, if you allow network access but deny access to a specific domain, the deny flag will take precedence. Deno also provides a `--allow-all` flag that grants all permissions to the script. This **disables** the security sandbox entirely, and should be used with caution. The `--allow-all` flag has the same security properties as running a script in Node.js (i.e. none). Definition: `-A, --allow-all` ```sh deno run -A script.ts deno run --allow-all script.ts ``` By default, Deno will not generate a stack trace for permission requests as generating them incurs a performance cost. Users can enable stack traces with the `DENO_TRACE_PERMISSIONS` environment variable. ### File system access By default, executing code can not read or write arbitrary files on the file system. This includes listing the contents of directories, checking for the existence of a given file, and opening or connecting to Unix sockets. Access to read files is granted using the `--allow-read` (or `-R`) flag, and access to write files is granted using the `--allow-write` (or `-W`) flag. These flags can be specified with a list of paths to allow access to specific files or directories and any subdirectories in them.
Definition: `--allow-read[=...]` or `-R[=...]` ```sh # Allow all reads from file system deno run -R script.ts # or deno run --allow-read script.ts # Allow reads from file foo.txt and bar.txt only deno run --allow-read=foo.txt,bar.txt script.ts # Allow reads from any file in any subdirectory of ./node_modules deno run --allow-read=node_modules script.ts ``` Definition: `--deny-read[=...]` ```sh # Allow reading files in /etc but disallow reading /etc/hosts deno run --allow-read=/etc --deny-read=/etc/hosts script.ts # Deny all read access to disk, disabling permission prompts for reads. deno run --deny-read script.ts ``` Definition: `--allow-write[=...]` or `-W[=...]` ```sh # Allow all writes to file system deno run -W script.ts # or deno run --allow-write script.ts # Allow writes to file foo.txt and bar.txt only deno run --allow-write=foo.txt,bar.txt script.ts ``` Definition: `--deny-write[=...]` ```sh # Allow reading files in current working directory # but disallow writing to ./secrets directory. deno run --allow-write=./ --deny-write=./secrets script.ts # Deny all write access to disk, disabling permission prompts. deno run --deny-write script.ts ``` Some APIs in Deno are implemented using file system operations under the hood, even though they do not provide direct read/write access to specific files. These APIs read and write to disk but do not require any explicit read/write permissions. Some examples of these APIs are: - `localStorage` - Deno KV - `caches` - `Blob` Because these APIs are implemented using file system operations, users can use them to consume file system resources like storage space, even if they do not have direct access to the file system. During module loading, Deno can load files from disk. This sometimes requires explicit permissions, and sometimes is allowed by default: - All files that are imported from the entrypoint module in a way that they can be statically analyzed are allowed to be read by default. 
This includes static `import` statements and dynamic `import()` calls where the argument is a string literal that points to a specific file or a directory of files. The full list of these files can be printed using `deno info <entrypoint>`. - Files that are dynamically imported in a way that can not be statically analyzed require runtime read permissions. - Files inside of a `node_modules/` directory are allowed to be read by default. When fetching modules from the network, or when transpiling code from TypeScript to JavaScript, Deno uses the file system as a cache. This means that file system resources like storage space can be consumed by Deno even if the user has not explicitly granted read/write permissions. ### Network access By default, executing code can not make network requests, open network listeners or perform DNS resolution. This includes making HTTP requests, opening TCP/UDP sockets, and listening for incoming connections on TCP or UDP. Network access is granted using the `--allow-net` flag. This flag can be specified with a list of IP addresses or hostnames to allow access to specific network addresses. Definition: `--allow-net[=...]` or `-N[=...]` ```sh # Allow network access deno run -N script.ts # or deno run --allow-net script.ts # Allow network access to github.com and jsr.io deno run --allow-net=github.com,jsr.io script.ts # A hostname at port 80: deno run --allow-net=example.com:80 script.ts # An IPv4 address on port 443 deno run --allow-net=1.1.1.1:443 script.ts # An IPv6 address, all ports allowed deno run --allow-net=[2606:4700:4700::1111] script.ts ``` Definition: `--deny-net[=...]` ```sh # Allow access to network, but deny access # to github.com and jsr.io deno run --allow-net --deny-net=github.com,jsr.io script.ts # Deny all network access, disabling permission prompts. deno run --deny-net script.ts ``` During module loading, Deno can load modules from the network.
By default Deno allows loading modules from the following locations using both static and dynamic imports, without requiring explicit network access: - `https://deno.land/` - `https://jsr.io/` - `https://esm.sh/` - `https://raw.githubusercontent.com` - `https://gist.githubusercontent.com` These locations are trusted "public good" registries that are not expected to enable data exfiltration through URL paths. You can add more trusted registries using the `--allow-import` flag. In addition, Deno allows importing any NPM package through `npm:` specifiers. Deno also sends requests to `https://dl.deno.land/` at most once a day to check for updates to the Deno CLI. This can be disabled using the `DENO_NO_UPDATE_CHECK=1` environment variable. ### Environment variables By default, executing code can not read or write environment variables. This includes reading environment variables, and setting new values. Access to environment variables is granted using the `--allow-env` flag. This flag can be specified with a list of environment variables to allow access to specific environment variables. Starting with Deno v2.1, you can now specify suffix wildcards to allow “scoped” access to environment variables. Definition: `--allow-env[=...]` or `-E[=...]` ```sh # Allow access to all environment variables deno run -E script.ts # or deno run --allow-env script.ts # Allow HOME and FOO environment variables deno run --allow-env=HOME,FOO script.ts # Allow access to all environment variables starting with AWS_ deno run --allow-env="AWS_*" script.ts ``` Definition: `--deny-env[=...]` ```sh # Allow all environment variables except # AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. deno run \ --allow-env \ --deny-env=AWS_ACCESS_KEY_ID,AWS_SECRET_ACCESS_KEY \ script.ts # Deny all access to env variables, disabling permission prompts. 
deno run --deny-env script.ts ``` > Note for Windows users: environment variables are case insensitive on Windows, > so Deno also matches them case insensitively (on Windows only). Deno reads certain environment variables on startup, such as `DENO_DIR` and `NO_COLOR` ([see the full list](/runtime/reference/cli/env_variables/)). The value of the `NO_COLOR` environment variable is visible to all code running in the Deno runtime, regardless of whether the code has been granted permission to read environment variables. ### System Information By default, executing code can not access system information, such as the operating system release, system uptime, load average, network interfaces, and system memory information. Access to system information is granted using the `--allow-sys` flag. This flag can be specified with a list of allowed interfaces from the following set: `hostname`, `osRelease`, `osUptime`, `loadavg`, `networkInterfaces`, `systemMemoryInfo`, `uid`, and `gid`. These strings map to functions in the `Deno` namespace that provide OS info, like [Deno.systemMemoryInfo](https://docs.deno.com/api/deno/~/Deno.SystemMemoryInfo). Definition: `--allow-sys[=...]` or `-S[=...]` ```sh # Allow all system information APIs deno run -S script.ts # or deno run --allow-sys script.ts # Allow systemMemoryInfo and osRelease APIs deno run --allow-sys="systemMemoryInfo,osRelease" script.ts ``` Definition: `--deny-sys[=...]` ```sh # Allow accessing all system information except "networkInterfaces" deno run --allow-sys --deny-sys="networkInterfaces" script.ts # Deny all access to system information, disabling permission prompts. deno run --deny-sys script.ts ``` ### Subprocesses Code executing inside of a Deno runtime can not spawn subprocesses by default, as this would constitute a violation of the principle that code can not escalate its privileges without user consent. Deno provides a mechanism for executing subprocesses, but this requires explicit permission from the user.
This is done using the `--allow-run` flag. Any subprocesses you spawn from your program run independently of the permissions granted to the parent process. This means the child processes can access system resources regardless of the permissions you granted to the Deno process that spawned it. This is often referred to as privilege escalation. Because of this, make sure you carefully consider if you want to grant a program `--allow-run` access: it essentially invalidates the Deno security sandbox. If you really need to spawn a specific executable, you can reduce the risk by limiting which programs a Deno process can start by passing specific executable names to the `--allow-run` flag. Definition: `--allow-run[=...]` ```sh # Allow running all subprocesses deno run --allow-run script.ts # Allow running "curl" and "whoami" subprocesses deno run --allow-run="curl,whoami" script.ts ``` :::caution You probably don't ever want to use `--allow-run=deno` unless the parent process has `--allow-all`, as being able to spawn a `deno` process means the script can spawn another `deno` process with full permissions. ::: Definition: `--deny-run[=...]` ```sh # Allow running all programs except "whoami" and "ps". deno run --allow-run --deny-run="whoami,ps" script.ts # Deny all access to spawning subprocesses, disabling # permission prompts. deno run --deny-run script.ts ``` By default `npm` packages will not have their post-install scripts executed during installation (like with `deno install`), as this would allow arbitrary code execution. When running with the `--allow-scripts` flag, post-install scripts for npm packages will be executed as a subprocess. ### FFI (Foreign Function Interface) Deno provides an [FFI mechanism for executing code written in other languages](/runtime/fundamentals/ffi/), such as Rust, C, or C++, from within a Deno runtime. This is done using the `Deno.dlopen` API, which can load shared libraries and call functions from them.
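As a sketch (the library path and exported symbol are hypothetical, for illustration only), loading a function from a shared library might look like:

```ts
// Hypothetical shared library; run with:
//   deno run --allow-ffi=./libfoo.so script.ts
const lib = Deno.dlopen("./libfoo.so", {
  // Assumes the library exports: int add(int a, int b);
  add: { parameters: ["i32", "i32"], result: "i32" },
});

console.log(lib.symbols.add(1, 2));
lib.close(); // release the library handle when done
```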
By default, executing code can not use the `Deno.dlopen` API, as this would constitute a violation of the principle that code can not escalate its privileges without user consent. In addition to `Deno.dlopen`, FFI can also be used via Node-API (NAPI) native addons. These are also not allowed by default. Both `Deno.dlopen` and NAPI native addons require explicit permission using the `--allow-ffi` flag. This flag can be specified with a list of files or directories to allow access to specific dynamic libraries. _Like subprocesses, dynamic libraries are not run in a sandbox and therefore do not have the same security restrictions as the Deno process they are being loaded into. Therefore, use with extreme caution._ Definition: `--allow-ffi[=...]` ```sh # Allow loading all dynamic libraries deno run --allow-ffi script.ts # Allow loading dynamic libraries from a specific path deno run --allow-ffi=./libfoo.so script.ts ``` Definition: `--deny-ffi[=...]` ```sh # Allow loading all dynamic libraries except ./libfoo.so deno run --allow-ffi --deny-ffi=./libfoo.so script.ts # Deny loading all dynamic libraries, disabling permission prompts. deno run --deny-ffi script.ts ``` ### Importing from the Web Allow importing code from the Web. By default Deno limits the hosts you can import code from. This is true for both static and dynamic imports. If you want to dynamically import code, either using the `import()` or the `new Worker()` APIs, additional permissions need to be granted. Importing from the local file system [requires `--allow-read`](#file-system-read-access), but Deno also allows importing from `http:` and `https:` URLs.
In such cases, you will need to specify an explicit `--allow-import` flag: ``` # allow importing code from `https://example.com` $ deno run --allow-import=example.com main.ts ``` By default Deno allows importing sources from the following hosts: - `deno.land` - `esm.sh` - `jsr.io` - `cdn.jsdelivr.net` - `raw.githubusercontent.com` - `gist.githubusercontent.com` **Imports are only allowed using HTTPS** This allow list is applied to static imports by default, and to dynamic imports when the `--allow-import` flag is specified. ``` # allow dynamically importing code from `https://deno.land` $ deno run --allow-import main.ts ``` Note that specifying an allow list for `--allow-import` will override the list of default hosts. ## Evaluation of code Deno sets no limits on the execution of code at the same privilege level. This means that code executing in a Deno runtime can use `eval`, `new Function`, or even dynamic import or web workers to execute **arbitrary** code with the same privilege level as the code that called `eval`, `new Function`, or the dynamic import or web worker. This code can be hosted on the network, be in a local file (if read permissions are granted), or be stored as plain text in a string inside of the code that called `eval`, `new Function`, or the dynamic import or web worker. ## Executing untrusted code While Deno provides security features that are designed to protect the host machine and network from harm, untrusted code is still scary. When executing untrusted code, it is important to have more than one layer of defense. Some suggestions for executing untrusted code are outlined below, and we recommend using all of these when executing arbitrary untrusted code: - Run `deno` with limited permissions and determine upfront what code actually needs to run (and prevent more code being loaded using `--frozen` lockfile and `--cached-only`). - Use OS-provided sandboxing mechanisms like `chroot`, `cgroups`, `seccomp`, etc.
- Use a sandboxed environment like a VM or MicroVM (gVisor, Firecracker, etc). --- # Stability and releases > Guide to Deno's stability guarantees and release process. Covering release channels, long-term support (LTS), unstable features, versioning policy, and how Deno maintains backward compatibility. URL: https://docs.deno.com/runtime/fundamentals/stability_and_releases As of Deno 1.0.0, the `Deno` namespace APIs are stable. That means we will strive to make code working under 1.0.0 continue to work in future versions. ## Release schedule, channels, and long term support Deno releases a new stable, minor version (e.g. v2.1.0, v2.0.0) on a 12 week schedule. Patch releases including bug fixes for the latest minor version are released as needed - you can expect several patch releases before a new minor version is released. ### Release channels Deno offers 4 release channels: - `stable` - a semver minor/patch release, as described above. This is **the default** distribution channel that is recommended for most users. - `lts` - long term support for a particular stable release, recommended for enterprise users who prefer not to upgrade so often. See below for details. - `rc` - a release candidate for the upcoming semver minor release. - `canary` - an unstable release that changes multiple times per day, and lets you try out the latest bug fixes and new features that might end up in the `stable` channel. ### Long Term Support (LTS) Starting with Deno v2.1.0 (released in November 2024), Deno offers an LTS (long-term support) channel. An LTS channel is a minor semver version that we maintain with only backwards-compatible bug fixes. | LTS release version | LTS maintenance start | LTS maintenance end | | ------------------- | --------------------- | ------------------- | | v2.1 | Feb 1st, 2025 | Apr 30th, 2025 | | v2.2 | May 1st, 2025 | Oct 31st, 2025 | | v2.4 | Nov 1st, 2025 | Apr 30th, 2026 | We are initially keeping the LTS support window short while we refine the process.
**LTS releases occur every six months**, with patch releases as needed for bug fixes. We plan to extend this support window to one year in the future. LTS backports include: - Security patches - Critical bug fixes (e.g., crashes, incorrect computations) - **Critical** performance improvements _may_ be backported based on severity. **API changes and major new features will not be backported.** ## Unstable APIs When introducing new APIs, these are first marked as unstable. This means that the API may change in the future. These APIs are not available to use unless you explicitly pass an unstable flag, like `--unstable-kv`. [Learn more about `--unstable-*` flags](/runtime/reference/cli/unstable_flags). There are also some non-runtime features of Deno that are considered unstable, and are locked behind unstable flags. For example, the `--unstable-sloppy-imports` flag is used to enable `import`ing code without specifying file extensions. ## Standard library The Deno Standard Library (https://jsr.io/@std) is mostly stable. All standard library modules that are version 1.0.0 or higher are considered stable. All other modules (0.x) are considered unstable, and may change in the future. Using unstable standard library modules is not recommended for production code, but it is a great way to experiment with new features and provide feedback to the Deno team. It is not necessary to use any unstable flags to use unstable standard library modules. --- # Standard Library > An introduction to Deno's Standard Library. Learn about TypeScript-first modules, cross-platform compatibility, versioning, package management, and how to use standard modules in your Deno projects. URL: https://docs.deno.com/runtime/fundamentals/standard_library Deno provides a standard library written in TypeScript. It is a set of standard modules that can be reused by programs, allowing you to focus on your application logic rather than "reinventing the wheel" for common tasks. 
All of the modules in the Deno Standard Library are audited by the core team and are guaranteed to work with Deno, ensuring consistency and reliability. Many packages in the Deno Standard Library are also compatible with Node.js, Cloudflare Workers, and other JavaScript environments. This allows you to write code that can be run in multiple environments without modification. The standard library is hosted on JSR and is available at: [https://jsr.io/@std](https://jsr.io/@std). Packages are documented, tested, and include usage examples. You can browse the full list of standard library packages on JSR, but here are a few examples: - [@std/path](https://jsr.io/@std/path): Path manipulation utilities, akin to Node.js's `path` module. - [@std/jsonc](https://jsr.io/@std/jsonc): (De)serialization of JSON with comments - [@std/encoding](https://jsr.io/@std/encoding): Utilities for encoding and decoding common formats like hex, base64, and varint ## Versioning and stability Each package of the standard library is independently versioned. Packages follow [semantic versioning rules](https://jsr.io/@std/semver). You can use [version pinning or version ranges](/runtime/fundamentals/modules/#package-versions) to prevent major releases from affecting your code. ## Importing standard library modules To install packages from the Deno Standard Library, you can use the `deno add` subcommand to add the package to your `deno.json` import map.
```sh deno add jsr:@std/fs jsr:@std/path ``` The `deno.json` `imports` field will be updated to include those imports: ```json { "imports": { "@std/fs": "jsr:@std/fs@^1.0.2", "@std/path": "jsr:@std/path@^1.0.3" } } ``` You can then import these packages in your source code: ```ts import { copy } from "@std/fs"; import { join } from "@std/path"; await copy("foo.txt", join("dist", "foo.txt")); ``` Alternatively, you can import modules directly with the `jsr:` specifier: ```js import { copy } from "jsr:@std/fs@^1.0.2"; import { join } from "jsr:@std/path@^1.0.3"; await copy("foo.txt", join("dist", "foo.txt")); ``` ## Node.js compatibility The Deno Standard Library is designed to be compatible with Node.js, Cloudflare Workers, and other JavaScript environments. The standard library is written in TypeScript and compiled to JavaScript, so it can be used in any JavaScript environment. ```sh npx jsr add @std/fs @std/path ``` Running this command will add those packages to your `package.json`: ```json { "dependencies": { "@std/fs": "npm:@jsr/std__fs@^1.0.2", "@std/path": "npm:@jsr/std__path@^1.0.3" } } ``` Then you can import them in your source code, just like you would with any other Node.js package. TypeScript will automatically find the type definitions for these packages. ```ts import { copy } from "@std/fs"; import { join } from "@std/path"; await copy("foo.txt", join("dist", "foo.txt")); ``` --- # Testing > A guide to Deno's testing capabilities. Learn about the built-in test runner, assertions, mocking, coverage reporting, snapshot testing, and how to write effective tests for your Deno applications. URL: https://docs.deno.com/runtime/fundamentals/testing Deno provides a built-in test runner for writing and running tests in both JavaScript and TypeScript. This makes it easy to ensure your code is reliable and functions as expected without needing to install any additional dependencies or tools. 
The `deno test` runner gives you fine-grained control over permissions for each test, ensuring that code does not do anything unexpected. In addition to the built-in test runner, you can also use other test runners from the JS ecosystem, such as Jest, Mocha, or AVA, with Deno. We will not cover these in this document, however. ## Writing Tests To define a test in Deno, you use the `Deno.test()` function. Here are some examples: ```ts title="my_test.ts" import { assertEquals } from "jsr:@std/assert"; Deno.test("simple test", () => { const x = 1 + 2; assertEquals(x, 3); }); import { delay } from "jsr:@std/async"; Deno.test("async test", async () => { const x = 1 + 2; await delay(100); assertEquals(x, 3); }); Deno.test({ name: "read file test", fn: () => { const data = Deno.readTextFileSync("./somefile.txt"); assertEquals(data, "expected content"); }, }); ``` If you prefer a "jest-like" `expect` style of assertions, the Deno standard library provides an [`expect`](https://jsr.io/@std/expect) function that can be used in place of `assertEquals`: ```ts title="my_test.ts" import { expect } from "jsr:@std/expect"; import { add } from "./add.js"; Deno.test("add function adds two numbers correctly", () => { const result = add(2, 3); expect(result).toBe(5); }); ``` ## Running Tests To run your tests, use the [`deno test`](/runtime/reference/cli/test/) subcommand. If run without a file name or directory name, this subcommand will automatically find and execute all tests in the current directory (recursively) that match the glob `{*_,*.,}test.{ts, tsx, mts, js, mjs, jsx}`.
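As a rough illustration (not part of the Deno CLI), the default test-file glob can be approximated with a regular expression — the `testFile` pattern below is an approximation written for this example, not something Deno exposes:

```typescript
// Approximation of the default `deno test` file glob as a regular
// expression: an optional prefix ending in `_` or `.`, then `test`,
// then one of the supported extensions.
const testFile = /(^|[\\/])(.*[_.])?test\.(ts|tsx|mts|js|mjs|jsx)$/;

console.log(testFile.test("my_test.ts")); // true
console.log(testFile.test("util.test.js")); // true
console.log(testFile.test("test.ts")); // true
console.log(testFile.test("main.ts")); // false
```

In other words, files named `test.*`, `*_test.*`, or `*.test.*` with a JavaScript or TypeScript extension are picked up automatically.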
```sh # Run all tests in the current directory and all sub-directories deno test # Run all tests in the util directory deno test util/ # Run just my_test.ts deno test my_test.ts # Run test modules in parallel deno test --parallel # Pass additional arguments to the test file that are visible in `Deno.args` deno test my_test.ts -- -e --foo --bar # Provide permission for deno to read from the filesystem, which is necessary # for the final test above to pass deno test --allow-read=. my_test.ts ``` ## Test Steps Deno also supports test steps, which allow you to break down tests into smaller, manageable parts. This is useful for setup and teardown operations within a test: ```ts Deno.test("database operations", async (t) => { using db = await openDatabase(); await t.step("insert user", async () => { // Insert user logic }); await t.step("insert book", async () => { // Insert book logic }); }); ``` ## Command line filtering Deno allows you to run specific tests or groups of tests using the `--filter` option on the command line. This option accepts either a string or a pattern to match test names. Filtering does not affect steps; if a test name matches the filter, all of its steps are executed. Consider the following tests: ```ts Deno.test("my-test", () => {}); Deno.test("test-1", () => {}); Deno.test("test-2", () => {}); ``` ### Filtering by string To run all tests that contain the word "my" in their names, use: ```sh deno test --filter "my" tests/ ``` This command will execute `my-test` because it contains the word "my". ### Filtering by Pattern To run tests that match a specific pattern, use: ```sh deno test --filter "/test-*\d/" tests/ ``` This command will run `test-1` and `test-2`: in the regular expression `test-*\d`, the `*` applies to the hyphen, so the pattern matches the literal `test`, zero or more hyphens, and then a digit. To indicate that you are using a pattern (regular expression), wrap your filter value with forward slashes `/`, much like JavaScript’s syntax for regular expressions.
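Since the value between the slashes is an ordinary JavaScript regular expression, you can check what the filter selects directly (a plain-JavaScript sketch, independent of the Deno CLI):

```typescript
// The pattern passed as `--filter "/test-*\d/"` behaves like this regex.
const filter = /test-*\d/;

const names = ["my-test", "test-1", "test-2"];
const selected = names.filter((name) => filter.test(name));

console.log(selected);
```

Here `my-test` is filtered out because no digit follows `test`, while `test-1` and `test-2` are selected.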
### Including and excluding test files in the configuration file You can also filter tests by specifying paths to include or exclude in the [Deno configuration file](/runtime/fundamentals/configuration). For example, if you want to only test `src/fetch_test.ts` and `src/signal_test.ts` and exclude everything in `out/`: ```json { "test": { "include": [ "src/fetch_test.ts", "src/signal_test.ts" ] } } ``` Or more likely: ```json { "test": { "exclude": ["out/"] } } ``` ## Test definition selection Deno provides two options for selecting tests within the test definitions themselves: ignoring tests and focusing on specific tests. ### Ignoring/Skipping Tests You can ignore certain tests based on specific conditions using the `ignore` boolean in the test definition. If `ignore` is set to `true`, the test will be skipped. This is useful, for example, if you only want a test to run on a specific operating system. ```ts Deno.test({ name: "do macOS feature", ignore: Deno.build.os !== "darwin", // This test will be ignored if not running on macOS fn() { // do MacOS feature here }, }); ``` If you want to ignore a test without passing any conditions, you can use the `ignore()` function from the `Deno.test` object: ```ts Deno.test.ignore("my test", () => { // your test code }); ``` ### Only Run Specific Tests If you want to focus on a particular test and ignore the rest, you can use the `only` option. This tells the test runner to run only the tests with `only` set to true. Multiple tests can have this option set. However, if any test is flagged with only, the overall test run will always fail, as this is intended to be a temporary measure for debugging. 
```ts Deno.test.only("my test", () => { // some test code }); ``` or ```ts Deno.test({ name: "Focus on this test only", only: true, // Only this test will run fn() { // test complicated stuff here }, }); ``` ## Failing fast If you have a long-running test suite and wish for it to stop on the first failure, you can specify the `--fail-fast` flag when running the suite. ```shell deno test --fail-fast ``` This will cause the test runner to stop execution after the first test failure. ## Reporters Deno includes three built-in reporters to format test output: - `pretty` (default): Provides a detailed and readable output. - `dot`: Offers a concise output, useful for quickly seeing test results. - `junit`: Produces output in JUnit XML format, which is useful for integrating with CI/CD tools. You can specify which reporter to use with the --reporter flag: ```sh # Use the default pretty reporter deno test # Use the dot reporter for concise output deno test --reporter=dot # Use the JUnit reporter deno test --reporter=junit ``` Additionally, you can write the JUnit report to a file while still getting human-readable output in the terminal by using the `--junit-path` flag: ```sh deno test --junit-path=./report.xml ``` ## Spying, mocking (test doubles), stubbing and faking time The [Deno Standard Library](/runtime/fundamentals/standard_library/) provides a set of functions to help you write tests that involve spying, mocking, and stubbing. Check out the [@std/testing documentation on JSR](https://jsr.io/@std/testing) for more information on each of these utilities or our [tutorial on mocking and spying in tests with deno](/examples/mocking_tutorial/). ## Coverage Deno will collect test coverage into a directory for your code if you specify the `--coverage` flag when starting `deno test`. This coverage information is acquired directly from the V8 JavaScript engine, ensuring high accuracy. 
This can then be further processed from the internal format into well known formats like `lcov` with the [`deno coverage`](/runtime/reference/cli/coverage/) tool. ## Behavior-Driven Development With the [@std/testing/bdd](https://jsr.io/@std/testing/doc/bdd/~) module you can write your tests in a familiar format for grouping tests and adding setup/teardown hooks used by other JavaScript testing frameworks like Jasmine, Jest, and Mocha. The `describe` function creates a block that groups together several related tests. The `it` function registers an individual test case. For example: ```ts import { describe, it } from "jsr:@std/testing/bdd"; import { expect } from "jsr:@std/expect"; import { add } from "./add.js"; describe("add function", () => { it("adds two numbers correctly", () => { const result = add(2, 3); expect(result).toBe(5); }); it("handles negative numbers", () => { const result = add(-2, -3); expect(result).toBe(-5); }); }); ``` Check out the [documentation on JSR](https://jsr.io/@std/testing/doc/bdd/~) for more information on these functions and hooks. - [BDD testing tutorial](/examples/bdd_tutorial/) ## Documentation Tests Deno allows you to evaluate code snippets written in JSDoc or markdown files. This ensures the examples in your documentation are up-to-date and functional. ### Example code blocks ````ts title="example.ts" /** * # Examples * * ```ts * import { assertEquals } from "jsr:@std/assert/equals"; * * const sum = add(1, 2); * assertEquals(sum, 3); * ``` */ export function add(a: number, b: number): number { return a + b; } ```` The triple backticks mark the start and end of code blocks, the language is determined by the language identifier attribute which may be one of the following: - `js` - `javascript` - `mjs` - `cjs` - `jsx` - `ts` - `typescript` - `mts` - `cts` - `tsx` If no language identifier is specified then the language is inferred from media type of the source document that the code block is extracted from. 
```sh deno test --doc example.ts ``` The above command will extract this example, turn it into a pseudo test case that looks like the one below: ```ts title="example.ts$4-10.ts" ignore import { assertEquals } from "jsr:@std/assert/equals"; import { add } from "file:///path/to/example.ts"; Deno.test("example.ts$4-10.ts", async () => { const sum = add(1, 2); assertEquals(sum, 3); }); ``` and then run it as a standalone module living in the same directory as the module being documented. :::tip Want to type-check only? If you want to type-check your code snippets in JSDoc and markdown files without actually running them, you can use the [`deno check`](/runtime/reference/cli/check/) command with the `--doc` option (for JSDoc) or with the `--doc-only` option (for markdown) instead. ::: ### Exported items are automatically imported Looking at the generated test code above, you will notice that it includes the `import` statement to import the `add` function even though the original code block does not have it. When documenting a module, any items exported from the module are automatically included in the generated test code using the same name. Let's say we have the following module: ````ts title="example.ts" /** * # Examples * * ```ts * import { assertEquals } from "jsr:@std/assert/equals"; * * const sum = add(ONE, getTwo()); * assertEquals(sum, 3); * ``` */ export function add(a: number, b: number): number { return a + b; } export const ONE = 1; export default function getTwo() { return 2; } ```` This will get converted to the following test case: ```ts title="example.ts$4-10.ts" ignore import { assertEquals } from "jsr:@std/assert/equals"; import getTwo, { add, ONE } from "file:///path/to/example.ts"; Deno.test("example.ts$4-10.ts", async () => { const sum = add(ONE, getTwo()); assertEquals(sum, 3); }); ``` ### Skipping code blocks You can skip the evaluation of code blocks by adding the `ignore` attribute. ````ts /** * This code block will not be run.
* * ```ts ignore * await sendEmail("deno@example.com"); * ``` */ export async function sendEmail(to: string) { // send an email to the given address... } ```` ## Sanitizers The test runner offers several sanitizers to ensure that the test behaves in a reasonable and expected way. ### Resource sanitizer The resource sanitizer ensures that all I/O resources created during a test are closed, to prevent leaks. I/O resources are things like `Deno.FsFile` handles, network connections, `fetch` bodies, timers, and other resources that are not automatically garbage collected. You should always close resources when you are done with them. For example, to close a file: ```ts const file = await Deno.open("hello.txt"); // Do something with the file file.close(); // <- Always close the file when you are done with it ``` To close a network connection: ```ts const conn = await Deno.connect({ hostname: "example.com", port: 80 }); // Do something with the connection conn.close(); // <- Always close the connection when you are done with it ``` To close a `fetch` body: ```ts const response = await fetch("https://example.com"); // Do something with the response await response.body?.cancel(); // <- Always cancel the body when you are done with it, if you didn't consume it otherwise ``` This sanitizer is enabled by default, but can be disabled in this test with `sanitizeResources: false`: ```ts Deno.test({ name: "leaky resource test", async fn() { await Deno.open("hello.txt"); }, sanitizeResources: false, }); ``` ### Async operation sanitizer The async operation sanitizer ensures that all async operations started in a test are completed before the test ends. This is important because if an async operation is not awaited, the test will end before the operation is completed, and the test will be marked as successful even if the operation may have actually failed. You should always await all async operations in your tests. 
For example: ```ts Deno.test({ name: "async operation test", async fn() { await new Promise((resolve) => setTimeout(resolve, 1000)); }, }); ``` This sanitizer is enabled by default, but can be disabled with `sanitizeOps: false`: ```ts Deno.test({ name: "leaky operation test", fn() { crypto.subtle.digest( "SHA-256", new TextEncoder().encode("a".repeat(100000000)), ); }, sanitizeOps: false, }); ``` ### Exit sanitizer The exit sanitizer ensures that tested code doesn’t call `Deno.exit()`, which could signal a false test success. This sanitizer is enabled by default, but can be disabled with `sanitizeExit: false`. ```ts Deno.test({ name: "false success", fn() { Deno.exit(0); }, sanitizeExit: false, }); // This test never runs, because the process exits during "false success" test Deno.test({ name: "failing test", fn() { throw new Error("this test fails"); }, }); ``` ## Snapshot testing The [Deno Standard Library](/runtime/fundamentals/standard_library/) includes a [snapshot module](https://jsr.io/@std/testing/doc/snapshot/~) that allows developers to write tests by comparing values against reference snapshots. These snapshots are serialized representations of the original values and are stored alongside the test files. Snapshot testing enables catching a wide array of bugs with very little code. It is particularly helpful in situations where it is difficult to precisely express what should be asserted, without requiring a prohibitive amount of code, or where the assertions a test makes are expected to change often. - [Snapshot testing tutorial](/examples/snapshot_tutorial/) ## Tests and Permissions The `permissions` property in the `Deno.test` configuration allows you to specifically deny permissions, but does not grant them. Permissions must be provided when running the test command. 
When building robust applications, you often need to handle cases where permissions are denied (for example, you may want to write tests to check whether fallbacks have been set up correctly). Consider a situation where you are reading from a file: you may want to offer a fallback value in the case that the function does not have read permission: ```ts import { assertEquals } from "jsr:@std/assert"; import getFileText from "./main.ts"; Deno.test({ name: "File reader gets text with permission", // no `permissions` means "inherit" fn: async () => { const result = await getFileText(); console.log(result); assertEquals(result, "the content of the file"); }, }); Deno.test({ name: "File reader falls back to error message without permission", permissions: { read: false }, fn: async () => { const result = await getFileText(); console.log(result); assertEquals(result, "oops don't have permission"); }, }); ``` ```sh # Run the tests with read permission deno test --allow-read ``` The permissions object supports detailed configuration: ```ts Deno.test({ name: "permission configuration example", // permissions: { read: true } // Grant all read permissions and deny all others // OR permissions: { read: ["./data", "./config"], // Grant read to specific paths only write: false, // Explicitly deny write permissions net: ["example.com:443"], // Allow specific host:port combinations env: ["API_KEY"], // Allow access to specific env variables run: false, // Deny subprocess execution ffi: false, // Deny loading dynamic libraries hrtime: false, // Deny high-resolution time }, fn() { // Test code that respects these permission boundaries }, }); ``` Remember that any permission not explicitly granted at the command line will be denied, regardless of what's specified in the test configuration. --- # TypeScript support > Learn how to use TypeScript with Deno. Covers configuration options, type checking, and best practices for writing type-safe Deno applications.
URL: https://docs.deno.com/runtime/fundamentals/typescript TypeScript is a first class language in Deno, just like JavaScript or WebAssembly. You can run or import TypeScript without installing anything more than the Deno CLI. With its built-in TypeScript compiler, Deno will compile your TypeScript code to JavaScript with no extra config needed. Deno can also type check your TypeScript code, without requiring a separate type checking tool like `tsc`. ## Type Checking One of the main advantages of TypeScript is that it can make your code type safe, catching errors during development rather than runtime. TypeScript is a superset of JavaScript meaning that syntactically valid JavaScript becomes TypeScript with warnings about being "unsafe". :::note **Deno type checks TypeScript in `strict mode` by default**, the TypeScript core team [recommends strict mode as a sensible default](https://www.typescriptlang.org/play/?#example/new-compiler-defaults). ::: Deno allows you to type-check your code (without executing it) with the [`deno check`](/runtime/reference/cli/check/) subcommand: ```shell # Check the current directory/module deno check # Check a specific TypeScript file deno check module.ts # Include remote modules and npm packages in the check deno check --all module.ts # Check code snippets in JSDoc comments deno check --doc module.ts # Check code snippets in markdown files deno check --doc-only markdown.md ``` :::note Type checking can take a significant amount of time, especially if you are working on a codebase where you are making a lot of changes. Deno optimizes type checking, but it still comes at a cost. Therefore, **by default, TypeScript modules are not type-checked before they are executed**. ::: When using the `deno run` command, Deno will skip type-checking and run the code directly. 
In order to perform a type check of the module before execution occurs, you can use the `--check` flag with `deno run`: ```shell deno run --check module.ts # or also type check remote modules and npm packages deno run --check=all module.ts ``` When Deno encounters a type error when using this flag, the process will exit before executing the code. In order to avoid this, you will either need to: - resolve the issue - use the `// @ts-ignore` or `// @ts-expect-error` pragmas to ignore the error - or skip type checking altogether. When testing your code, type checking is enabled by default. You can use the `--no-check` flag to skip type checking if preferred: ```shell deno test --no-check ``` ## Using with JavaScript Deno runs JavaScript and TypeScript code. During type checking, Deno will only type check TypeScript files by default though. If you want to type check JavaScript files too, you can either add a `// @ts-check` pragma at the top of the file, or add `compilerOptions.checkJs` to your `deno.json` file. ```ts title="main.js" // @ts-check let x = "hello"; x = 42; // Type 'number' is not assignable to type 'string'. ``` ```json title="deno.json" { "compilerOptions": { "checkJs": true } } ``` In JavaScript files, you can not use TypeScript syntax like type annotations or importing types. You can use [TSDoc](https://tsdoc.org/) comments to provide type information to the TypeScript compiler though. ```ts title="main.js" // @ts-check /** * @param {number} a * @param {number} b * @returns {number} */ function add(a, b) { return a + b; } ```
`tsc` will automatically pick up `.d.ts` files that are siblings of a `.js` file and have the same basename. **Deno does not do this.** You must explicitly specify either in the `.js` file (the source), or the `.ts` file (the importer) where to find the `.d.ts` file. ### Providing types in the source One should prefer specifying the `.d.ts` file in the `.js` file, as this makes it easier to use the JavaScript module from multiple TypeScript modules: you won't have to specify the `.d.ts` file in every TypeScript module that imports the JavaScript module. ```ts title="add.js" // @ts-self-types="./add.d.ts" export function add(a, b) { return a + b; } ``` ```ts title="add.d.ts" export function add(a: number, b: number): number; ``` ### Providing types in the importer If you can't modify the JavaScript source, you can specify the `.d.ts` file in the TypeScript module that imports the JavaScript module. ```ts title="main.ts" // @ts-types="./add.d.ts" import { add } from "./add.js"; ``` This is also useful for NPM packages that don't provide type information: ```ts title="main.ts" // @ts-types="npm:@types/lodash" import * as _ from "npm:lodash"; ``` ### Providing types for HTTP modules Servers that host JavaScript modules via HTTP can also provide type information for those modules in an HTTP header. Deno will use this information when type-checking the module. ```http HTTP/1.1 200 OK Content-Type: application/javascript; charset=UTF-8 Content-Length: 648 X-TypeScript-Types: ./add.d.ts ``` The `X-TypeScript-Types` header specifies the location of the `.d.ts` file that provides type information for the JavaScript module. It is resolved relative to the URL of the JavaScript module, just like `Location` headers. ## Type checking for browsers and web workers By default, Deno type checks TypeScript modules as if they were running in the main thread of the Deno runtime.
However, Deno also supports type checking for browsers, type checking for web workers, and type checking for combination browser-Deno environments like when using SSR (Server Side Rendering) with Deno. These environments have different global objects and APIs available to them. Deno provides type definitions for these environments in the form of library files. These library files are used by the TypeScript compiler to provide type information for the global objects and APIs available in these environments.

The loaded library files can be changed using the `compilerOptions.lib` option in a `deno.json` configuration file, or through `/// <reference lib="..." />` comments in your TypeScript files. It is recommended to use the `compilerOptions.lib` option in the `deno.json` configuration file to specify the library files to use.

To enable type checking for a **browser environment**, you can specify the `dom` library file in the `compilerOptions.lib` option in a `deno.json` configuration file:

```json title="deno.json"
{
  "compilerOptions": {
    "lib": ["dom"]
  }
}
```

This will enable type checking for a browser environment, providing type information for global objects like `document`. This will however disable type information for Deno-specific APIs like `Deno.readFile`.

To enable type checking for combined **browser and Deno environments**, like using SSR with Deno, you can specify both the `dom` and `deno.ns` (Deno namespace) library files in the `compilerOptions.lib` option in a `deno.json` configuration file:

```json title="deno.json"
{
  "compilerOptions": {
    "lib": ["dom", "deno.ns"]
  }
}
```

This will enable type checking for both browser and Deno environments, providing type information for global objects like `document` and Deno-specific APIs like `Deno.readFile`.

To enable type checking for a **web worker environment in Deno** (i.e. code that is run with `new Worker`), you can specify the `deno.worker` library file in the `compilerOptions.lib` option in a `deno.json`.
```json title="deno.json"
{
  "compilerOptions": {
    "lib": ["deno.worker"]
  }
}
```

To specify the library files to use in a TypeScript file, you can use `/// <reference lib="..." />` comments:

```ts
/// <reference no-default-lib="true" />
/// <reference lib="deno.worker" />
```

## Augmenting global types

Deno supports ambient or global types in TypeScript. This is useful when polyfilling global objects or augmenting the global scope with additional properties. **You should avoid using ambient or global types when possible**, since they can lead to naming conflicts and make it harder to reason about your code. They are also not supported when publishing to JSR.

To use ambient or global types in Deno, you can use either the `declare global` syntax, or load a `.d.ts` file that augments the global scope.

### Using declare global to augment the global scope

You can use the `declare global` syntax in any of the TypeScript files that are imported in your project to augment the global scope with additional properties. For example:

```ts
declare global {
  interface Window {
    polyfilledAPI(): string;
  }
}
```

This makes the `polyfilledAPI` function available globally when the type definition is imported.

### Using .d.ts files to augment the global scope

You can also use `.d.ts` files to augment the global scope. For example, you can create a `global.d.ts` file with the following content:

```ts
interface Window {
  polyfilledAPI(): string;
}
```

Then you can load this `.d.ts` file in your TypeScript using a `/// <reference types="./global.d.ts" />` comment. This will augment the global scope with the `polyfilledAPI` function.

Alternatively you can specify the `.d.ts` file in the `deno.json` configuration file, in the `compilerOptions.types` array:

```json
{
  "compilerOptions": {
    "types": ["./global.d.ts"]
  }
}
```

This will also augment the global scope with the `polyfilledAPI` function.

---

# Web development

> A guide to web development with Deno. Learn about supported frameworks like Fresh, Next.js, and Astro, along with built-in features for building modern web applications.
URL: https://docs.deno.com/runtime/fundamentals/web_dev

Deno offers a secure and developer-friendly environment for building web applications, making your web dev a delightful experience.

1. Deno has [secure defaults](/runtime/fundamentals/security/), meaning it requires explicit permission for file, network, and environment access, reducing the risk of security vulnerabilities.
2. Deno has [built-in TypeScript support](/runtime/fundamentals/typescript/), allowing you to write TypeScript code without additional configuration or tooling.
3. Deno comes with a [standard library](/runtime/fundamentals/standard_library/) that includes modules for common tasks like HTTP servers, file system operations, and more.

Most likely, if you're building a more complex application, you'll be interacting with Deno through a web framework.

## React/Next

[React](https://reactjs.org/) is a popular JavaScript library for building user interfaces. To use React with Deno, you can use the popular web framework [Next.js](https://nextjs.org/).

To get started with Next.js in Deno, you can create a new next app and run it immediately with Deno:

```sh
deno run -A npm:create-next-app@latest my-next-app
cd my-next-app
deno task dev
```

This will create a new Next.js app with TypeScript and run it with Deno. You can then open your browser to `http://localhost:3000` to see your new app, and start editing `page.tsx` to see your changes live.

To better understand how JSX and Deno interface under the hood, read on [here](/runtime/reference/jsx/).

## Fresh

[Fresh](https://fresh.deno.dev/) is the most popular web framework for Deno. It uses a model where you send no JavaScript to clients by default.

To get started with a Fresh app, you can use the following command and follow the CLI prompts to create your app:

```sh
deno run -A -r https://fresh.deno.dev
cd my-fresh-app
deno task start
```

This will create a new Fresh app and run it with Deno.
You can then open your browser to `http://localhost:8000` to see your new app. Edit `/routes/index.tsx` to see your changes live.

Fresh does the majority of its rendering on the server, and the client is only responsible for re-rendering small [islands of interactivity](https://jasonformat.com/islands-architecture/). This means the developer explicitly opts in to client side rendering for specific components.

## Astro

[Astro](https://astro.build/) is a static site generator that allows developers to create fast and lightweight websites. To get started with Astro, you can use the following command to create a new Astro site:

```sh
deno run -A npm:create-astro my-astro-site
cd my-astro-site
deno task dev
```

This will create a new Astro site and run it with Deno. You can then open your browser to `http://localhost:4321` to see your new site. Edit `/src/pages/index.astro` to see your changes live.

## Vite

[Vite](https://vitejs.dev/) is a web dev build tool that serves your code via native ES modules, which can be run directly in the browser. Vite is a great choice for building modern web applications with Deno. To get started with Vite, you can use the following command to create a new Vite app:

```sh
deno run -A npm:create-vite@latest
cd my-vite-app
deno install
deno task dev
```

## Lume

[Lume](https://lume.land/) is a static site generator for Deno that is inspired by other static site generators such as Jekyll or Eleventy. To get started with Lume, you can use the following command to create a new Lume site:

```sh
mkdir my-lume-site
cd my-lume-site
deno run -A https://lume.land/init.ts
deno task serve
```

## Docusaurus

[Docusaurus](https://docusaurus.io/) is a static site generator that is optimized for technical documentation websites.
To get started with Docusaurus, you can use the following command to create a new Docusaurus site:

```sh
deno run -A npm:create-docusaurus@latest my-website classic
cd my-website
deno task start
```

## Hono

[Hono](https://hono.dev) is a light-weight web app framework in the tradition of Express and Sinatra. To get started with Hono, you can use the following command to create a new Hono app:

```sh
deno run -A npm:create-hono@latest
cd my-hono-app
deno task start
```

This will create a new Hono app and run it with Deno. You can then open your browser to `http://localhost:8000` to see your new app.

## Oak

[Oak](https://jsr.io/@oak/oak) is a middleware framework for handling HTTP with Deno. Oak is the glue between your frontend application and a potential database or other data sources (e.g. REST APIs, GraphQL APIs). Oak offers additional functionality over the native Deno HTTP server, including a basic router, JSON parser, middlewares, plugins, etc.

To get started with Oak, make a file called `server.ts` and add the following:

```ts
import { Application } from "jsr:@oak/oak/application";
import { Router } from "jsr:@oak/oak/router";

const router = new Router();
router.get("/", (ctx) => {
  ctx.response.body = `<!DOCTYPE html>
    <html>
      <head><title>Hello oak!</title></head>
      <body>
        <h1>Hello oak!</h1>
      </body>
    </html>
  `;
});

const app = new Application();
const port = 8080;
app.use(router.routes());
app.use(router.allowedMethods());

console.log(`Server running on http://localhost:${port}`);

app.listen({ port: port });
```

Run the server with the following command:

```sh
deno run --allow-net server.ts
```

## Node projects

Deno will run your Node.js projects out of the box. Check out our guide on [migrating your Node.js project to Deno](/runtime/fundamentals/node/#migrating-from-node.js-to-deno).

---

# Workspaces and monorepos

> A guide to managing workspaces and monorepos in Deno. Learn about workspace configuration, package management, dependency resolution, and how to structure multi-package projects effectively.

URL: https://docs.deno.com/runtime/fundamentals/workspaces

Deno supports workspaces, also known as "monorepos", which allow you to manage multiple related and interdependent packages simultaneously.

A "workspace" is a collection of folders containing `deno.json` or `package.json` configuration files. The root `deno.json` file defines the workspace:

```json title="deno.json"
{
  "workspace": ["./add", "./subtract"]
}
```

This configures a workspace with `add` and `subtract` members, which are directories expected to have `deno.json(c)` and/or `package.json` files.

:::info Naming

Deno uses `workspace` rather than npm's `workspaces` to represent a singular workspace with multiple members.

:::

## Example

Let's expand on the `deno.json` workspace example and see its functionality. The file hierarchy looks like this:

```sh
/
├── deno.json
├── main.ts
├── add/
│   ├── deno.json
│   └── mod.ts
└── subtract/
    ├── deno.json
    └── mod.ts
```

There are two workspace members (add and subtract), each with `mod.ts` files. There is also a root `deno.json` and a `main.ts`.
The top-level `deno.json` configuration file defines the workspace and a top-level import map applied to all members:

```json title="deno.json"
{
  "workspace": ["./add", "./subtract"],
  "imports": {
    "chalk": "npm:chalk@5"
  }
}
```

The root `main.ts` file uses the `chalk` bare specifier from the import map and imports the `add` and `subtract` functions from the workspace members. Note that it imports them using `@scope/add` and `@scope/subtract`, even though these are not proper URLs and aren't in the import map. How are they resolved?

```ts title="main.ts"
import chalk from "chalk";
import { add } from "@scope/add";
import { subtract } from "@scope/subtract";

console.log("1 + 2 =", chalk.green(add(1, 2)));
console.log("2 - 4 =", chalk.red(subtract(2, 4)));
```

In the `add/` subdirectory, we define a `deno.json` with a `"name"` field, which is important for referencing the workspace member. The `deno.json` file also contains example configurations, like turning off semicolons when using `deno fmt`.

```json title="add/deno.json"
{
  "name": "@scope/add",
  "version": "0.1.0",
  "exports": "./mod.ts",
  "fmt": {
    "semiColons": false
  }
}
```

```ts title="add/mod.ts"
export function add(a: number, b: number): number {
  return a + b;
}
```

The `subtract/` subdirectory is similar but does not have the same `deno fmt` configuration.

```json title="subtract/deno.json"
{
  "name": "@scope/subtract",
  "version": "0.3.0",
  "exports": "./mod.ts"
}
```

```ts title="subtract/mod.ts"
import { add } from "@scope/add";

export function subtract(a: number, b: number): number {
  return add(a, b * -1);
}
```

Let's run it:

```sh
> deno run main.ts
1 + 2 = 3
2 - 4 = -2
```

There's a lot to unpack here, showcasing some of the Deno workspace features:

1. This monorepo consists of two packages, placed in `./add` and `./subtract` directories.
1. By using `name` and `version` options in members' `deno.json` files, it's possible to refer to them using "bare specifiers" across the whole workspace.
In this case, the packages are named `@scope/add` and `@scope/subtract`, where `scope` is the "scope" name you can choose. With these two options, it's not necessary to use long and relative file paths in import statements.

1. The `npm:chalk@5` package is a shared dependency in the entire workspace. Workspace members "inherit" `imports` of the workspace root, allowing you to easily manage a single version of a dependency across the codebase.
1. The `add` subdirectory specifies in its `deno.json` that `deno fmt` should not apply semicolons when formatting the code. This makes for a much smoother transition for existing projects, without a need to change tens or hundreds of files in one go.

---

Deno workspaces are flexible and can work with Node packages. To make migration for existing Node.js projects easier you can have both Deno-first and Node-first packages in a single workspace.

## Workspace path patterns

Deno supports pattern matching for workspace member folders, making it easier to manage workspaces with many members or with a specific directory structure. You can use wildcard patterns to include multiple directories at once:

```json title="deno.json"
{
  "workspace": [
    "some-dir/*",
    "other-dir/*/*"
  ]
}
```

The pattern matching syntax follows specific rules regarding folder depth:

`some-path/*` matches files and directories directly within `some-path` (first level of indentation only). For example, with `packages/*`, this includes `packages/foo` and `packages/bar` but not `packages/foo/subpackage`.

`some-path/*/*` matches the files and directories located within subdirectories of `some-path` (second level of indentation). It does not match items directly within `some-path`. For example, with `examples/*/*`, this includes `examples/basic/demo` and `examples/advanced/sample` but not `examples/basic`.

Each `/*` segment in the pattern corresponds to a specific folder depth relative to the base path.
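As a concrete sketch of the rules above (the directory names here are hypothetical), a workspace combining both pattern depths might look like this:

```jsonc title="deno.json"
{
  "workspace": [
    // first level: matches packages/foo and packages/bar,
    // but not packages/foo/subpackage
    "packages/*",
    // second level: matches examples/basic/demo and examples/advanced/sample,
    // but not examples/basic
    "examples/*/*"
  ]
}
```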
This allows for precise targeting of workspace members at different levels within your directory structure.

## How Deno resolves workspace dependencies

When running a project in a workspace that imports from another workspace member, Deno follows these steps to resolve the dependencies:

1. Deno starts in the directory of the executing project (e.g., project A)
2. It looks up in the parent directory for a root `deno.json` file
3. If found, it checks for the `workspace` property in that file
4. For each import statement in project A, Deno checks if the import matches a package name defined in any workspace member's `deno.json`
5. If a matching package name is found, Deno verifies that the containing directory is listed in the root workspace configuration
6. The import is then resolved to the correct file using the `exports` field in the workspace member's `deno.json`

For example, given this structure:

```sh
/
├── deno.json      # workspace: ["./project-a", "./project-b"]
├── project-a/
│   ├── deno.json  # name: "@scope/project-a"
│   └── mod.ts     # imports from "@scope/project-b"
└── project-b/
    ├── deno.json  # name: "@scope/project-b"
    └── mod.ts
```

When `project-a/mod.ts` imports from `"@scope/project-b"`, Deno:

1. Sees the import statement
2. Checks parent directory's `deno.json`
3. Finds `project-b` in the workspace array
4. Verifies `project-b/deno.json` exists and has matching package name
5. Resolves the import using `project-b`'s exports

### Important note for containerization

When containerizing a workspace member that depends on other workspace members, you must include:

1. The root `deno.json` file
2. All dependent workspace packages
3.
The same directory structure as your development environment

For example, if dockerizing `project-a` above, your Dockerfile should:

```dockerfile
COPY deno.json /app/deno.json
COPY project-a/ /app/project-a/
COPY project-b/ /app/project-b/
```

This preserves the workspace resolution mechanism that Deno uses to find and import workspace dependencies.

### Multiple package entries

The `exports` property details the entry points and exposes which modules should be importable by users of your package. So far, our package only has a single entry. This is fine for simple packages, but often you'll want to have multiple entries that group relevant aspects of your package. This can be done by passing an `object` instead of a `string` to `exports`:

```json title="my-package/deno.json"
{
  "name": "@scope/my-package",
  "version": "0.3.0",
  "exports": {
    ".": "./mod.ts",
    "./foo": "./foo.ts",
    "./other": "./dir/other.ts"
  }
}
```

The `"."` entry is the default entry that's picked when importing `@scope/my-package`. Therefore, the above `deno.json` example provides the following entries:

- `@scope/my-package`
- `@scope/my-package/foo`
- `@scope/my-package/other`

### Publishing workspace packages to registries

Workspaces make it easy to publish packages to registries like JSR or NPM. You can publish individual workspace members while keeping their development connected in your monorepo.

#### Publishing to JSR

To publish a workspace package to JSR, follow these steps:

1. Ensure each package has the appropriate metadata in its `deno.json`:

```json title="my-package/deno.json"
{
  "name": "@scope/my-package",
  "version": "1.0.0",
  "exports": "./mod.ts",
  "publish": {
    "exclude": ["tests/", "*.test.ts", "examples/"]
  }
}
```

2. Navigate to the specific package directory and publish:

```sh
cd my-package
deno publish
```

#### Managing interdependent packages

When publishing packages from a workspace with interdependencies, use consistent versioning schemes across related packages.
Publish dependent packages first, then the packages that depend on them. After publishing, verify the published packages work as expected:

```sh
# Test a published package
deno add jsr:@scope/my-published-package
deno test integration-test.ts
```

When publishing packages that depend on other workspace members, Deno will automatically replace workspace references with proper registry references in the published code.

### Migrating from `npm` workspaces

Deno workspaces support using a Deno-first package from an existing npm package. In this example, we mix and match a Deno library called `@deno/hi`, with a Node.js library called `@deno/log` that we developed a couple years back.

We'll need to include a `deno.json` configuration file in the root:

```json title="deno.json"
{
  "workspace": {
    "members": ["hi"]
  }
}
```

Alongside our existing package.json workspace:

```json title="package.json"
{
  "workspaces": ["log"]
}
```

The workspace currently has a log npm package:

```json title="log/package.json"
{
  "name": "@deno/log",
  "version": "0.5.0",
  "type": "module",
  "main": "index.js"
}
```

```js title="log/index.js"
export function log(output) {
  console.log(output);
}
```

Let's create an `@deno/hi` Deno-first package that imports `@deno/log`:

```json title="hi/deno.json"
{
  "name": "@deno/hi",
  "version": "0.2.0",
  "exports": "./mod.ts",
  "imports": {
    "log": "npm:@deno/log@^0.5"
  }
}
```

```ts title="hi/mod.ts"
import { log } from "log";

export function sayHiTo(name: string) {
  log(`Hi, ${name}!`);
}
```

Now, we can write a `main.ts` file that imports and calls `hi`:

```ts title="main.ts"
import { sayHiTo } from "@deno/hi";

sayHiTo("friend");
```

```sh
$ deno run main.ts
Hi, friend!
```

You can even have both `deno.json` and `package.json` in your existing Node.js package. Additionally, you could remove the package.json in the root and specify the npm package in the deno.json workspace members. That allows you to gradually migrate to Deno, without putting a lot of upfront work.
For example, you can add `log/deno.json` to configure Deno's linter and formatter:

```jsonc
{
  "fmt": {
    "semiColons": false
  },
  "lint": {
    "rules": {
      "exclude": ["no-unused-vars"]
    }
  }
}
```

Running `deno fmt` in the workspace will format the `log` package to not have any semicolons, and `deno lint` won't complain if you leave an unused var in one of the source files.

## Configuring built-in Deno tools

Some configuration options only make sense at the root of the workspace, e.g. the `nodeModulesDir` option is not available in the members, and Deno will warn when an option needs to be applied at the workspace root.

Here's a full matrix of various `deno.json` options available at the workspace root and its members:

| Option             | Workspace | Package | Notes |
| ------------------ | --------- | ------- | ----- |
| compilerOptions    | ✅        | ✅      |       |
| importMap          | ✅        | ❌      | Exclusive with imports and scopes per config file. Additionally, it is not supported to have importMap in the workspace config, and imports in the package config. |
| imports            | ✅        | ✅      | Exclusive with importMap per config file. |
| scopes             | ✅        | ❌      | Exclusive with importMap per config file. |
| exclude            | ✅        | ✅      |       |
| lint.include       | ✅        | ✅      |       |
| lint.exclude       | ✅        | ✅      |       |
| lint.files         | ⚠️        | ❌      | Deprecated |
| lint.rules.tags    | ✅        | ✅      | Tags are merged by appending package to workspace list. Duplicates are ignored. |
| lint.rules.include | ✅        | ✅      | Rules are merged per package, with package taking priority over workspace (package include is stronger than workspace exclude). |
| lint.rules.exclude | ✅        | ✅      | Rules are merged per package, with package taking priority over workspace (package include is stronger than workspace exclude). |
| lint.report        | ✅        | ❌      | Only one reporter can be active at a time, so allowing different reporters per workspace would not work in the case where you lint files spanning multiple packages. |
| fmt.include        | ✅        | ✅      |       |
| fmt.exclude        | ✅        | ✅      |       |
| fmt.files          | ⚠️        | ❌      | Deprecated |
| fmt.useTabs        | ✅        | ✅      | Package takes priority over workspace. |
| fmt.indentWidth    | ✅        | ✅      | Package takes priority over workspace. |
| fmt.singleQuote    | ✅        | ✅      | Package takes priority over workspace. |
| fmt.proseWrap      | ✅        | ✅      | Package takes priority over workspace. |
| fmt.semiColons     | ✅        | ✅      | Package takes priority over workspace. |
| fmt.options.\*     | ⚠️        | ❌      | Deprecated |
| nodeModulesDir     | ✅        | ❌      | Resolution behaviour must be the same in the entire workspace. |
| vendor             | ✅        | ❌      | Resolution behaviour must be the same in the entire workspace. |
| tasks              | ✅        | ✅      | Package tasks take priority over workspace. cwd used is the cwd of the config file that the task was inside of. |
| test.include       | ✅        | ✅      |       |
| test.exclude       | ✅        | ✅      |       |
| test.files         | ⚠️        | ❌      | Deprecated |
| publish.include    | ✅        | ✅      |       |
| publish.exclude    | ✅        | ✅      |       |
| bench.include      | ✅        | ✅      |       |
| bench.exclude      | ✅        | ✅      |       |
| bench.files        | ⚠️        | ❌      | Deprecated |
| lock               | ✅        | ❌      | Only a single lock file may exist per resolver, and only one resolver may exist per workspace, so conditional enablement of the lockfile per package does not make sense. |
| unstable           | ✅        | ❌      | For simplicity's sake, we do not allow unstable flags, because a lot of the CLI assumes that unstable flags are immutable and global to the entire process. Also weird interaction with DENO_UNSTABLE_\* flags. |
| name               | ❌        | ✅      |       |
| version            | ❌        | ✅      |       |
| exports            | ❌        | ✅      |       |
| workspace          | ✅        | ❌      | Nested workspaces are not supported. |

## Running commands across workspaces

Deno provides several ways to run commands across all or specific workspace members:

### Running tests

To run tests across all workspace members, simply execute `deno test` from the workspace root:

```sh
deno test
```

This will run tests in all workspace members according to their individual test configurations. To run tests for a specific workspace member, you can either:

1.
Change to that member's directory and run the test command:

```sh
cd my-directory
deno test
```

2. Or specify the path from the workspace root:

```sh
deno test my-directory/
```

### Formatting and linting

Similar to testing, formatting and linting commands run across all workspace members by default:

```sh
deno fmt
deno lint
```

Each workspace member follows its own formatting and linting rules as defined in its `deno.json` file, with some settings inherited from the root configuration as shown in the table above.

### Using workspace tasks

You can define tasks at both the workspace root and in individual workspace members:

```json title="deno.json"
{
  "workspace": ["./add", "./subtract"],
  "tasks": {
    "build": "echo 'Building all packages'",
    "test:all": "deno test"
  }
}
```

```json title="add/deno.json"
{
  "name": "@scope/add",
  "version": "0.1.0",
  "exports": "./mod.ts",
  "tasks": {
    "build": "echo 'Building add package'",
    "test": "deno test"
  }
}
```

To run a task defined in a specific package:

```sh
deno task --cwd=add build
```

## Sharing and managing dependencies

Workspaces provide powerful ways to share and manage dependencies across projects:

### Sharing development dependencies

Common development dependencies like testing libraries can be defined at the workspace root:

```json title="deno.json"
{
  "workspace": ["./add", "./subtract"],
  "imports": {
    "@std/testing/": "jsr:@std/testing@^0.218.0/",
    "chai": "npm:chai@^4.3.7"
  }
}
```

This makes these dependencies available to all workspace members without needing to redefine them.

### Managing version conflicts

When resolving dependencies, workspace members can override dependencies defined in the root. If both the root and a member specify different versions of the same dependency, the member's version will be used when resolving within that member's folder. This allows individual packages to use specific dependency versions when needed. However, member-specific dependencies are scoped only to that member's folder.
Outside of member folders, or when working with files at the workspace root level, the workspace root's import map will be used for resolving dependencies (including JSR and HTTPS dependencies).

### Interdependent workspace members

As shown in the earlier example with the `add` and `subtract` modules, workspace members can depend on each other. This enables a clean separation of concerns while maintaining the ability to develop and test interdependent modules together. The `subtract` module imports functionality from the `add` module, demonstrating how workspace members can build upon each other:

```ts title="subtract/mod.ts"
import { add } from "@scope/add";

export function subtract(a: number, b: number): number {
  return add(a, b * -1);
}
```

This approach allows you to:

1. Break down complex projects into manageable, single-purpose packages
2. Share code between packages without publishing to a registry
3. Test and develop interdependent modules together
4. Gradually migrate monolithic codebases to modular architecture

## Using workspace protocol in package.json

Deno supports workspace protocol specifiers in `package.json` files. These are useful when you have npm packages that depend on other packages within the workspace:

```json title="package.json"
{
  "name": "my-npm-package",
  "dependencies": {
    "another-workspace-package": "workspace:*"
  }
}
```

The following workspace protocol specifiers are supported:

- `workspace:*` - Use the latest version available in the workspace
- `workspace:~` - Use the workspace version with only patch-level changes
- `workspace:^` - Use the workspace version with semver-compatible changes

## npm and pnpm workspace compatibility

Deno works seamlessly with standard npm workspaces defined in `package.json`:

```json title="package.json"
{
  "workspaces": ["packages/*"]
}
```

For pnpm users, Deno supports typical pnpm workspace configurations.
However, if you're using a `pnpm-workspace.yaml` file, you'll need to migrate to a `deno.json` workspace configuration:

```yaml title="pnpm-workspace.yaml (to be replaced)"
packages:
  - "packages/*"
```

Should be converted to:

```json title="deno.json"
{
  "workspace": ["packages/*"]
}
```

This allows for smooth integration between Deno and npm/pnpm ecosystems during migration or in hybrid projects.

For more information on configuring your project, check out the [Configuration with deno.json](/examples/configuration_with_deno_json/) tutorial.

---

# Command line interface

> A comprehensive guide to using Deno's command-line interface (CLI). Learn about running scripts, managing permissions, using watch mode, and configuring Deno's runtime behavior through command-line flags and options.

URL: https://docs.deno.com/runtime/getting_started/command_line_interface

Deno is a command line program. The Deno command line interface (CLI) can be used to run scripts, manage dependencies, and even compile your code into standalone executables. You may be familiar with some simple commands having followed the examples thus far. This page will provide a more detailed overview of the Deno CLI.

The Deno CLI has a number of subcommands (like `run`, `init` and `test`, etc.). They are used to perform different tasks within the Deno runtime environment. Each subcommand has its own set of flags and options (e.g. `--version`) that can be used to customize its behavior.

You can view all of the available commands and flags by running the `deno help` subcommand in your terminal, or using the `-h` or `--help` flags.

Check out the [CLI reference guide](/runtime/reference/cli/) for further documentation on all the subcommands and flags available. We'll take a look at a few commands in a bit more detail below to see how they can be used and configured.
## An example subcommand - `deno run`

You can run a local TypeScript or JavaScript file by specifying its path relative to the current working directory:

```shell
deno run main.ts
```

Deno supports running scripts directly from URLs. This is particularly useful for quickly testing or running code without downloading it first:

```shell
deno run https://docs.deno.com/examples/scripts/hello_world.ts
```

You can also run a script by piping it through standard input. This is useful for integrating with other command-line tools or dynamically generating scripts:

```shell
cat main.ts | deno run -
```

## Passing script arguments

Script arguments are additional parameters you can pass to your script when running it from the command line. These arguments can be used to customize the behavior of your program based on the input provided at runtime. Arguments should be passed **after** the script name.

To test this out we can make a script that will log the arguments passed to it:

```ts title="main.ts"
console.log(Deno.args);
```

When we run that script and pass it some arguments it will log them to the console:

```shell
$ deno run main.ts arg1 arg2 arg3
[ "arg1", "arg2", "arg3" ]
```

## Argument and flag ordering

_Note that anything passed after the script name will be passed as a script argument and not consumed as a Deno runtime flag._ This leads to the following pitfall:

```shell
# Good. We grant net permission to net_client.ts.
deno run --allow-net net_client.ts

# Bad! --allow-net was passed to Deno.args, throws a net permission error.
deno run net_client.ts --allow-net
```

## Common flags

Some flags can be used with multiple related subcommands. We discuss these below.

### Watch mode

You can supply the `--watch` flag to `deno run`, `deno test`, and `deno fmt` to enable the built-in file watcher. The watcher enables automatic reloading of your application whenever changes are detected in the source files.
This is particularly useful during development, as it allows you to see the effects of your changes immediately without manually restarting the application.

The files that are watched will depend on the subcommand used:

- for `deno run` and `deno test` the entrypoint, and all local files that the entrypoint statically imports will be watched.
- for `deno fmt` all local files and directories specified as command line arguments (or the working directory if no specific files/directories are passed) are watched.

```shell
deno run --watch main.ts
deno test --watch
deno fmt --watch
```

You can exclude paths or patterns from watching by providing the `--watch-exclude` flag. The syntax is `--watch-exclude=path1,path2`. For example:

```shell
deno run --watch --watch-exclude=file1.ts,file2.ts main.ts
```

This will exclude file1.ts and file2.ts from being watched.

To exclude a pattern, remember to surround it in quotes to prevent your shell from expanding the glob:

```shell
deno run --watch --watch-exclude='*.js' main.ts
```

### Hot Module Replacement mode

You can use the `--watch-hmr` flag with `deno run` to enable the hot module replacement mode. Instead of restarting the program, the runtime will try to update the program in-place. If updating in-place fails, the program will still be restarted.

```sh
deno run --watch-hmr main.ts
```

When a hot module replacement is triggered, the runtime will dispatch a `CustomEvent` of type `hmr` that will include a `path` property in its `detail` object. You can listen for this event and perform any additional logic that you need to do when a module is updated (e.g. notify a browser over a WebSocket connection).

```ts
addEventListener("hmr", (e) => {
  console.log("HMR triggered", e.detail.path);
});
```

### Integrity flags (lock files)

Affect commands which can download resources to the cache: `deno install`, `deno run`, `deno test`, `deno doc`, and `deno compile`.
```sh --lock Check the specified lock file --frozen[=] Error out if lockfile is out of date ``` Find out more about these [here](/runtime/fundamentals/modules/#integrity-checking-and-lock-files). ### Cache and compilation flags Affect commands which can populate the cache: `deno install`, `deno run`, `deno test`, `deno doc`, and `deno compile`. As well as the flags above, this includes those which affect module resolution, compilation configuration, etc. ```sh --config Load configuration file --import-map Load import map file --no-remote Do not resolve remote modules --reload= Reload source code cache (recompile TypeScript) --unstable Enable unstable APIs ``` ### Runtime flags Affect commands which execute user code: `deno run` and `deno test`. These include all of the above as well as the following. ### Type checking flags You can type-check your code (without executing it) using the command: ```shell > deno check main.ts ``` You can also type-check your code before execution by using the `--check` argument to `deno run`: ```shell > deno run --check main.ts ``` This flag affects `deno run` and `deno eval`. The following table describes the type-checking behavior of various subcommands. Here "Local" means that only errors from local code will induce type errors; modules imported from https URLs (remote) may have type errors that are not reported. (To turn on type-checking for all modules, use `--check=all`.) | Subcommand | Type checking mode | | -------------- | ------------------ | | `deno bench` | 📁 Local | | `deno check` | 📁 Local | | `deno compile` | 📁 Local | | `deno eval` | ❌ None | | `deno repl` | ❌ None | | `deno run` | ❌ None | | `deno test` | 📁 Local | ### Permission flags These are listed [here](/runtime/fundamentals/security/). ### Other runtime flags More flags which affect the execution environment. ```sh --cached-only Require that remote dependencies are already cached --inspect= activate inspector on host:port ...
--inspect-brk= activate inspector on host:port and break at ... --inspect-wait= activate inspector on host:port and wait for ... --location Value of 'globalThis.location' used by some web APIs --prompt Fallback to prompt if required permission wasn't passed --seed Seed Math.random() --v8-flags= Set V8 command line options. For help: ... ``` --- # runtime/getting_started/first_project.md > Step-by-step guide to creating your first Deno project. Learn how to initialize a project, understand the basic file structure, run TypeScript code, and execute tests using Deno's built-in test runner. URL: https://docs.deno.com/runtime/getting_started/first_project Deno has many [built-in tools](/runtime/reference/cli/) to make your development experience as smooth as possible. One of these tools is the [project initializer](/runtime/reference/cli/init), which creates a new Deno project with a basic file structure and configuration. While you are welcome to use JavaScript, Deno has built-in support for [TypeScript](https://www.typescriptlang.org/) as well, so we'll be using TypeScript in this guide. If you'd prefer to use JavaScript, you can rename the files to `.js` and remove the type annotations. ## Initialize a new project To initialize a new Deno project, run the following command in your terminal: ```bash deno init my_project ``` This will create a new directory called `my_project` with the following structure: ```plaintext my_project ├── deno.json ├── main_test.ts └── main.ts ``` A `deno.json` file is created to [configure your project](/runtime/fundamentals/configuration/), and two TypeScript files are created: `main.ts` and `main_test.ts`. The `main.ts` file is where you'll write your application code; on initial creation it will contain a simple program which adds two numbers together. The `main_test.ts` file is where you can write tests; initially it will contain a test for your addition program.
## Run your project You can run this program with the following command: ```bash $ deno main.ts Add 2 + 3 = 5 ``` ## Run your tests Deno has a [built-in test runner](/runtime/fundamentals/testing/). You can write tests for your code and run them with the `deno test` command. Run the tests in your new project with: ```bash $ deno test running 1 test from ./main_test.ts addTest ... ok (1ms) ok | 1 passed | 0 failed (3ms) ``` Now that you have a basic project set up, you can start building your application. Check out our [examples and tutorials](/examples/) for more ideas on what to build with Deno. You can [learn more about using TypeScript in Deno here](/runtime/fundamentals/typescript). --- # Download and install the latest version of Deno > A guide to installing Deno on different operating systems. Includes instructions for Windows, macOS, and Linux using various package managers, manual installation methods, and Docker containers. URL: https://docs.deno.com/runtime/getting_started/installation Deno works on macOS, Linux, and Windows. Deno is a single binary executable with no external dependencies. On macOS, both M1 (arm64) and Intel (x64) executables are provided. On Linux and Windows, only x64 is supported. ## Download and install [deno_install](https://github.com/denoland/deno_install) provides convenience scripts to download and install the binary. Using Shell: ```shell curl -fsSL https://deno.land/install.sh | sh ``` Using [npm](https://npmjs.com/package/deno): ```shell npm install -g deno ``` > The startup time of the Deno command gets affected if it's installed > via npm. We recommend the shell install script for better performance.
Using [Homebrew](https://formulae.brew.sh/formula/deno): ```shell brew install deno ``` Using [MacPorts](https://ports.macports.org/port/deno/): ```shell sudo port install deno ``` Using [Nix](https://nixos.org/download.html): ```shell nix-shell -p deno ``` Using [asdf](https://asdf-vm.com/): ```shell asdf plugin add deno https://github.com/asdf-community/asdf-deno.git # Download and install the latest version of Deno asdf install deno latest # To set as the default version of Deno globally asdf set -u deno latest # To set as the default version of Deno locally (current project only) asdf set deno latest ``` Using [vfox](https://vfox.lhan.me/): ```shell vfox add deno # Download and install the latest version of Deno vfox install deno@latest # To set the version of Deno globally vfox use --global deno ``` Using PowerShell (Windows): ```powershell irm https://deno.land/install.ps1 | iex ``` Using [Scoop](https://scoop.sh/): ```shell scoop install deno ``` Using [Chocolatey](https://chocolatey.org/packages/deno): ```shell choco install deno ``` Using [Winget](https://github.com/microsoft/winget-cli): ```shell winget install DenoLand.Deno ```
You can also build and install from source using [Cargo](https://crates.io/crates/deno): ```shell cargo install deno --locked ``` Deno binaries can also be installed manually, by downloading a zip file at [github.com/denoland/deno/releases](https://github.com/denoland/deno/releases). These packages contain just a single executable file. You will have to set the executable bit on macOS and Linux. ## Docker For more information and instructions on the official Docker images: [https://github.com/denoland/deno_docker](https://github.com/denoland/deno_docker) ## Testing your installation To test your installation, run `deno --version`. If this prints the Deno version to the console, the installation was successful. Use `deno help` to see help text documenting Deno's flags and usage. Get a detailed guide on the CLI [here](/runtime/getting_started/command_line_interface/). ## Updating To update a previously installed version of Deno, you can run: ```shell deno upgrade ``` Or using [Winget](https://github.com/microsoft/winget-cli) (Windows): ```shell winget upgrade DenoLand.Deno ``` This will fetch the latest release from [github.com/denoland/deno/releases](https://github.com/denoland/deno/releases), unzip it, and replace your current executable with it.
You can also use this utility to install a specific version of Deno: ```shell deno upgrade --version 1.0.1 ``` ## Building from source Information about how to build from source can be found in the [`Building from source`](https://github.com/denoland/deno/blob/main/.github/CONTRIBUTING.md#building-from-source) guide. --- # Set up your environment > A guide to setting up your development environment for Deno. Learn how to configure popular editors like VS Code, set up language server support, and enable shell completions for better productivity. URL: https://docs.deno.com/runtime/getting_started/setup_your_environment Deno comes with many of the tools that are commonly needed for developing applications, including a full [language server (LSP)](/runtime/reference/cli/lsp/) to help power your IDE of choice. This page will help you set up your environment to get the most out of Deno while you are developing. We'll cover: - How to use Deno with your favorite editor/IDE - How to generate shell completions ## Setting up your editor/IDE ### Visual Studio Code If you haven’t already, download and install Visual Studio Code from the [official website](https://code.visualstudio.com/). In the Extensions tab, search for "Deno" and install the [extension by Denoland](https://marketplace.visualstudio.com/items?itemName=denoland.vscode-deno). Next, open the Command Palette by pressing `Ctrl+Shift+P` and type `Deno: Initialize Workspace Configuration`. Select this option to configure Deno for your workspace. ![The VSCode command palette with the Deno: Initialize Workspace Configuration option selected.](./images/vscode-setup.png) A file called `.vscode/settings.json` will be created in your workspace with the following configuration: ```json { "deno.enable": true } ``` That’s it! You’ve successfully set up your developer environment for Deno using VSCode. You will now get all the benefits of Deno’s LSP, including IntelliSense, code formatting, linting, and more. 
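Beyond `deno.enable`, the extension exposes further workspace settings you can add to the same file; for example, turning on the extension's linting support (a minimal sketch assuming the `deno.lint` setting name exposed by the vscode-deno extension — check the extension's documentation for the full list):

```jsonc
{
  "deno.enable": true,
  // Report diagnostics from `deno lint` while you edit
  "deno.lint": true
}
```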
### JetBrains IDEs To install the Deno Plugin, open your IDE and go to **File** > **Settings**. Navigate to **Plugins** and search for `Deno`. Install the official Deno plugin. ![The WebStorm plugins settings](./images/webstorm_setup.png) To configure the Plugin, go to **File** > **Settings** again. Navigate to **Languages & Frameworks** > **Deno**. Check **Enable Deno for your project** and specify the path to the Deno executable (if it has not been auto-detected). Check out [this blog post](https://blog.jetbrains.com/webstorm/2020/06/deno-support-in-jetbrains-ides/) to learn more about how to get started with Deno in JetBrains IDEs. ### Vim/Neovim via plugins Deno is well-supported on both [Vim](https://www.vim.org/) and [Neovim](https://neovim.io/) via [coc.nvim](https://github.com/neoclide/coc.nvim), [vim-easycomplete](https://github.com/jayli/vim-easycomplete), [ALE](https://github.com/dense-analysis/ale) and [vim-lsp](https://github.com/prabirshrestha/vim-lsp). coc.nvim offers plugins to integrate with the Deno language server, while ALE supports it _out of the box_. ### Neovim 0.6+ using the built-in language server To use the Deno language server, install [nvim-lspconfig](https://github.com/neovim/nvim-lspconfig/) and follow the instructions to enable the [supplied Deno configuration](https://github.com/neovim/nvim-lspconfig/blob/master/doc/configs.md#denols). Note that if you also have `ts_ls` as an LSP client, you may run into issues where both `ts_ls` and `denols` are attached to your current buffer. To resolve this, make sure to set some unique `root_dir` for both `ts_ls` and `denols`. You may also need to set `single_file_support` to `false` for `ts_ls` to prevent it from running in `single file mode`.
Here is an example of such a configuration: ```lua local nvim_lsp = require('lspconfig') nvim_lsp.denols.setup { on_attach = on_attach, root_dir = nvim_lsp.util.root_pattern("deno.json", "deno.jsonc"), } nvim_lsp.ts_ls.setup { on_attach = on_attach, root_dir = nvim_lsp.util.root_pattern("package.json"), single_file_support = false } ``` For Deno, the example above assumes a `deno.json` or `deno.jsonc` file exists at the root of the project. ##### Kickstart.nvim and Mason LSP If you are using [kickstart.nvim](https://github.com/nvim-lua/kickstart.nvim), add the above configuration like this inside the `servers` table of your `init.lua` configuration. ```lua local servers = { -- ... some configuration ts_ls = { root_dir = require("lspconfig").util.root_pattern({ "package.json", "tsconfig.json" }), single_file_support = false, settings = {}, }, denols = { root_dir = require("lspconfig").util.root_pattern({"deno.json", "deno.jsonc"}), single_file_support = false, settings = {}, }, } ``` #### coc.nvim Once you have [coc.nvim](https://github.com/neoclide/coc.nvim/wiki/Install-coc.nvim) installed, you need to install the required [coc-deno](https://github.com/fannheyward/coc-deno) via `:CocInstall coc-deno`. Once the plugin is installed, and you want to enable Deno for a workspace, run the command `:CocCommand deno.initializeWorkspace` and you should be able to utilize commands like `gd` (goto definition) and `gr` (go/find references). #### ALE ALE supports Deno via the Deno language server out of the box and in many use cases doesn't require additional configuration. Once you have [ALE installed](https://github.com/dense-analysis/ale#installation) you can perform the command [`:help ale-typescript-deno`](https://github.com/dense-analysis/ale/blob/master/doc/ale-typescript.txt) to get information on the configuration options available.
For more information on how to set up ALE (like key bindings) refer to the [official documentation](https://github.com/dense-analysis/ale#usage). #### Vim-EasyComplete Vim-EasyComplete supports Deno without any other configuration. Once you have [vim-easycomplete installed](https://github.com/jayli/vim-easycomplete#installation), you need to install deno via `:InstallLspServer deno` if you haven't installed it already. You can get more information from the [official documentation](https://github.com/jayli/vim-easycomplete). #### Vim-Lsp After installing vim-lsp through [vim-plug](https://github.com/prabirshrestha/vim-lsp?tab=readme-ov-file#installing) or vim packages, add this code to your `.vimrc` configuration: ```vim if executable('deno') let server_config = { \ 'name': 'deno', \ 'cmd': {server_info->['deno', 'lsp']}, \ 'allowlist': ['typescript', 'javascript', 'javascriptreact', 'typescriptreact'], \ } if exists('$DENO_ENABLE') let deno_enabled = $DENO_ENABLE == '1' let server_config['workspace_config'] = { 'deno': { 'enable': deno_enabled ? v:true : v:false } } endif au User lsp_setup call lsp#register_server(server_config) endif ``` There are two ways to enable the LSP server: have a `deno.json` or `deno.jsonc` in your current working directory, or force it with `DENO_ENABLE=1`. If you also want to highlight syntax in the IntelliSense tooltip, you can add this code to your `.vimrc` configuration too: ```vim let g:markdown_fenced_languages = ["ts=typescript"] ``` ### Emacs #### lsp-mode Emacs supports Deno via the Deno language server using [lsp-mode](https://emacs-lsp.github.io/lsp-mode/). Once [lsp-mode is installed](https://emacs-lsp.github.io/lsp-mode/page/installation/) it should support Deno, which can be [configured](https://emacs-lsp.github.io/lsp-mode/page/lsp-deno/) to support various settings. #### eglot You can also use the Deno language server through [`eglot`](https://github.com/joaotavora/eglot).
An example configuration for Deno via eglot: ```elisp (add-to-list 'eglot-server-programs '((js-mode typescript-mode) . (eglot-deno "deno" "lsp"))) (defclass eglot-deno (eglot-lsp-server) () :documentation "A custom class for deno lsp.") (cl-defmethod eglot-initialization-options ((server eglot-deno)) "Passes through required deno initialization options" (list :enable t :unstable t :typescript (:inlayHints (:variableTypes (:enabled t)) (:parameterTypes (:enabled t))))) ``` This is the equivalent of having the following settings in a VSCode `settings.json`: ```jsonc { "deno.enable": true, "deno.unstable": true, "typescript.inlayHints.variableTypes.enabled": true, "typescript.inlayHints.parameterTypes.enabled": true } ``` ### Pulsar The [Pulsar editor, formerly known as Atom](https://pulsar-edit.dev/) supports integrating with the Deno language server via the [atom-ide-deno](https://web.pulsar-edit.dev/packages/atom-ide-deno) package. `atom-ide-deno` requires the Deno CLI to be installed, as well as the [atom-ide-base](https://web.pulsar-edit.dev/packages/atom-ide-base) package. ### Sublime Text [Sublime Text](https://www.sublimetext.com/) supports connecting to the Deno language server via the [LSP package](https://packagecontrol.io/packages/LSP). You may also want to install the [TypeScript package](https://packagecontrol.io/packages/TypeScript) to get full syntax highlighting.
Once you have the LSP package installed, you will want to add configuration to your `.sublime-project` configuration like the below: ```jsonc { "settings": { "LSP": { "deno": { "command": ["deno", "lsp"], "initializationOptions": { // "config": "", // Sets the path for the config file in your project "enable": true, // "importMap": "", // Sets the path for the import-map in your project "lint": true, "unstable": false }, "enabled": true, "languages": [ { "languageId": "javascript", "scopes": ["source.js"], "syntaxes": [ "Packages/Babel/JavaScript (Babel).sublime-syntax", "Packages/JavaScript/JavaScript.sublime-syntax" ] }, { "languageId": "javascriptreact", "scopes": ["source.jsx"], "syntaxes": [ "Packages/Babel/JavaScript (Babel).sublime-syntax", "Packages/JavaScript/JavaScript.sublime-syntax" ] }, { "languageId": "typescript", "scopes": ["source.ts"], "syntaxes": [ "Packages/TypeScript-TmLanguage/TypeScript.tmLanguage", "Packages/TypeScript Syntax/TypeScript.tmLanguage" ] }, { "languageId": "typescriptreact", "scopes": ["source.tsx"], "syntaxes": [ "Packages/TypeScript-TmLanguage/TypeScriptReact.tmLanguage", "Packages/TypeScript Syntax/TypeScriptReact.tmLanguage" ] } ] } } } } ``` ### Nova The [Nova editor](https://nova.app) can integrate the Deno language server via the [Deno extension](https://extensions.panic.com/extensions/co.gwil/co.gwil.deno/). ### GitHub Codespaces [GitHub Codespaces](https://github.com/features/codespaces) allows you to develop fully online or remotely on your local machine without needing to configure or install Deno. It is currently in early access. If a project is a Deno enabled project and contains the `.devcontainer` configuration as part of the repository, opening the project in GitHub Codespaces should just "work". 
If you are starting a new project, or you want to add Deno support to an existing codespace, it can be added by selecting `Codespaces: Add Development Container Configuration Files...` from the command palette, then selecting `Show All Definitions...` and searching for the `Deno` definition. Once selected, you will need to rebuild your container so that the Deno CLI is added to the container. After the container is rebuilt, the codespace will support Deno. ### Kakoune [Kakoune](https://kakoune.org/) supports connecting to the Deno language server via the [kak-lsp](https://github.com/kak-lsp/kak-lsp) client. Once [kak-lsp is installed](https://github.com/kak-lsp/kak-lsp#installation), an example of configuring it to connect to the Deno language server is by adding the following to your `kak-lsp.toml`: ```toml [language.typescript] filetypes = ["typescript", "javascript"] roots = [".git"] command = "deno" args = ["lsp"] [language.typescript.settings.deno] enable = true lint = true ``` ### Helix [Helix](https://helix-editor.com) comes with built-in language server support. Enabling connection to the Deno language server requires changes in the `languages.toml` configuration file. ```toml [[language]] name = "typescript" roots = ["deno.json", "deno.jsonc", "package.json"] file-types = ["ts", "tsx"] auto-format = true language-servers = ["deno-lsp"] [[language]] name = "javascript" roots = ["deno.json", "deno.jsonc", "package.json"] file-types = ["js", "jsx"] auto-format = true language-servers = ["deno-lsp"] [language-server.deno-lsp] command = "deno" args = ["lsp"] config.deno.enable = true ``` ### Zed The [Zed editor](https://zed.dev) can integrate the Deno language server via the [Deno extension](https://zed.dev/extensions?query=deno&filter=language-servers). ## Shell completions Built into the Deno CLI is support to generate shell completion information for the CLI itself.
Running `deno completions <shell>` will output the completion script to stdout. Currently supported shells: - bash - elvish - fish - powershell - zsh ### bash example Output the completions and add them to the environment: ```shell > deno completions bash > /usr/local/etc/bash_completion.d/deno.bash > source /usr/local/etc/bash_completion.d/deno.bash ``` ### PowerShell example Output the completions: ```shell > deno completions powershell >> $profile > .$profile ``` This will create a PowerShell profile at `$HOME\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1`, which will be run whenever you launch PowerShell. ### zsh example You should have a directory where the completions can be saved: ```shell > mkdir ~/.zsh ``` Then output the completions: ```shell > deno completions zsh > ~/.zsh/_deno ``` And ensure the completions get loaded in your `~/.zshrc`: ```shell fpath=(~/.zsh $fpath) autoload -Uz compinit compinit -u ``` If, after reloading your shell, completions are still not loading, you may need to remove `~/.zcompdump/` to delete previously generated completions and then run `compinit` to generate them again. ### zsh example with ohmyzsh and antigen [ohmyzsh](https://github.com/ohmyzsh/ohmyzsh) is a configuration framework for zsh and can make it easier to manage your shell configuration. [antigen](https://github.com/zsh-users/antigen) is a plugin manager for zsh. Create the directory to store the completions and output the completions: ```shell > mkdir ~/.oh-my-zsh/custom/plugins/deno > deno completions zsh > ~/.oh-my-zsh/custom/plugins/deno/_deno ``` Then your `.zshrc` might look something like this: ```shell source /path-to-antigen/antigen.zsh # Load the oh-my-zsh's library.
antigen use oh-my-zsh antigen bundle deno ``` ### fish example Output the completions to a `deno.fish` file into the completions directory in the fish config folder: ```shell > deno completions fish > ~/.config/fish/completions/deno.fish ``` ## Other tools If you are writing or supporting a community integration using the Deno language server, read more about [integrating with the Deno LSP](/runtime/reference/lsp_integration/), but also feel free to join our [Discord community](https://discord.gg/deno) in the `#dev-lsp` channel. --- # Where to get help > Guide to getting help with Deno. Find community resources, support channels, discussion forums, and how to engage with the Deno community for troubleshooting and assistance. URL: https://docs.deno.com/runtime/help Stuck? Lost? Get Help from the Deno Community. ## [Community Discord](https://discord.gg/deno) Ask questions and chat with community members in real-time. ## [Stack Overflow](https://stackoverflow.com/questions/tagged/deno) Stack Overflow is a popular forum to ask code-level questions or if you're stuck with a specific error. [Ask your own!](https://stackoverflow.com/questions/ask?tags=deno) ## [DEV's Deno Community](https://dev.to/t/deno) A great place to find interesting articles about best practices, application architecture and new learnings. Post your articles with the tag `deno`. ## Examples and Tutorials Deno provides a wide range of examples and tutorials that might address your problem: - [Deno by Example](/examples/): Practical code snippets for common tasks --- # Getting Started > A step-by-step guide to getting started with Deno. Learn how to install Deno, create your first program, and understand the basics of this secure JavaScript, TypeScript, and WebAssembly runtime. 
URL: https://docs.deno.com/runtime/ [Deno](https://deno.com) ([/ˈdiːnoʊ/](https://ipa-reader.com/?text=%CB%88di%CB%90no%CA%8A), pronounced `dee-no`) is an [open source](https://github.com/denoland/deno/blob/main/LICENSE.md) JavaScript, TypeScript, and WebAssembly runtime with secure defaults and a great developer experience. It's built on [V8](https://v8.dev/), [Rust](https://www.rust-lang.org/), and [Tokio](https://tokio.rs/). Let's create and run your first Deno program in under five minutes. ## Install Deno Install the Deno runtime on your system using one of the terminal commands below. ```sh curl -fsSL https://deno.land/install.sh | sh ``` In Windows PowerShell: ```powershell irm https://deno.land/install.ps1 | iex ``` [Additional installation options can be found here](/runtime/getting_started/installation/). After installation, you should have the `deno` executable available on your system path. You can verify the installation by running: ```sh deno --version ``` ## Hello World Deno can run JavaScript and [TypeScript](https://www.typescriptlang.org/) with no additional tools or configuration required. Let's create a simple "hello world" program and run it with Deno. Create a TypeScript or JavaScript file called `main.ts` (or `main.js`) and include the following code: ```ts title="main.ts" function greet(name: string): string { return `Hello, ${name}!`; } console.log(greet("world")); ``` ```js title="main.js" function greet(name) { return `Hello, ${name}!`; } console.log(greet("world")); ``` Save the file and run it with Deno: ```sh $ deno main.ts Hello, world! ``` ```sh $ deno main.js Hello, world! ``` ## Next Steps Congratulations! You've just run your first Deno program. Read on to learn more about the Deno runtime.
- [Making a Deno project](/runtime/getting_started/first_project/) - [Setting up your environment](/runtime/getting_started/setup_your_environment/) - [Using the CLI](/runtime/getting_started/command_line_interface) --- # deno add > Add and manage project dependencies with Deno. URL: https://docs.deno.com/runtime/reference/cli/add --- # `deno bench`, benchmarking tool > Run benchmarks using Deno's built-in bench tool. URL: https://docs.deno.com/runtime/reference/cli/bench ## Quickstart Firstly, let's create a file `url_bench.ts` and register a bench using the `Deno.bench()` function. ```ts // url_bench.ts Deno.bench("URL parsing", () => { new URL("https://deno.land"); }); ``` Secondly, run the benchmark using the `deno bench` subcommand. ```sh deno bench url_bench.ts cpu: Apple M1 Max runtime: deno 1.21.0 (aarch64-apple-darwin) file:///dev/deno/url_bench.ts benchmark time (avg) (min … max) p75 p99 p995 --------------------------------------------------- ----------------------------- URL parsing 17.29 µs/iter (16.67 µs … 153.62 µs) 17.25 µs 18.92 µs 22.25 µs ``` ## Writing benchmarks To define a benchmark you need to register it with a call to the `Deno.bench` API. There are multiple overloads of this API to allow for the greatest flexibility and easy switching between the forms (eg. when you need to quickly focus a single bench for debugging, using the `only: true` option): ```ts // Compact form: name and function Deno.bench("hello world #1", () => { new URL("https://deno.land"); }); // Compact form: named function. Deno.bench(function helloWorld3() { new URL("https://deno.land"); }); // Longer form: bench definition. Deno.bench({ name: "hello world #2", fn: () => { new URL("https://deno.land"); }, }); // Similar to compact form, with additional configuration as a second argument. Deno.bench("hello world #4", { permissions: { read: true } }, () => { new URL("https://deno.land"); }); // Similar to longer form, with bench function as a second argument. 
Deno.bench( { name: "hello world #5", permissions: { read: true } }, () => { new URL("https://deno.land"); }, ); // Similar to longer form, with a named bench function as a second argument. Deno.bench({ permissions: { read: true } }, function helloWorld6() { new URL("https://deno.land"); }); ``` ### Async functions You can also bench asynchronous code by passing a bench function that returns a promise. For this you can use the `async` keyword when defining a function: ```ts Deno.bench("async hello world", async () => { await 1; }); ``` ### Critical sections Sometimes the benchmark case needs to include setup and teardown code that would taint the benchmark results. For example, if you want to measure how long it takes to read a small file, you need to open the file, read it, and then close it. If the file is small enough, the time it takes to open and close the file might outweigh the time it takes to read the file itself. To help with such situations, you can use `Deno.BenchContext.start` and `Deno.BenchContext.end` to tell the benchmarking tool about the critical section you want to measure. Everything outside of the section between these two calls will be excluded from the measurement. ```ts Deno.bench("foo", async (b) => { // Open a file that we will act upon. using file = await Deno.open("a_big_data_file.txt"); // Tell the benchmarking tool that this is the only section you want // to measure. b.start(); // Now let's measure how long it takes to read all of the data from the file. await new Response(file.readable).arrayBuffer(); // End measurement here. b.end(); }); ``` The above example requires the `--allow-read` flag to run the benchmark: `deno bench --allow-read file_reading.ts`.
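The idea behind `b.start()`/`b.end()` can be sketched in plain code: only the region between the two timestamps counts toward the result, while setup cost is excluded. This is an illustrative sketch of the concept — `timeCritical` is a hypothetical helper, not a Deno API:

```typescript
// Illustrative: measure only the critical section, excluding setup.
function timeCritical<T>(setup: () => T, work: (input: T) => void): number {
  const input = setup(); // setup cost is NOT measured
  const t0 = performance.now(); // roughly what b.start() marks
  work(input);
  const t1 = performance.now(); // roughly what b.end() marks
  return t1 - t0; // elapsed time of the critical section only
}

// Example: time summing an array, but not the cost of building it.
const elapsed = timeCritical(
  () => Array.from({ length: 1_000 }, (_, i) => i),
  (arr) => arr.reduce((a, b) => a + b, 0),
);
console.log(`critical section took ${elapsed.toFixed(3)} ms`);
```

Unlike this one-shot sketch, `deno bench` runs the critical section many times and reports statistics over all iterations.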
## Grouping and baselines When registering a bench case, it can be assigned to a group using the `Deno.BenchDefinition.group` option: ```ts // url_bench.ts Deno.bench("url parse", { group: "url" }, () => { new URL("https://deno.land"); }); ``` It is useful to assign several cases to a single group and compare how they perform against a "baseline" case. In this example we'll check how `Date.now()` performs compared to `performance.now()`; to do that, we'll mark the first case as a "baseline" using the `Deno.BenchDefinition.baseline` option: ```ts // time_bench.ts Deno.bench("Date.now()", { group: "timing", baseline: true }, () => { Date.now(); }); Deno.bench("performance.now()", { group: "timing" }, () => { performance.now(); }); ``` ```shellsession $ deno bench time_bench.ts cpu: Apple M1 Max runtime: deno 1.21.0 (aarch64-apple-darwin) file:///dev/deno/time_bench.ts benchmark time (avg) (min … max) p75 p99 p995 --------------------------------------------------------- ----------------------------- Date.now() 125.24 ns/iter (118.98 ns … 559.95 ns) 123.62 ns 150.69 ns 156.63 ns performance.now() 2.67 µs/iter (2.64 µs … 2.82 µs) 2.67 µs 2.82 µs 2.82 µs summary Date.now() 21.29x times faster than performance.now() ``` You can specify multiple groups in the same file. ## Running benchmarks To run a benchmark, call `deno bench` with the file that contains your bench function. You can also omit the file name, in which case all benchmarks in the current directory (recursively) that match the glob `{*_,*.,}bench.{ts, tsx, mts, js, mjs, jsx}` will be run. If you pass a directory, all files in the directory that match this glob will be run.
The glob expands to: - files named `bench.{ts, tsx, mts, js, mjs, jsx}`, - or files ending with `.bench.{ts, tsx, mts, js, mjs, jsx}`, - or files ending with `_bench.{ts, tsx, mts, js, mjs, jsx}` ```shell # Run all benches in the current directory and all sub-directories deno bench # Run all benches in the util directory deno bench util/ # Run just my_bench.ts deno bench my_bench.ts ``` > ⚠️ If you want to pass additional CLI arguments to the bench files, use `--` to > inform Deno that remaining arguments are script arguments. ```shell # Pass additional arguments to the bench file deno bench my_bench.ts -- -e --foo --bar ``` `deno bench` uses the same permission model as `deno run` and therefore will require, for example, `--allow-write` to write to the file system during benching. To see all runtime options with `deno bench`, you can reference the command line help: ```shell deno help bench ``` ## Filtering There are a number of options to filter the benches you are running. ### Command line filtering Benches can be run individually or in groups using the command line `--filter` option. The filter flags accept a string or a pattern as their value. Assuming the following benches: ```ts Deno.bench({ name: "my-bench", fn: () => {/* bench function zero */}, }); Deno.bench({ name: "bench-1", fn: () => {/* bench function one */}, }); Deno.bench({ name: "bench2", fn: () => {/* bench function two */}, }); ``` This command will run all of these benches because they all contain the word "bench". ```shell deno bench --filter "bench" benchmarks/ ``` On the flip side, the following command uses a pattern and will run the second and third benchmarks. ```shell deno bench --filter "/bench-*\d/" benchmarks/ ``` _To let Deno know that you want to use a pattern, wrap your filter with forward-slashes like the JavaScript syntactic sugar for a regex._ ### Bench definition filtering Within the benches themselves, you have two options for filtering.
#### Filtering out (ignoring these benches) Sometimes you want to ignore benches based on some sort of condition (for example, you only want a benchmark to run on Windows). For this you can use the `ignore` boolean in the bench definition. If it is set to true the bench will be skipped. ```ts Deno.bench({ name: "bench windows feature", ignore: Deno.build.os !== "windows", fn() { // do windows feature }, }); ``` #### Filtering in (only run these benches) Sometimes you may be in the middle of a performance problem within a large bench suite and you would like to focus on just that single bench and ignore the rest for now. For this you can use the `only` option to tell the benchmark harness to only run benches with this set to true. Multiple benches can set this option. While the benchmark run will report on the success or failure of each bench, the overall benchmark run will always fail if any bench is flagged with `only`, as this is a temporary measure only which disables nearly all of your benchmarks. ```ts Deno.bench({ name: "Focus on this bench only", only: true, fn() { // bench complicated stuff }, }); ``` ## JSON output To retrieve the output as JSON, use the `--json` flag: ``` $ deno bench --json bench_me.js { "runtime": "Deno/1.31.0 x86_64-apple-darwin", "cpu": "Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz", "benches": [ { "origin": "file:///dev/bench_me.js", "group": null, "name": "Deno.UnsafePointerView#getUint32", "baseline": false, "result": { "ok": { "n": 49, "min": 1251.9348, "max": 1441.2696, "avg": 1308.7523755102038, "p75": 1324.1055, "p99": 1441.2696, "p995": 1441.2696, "p999": 1441.2696 } } } ] } ``` --- # Bundler (deprecated) URL: https://docs.deno.com/runtime/reference/cli/bundle :::caution `deno bundle` has been deprecated and will be removed in some future release. Use [deno_emit](https://github.com/denoland/deno_emit), [esbuild](https://esbuild.github.io/) or [rollup](https://rollupjs.org) instead.
::: `deno bundle [URL]` will output a single JavaScript file for consumption in Deno, which includes all dependencies of the specified input. For example: ```bash $ deno bundle https://deno.land/std@0.190.0/examples/colors.ts colors.bundle.js Bundle https://deno.land/std@0.190.0/examples/colors.ts Download https://deno.land/std@0.190.0/examples/colors.ts Download https://deno.land/std@0.190.0/fmt/colors.ts Emit "colors.bundle.js" (9.83KB) ``` If you omit the out file, the bundle will be sent to `stdout`. The bundle can be run just like any other module in Deno: ```bash deno run colors.bundle.js ``` The output is a self-contained ES module, where any exports from the main module supplied on the command line will be available. For example, if the main module looked something like this: ```ts export { foo } from "./foo.js"; export const bar = "bar"; ``` It could be imported like this: ```ts import { bar, foo } from "./lib.bundle.js"; ``` ## Bundling for the Web The output of `deno bundle` is intended for consumption in Deno and not for use in a web browser or other runtimes. That said, depending on the input it may work in other environments. If you wish to bundle for the web, we recommend other solutions such as [esbuild](https://esbuild.github.io/). --- # deno check > Download and type-check code without execution URL: https://docs.deno.com/runtime/reference/cli/check ## Example Type-check without execution. ```ts title="example.ts" const x: string = 1 + 1n; ``` ```bash deno check example.ts ``` --- # deno clean > Remove cached dependencies for a clean start URL: https://docs.deno.com/runtime/reference/cli/clean --- # `deno compile`, standalone executables > Compile your code into a standalone executable URL: https://docs.deno.com/runtime/reference/cli/compile ## Flags As with [`deno install`](/runtime/reference/cli/install/), the runtime flags used to execute the script must be specified at compilation time. This includes permission flags.
```sh deno compile --allow-read --allow-net jsr:@std/http/file-server ``` [Script arguments](/runtime/getting_started/command_line_interface/#passing-script-arguments) can be partially embedded. ```console deno compile --allow-read --allow-net jsr:@std/http/file-server -p 8080 ./file_server --help ``` ## Cross Compilation You can cross-compile binaries for other platforms by using the `--target` flag. ``` # Cross compile for Apple Silicon deno compile --target aarch64-apple-darwin main.ts # Cross compile for Windows with an icon deno compile --target x86_64-pc-windows-msvc --icon ./icon.ico main.ts ``` ### Supported Targets Deno supports cross compiling to all targets regardless of the host platform. | OS | Architecture | Target | | ------- | ------------ | --------------------------- | | Windows | x86_64 | `x86_64-pc-windows-msvc` | | macOS | x86_64 | `x86_64-apple-darwin` | | macOS | ARM64 | `aarch64-apple-darwin` | | Linux | x86_64 | `x86_64-unknown-linux-gnu` | | Linux | ARM64 | `aarch64-unknown-linux-gnu` | ## Icons It is possible to add an icon to the executable by using the `--icon` flag when targeting Windows. The icon must be in the `.ico` format. ``` deno compile --icon icon.ico main.ts # Cross compilation with icon deno compile --target x86_64-pc-windows-msvc --icon ./icon.ico main.ts ``` ## Dynamic Imports By default, statically analyzable dynamic imports (imports that have a string literal within the `import("...")` call expression) will be included in the output. ```ts // calculator.ts and its dependencies will be included in the binary const calculator = await import("./calculator.ts"); ``` But non-statically analyzable dynamic imports won't: ```ts const specifier = condition ? "./calc.ts" : "./better_calc.ts"; const calculator = await import(specifier); ``` To include non-statically analyzable dynamic imports, specify an `--include <path>` flag.
```shell deno compile --include calc.ts --include better_calc.ts main.ts ``` ## Including Data Files or Directories Starting in Deno 2.1, you can include files or directories in the executable by specifying them via the `--include <path>` flag. ```shell deno compile --include names.csv --include data main.ts ``` Then read the file relative to the directory path of the current module via `import.meta.dirname`: ```ts // main.ts const names = Deno.readTextFileSync(import.meta.dirname + "/names.csv"); const dataFiles = Deno.readDirSync(import.meta.dirname + "/data"); // use names and dataFiles here ``` Note this currently only works for files on the file system and not remote files. ## Workers Similarly to non-statically analyzable dynamic imports, code for [workers](../web_platform_apis/#web-workers) is not included in the compiled executable by default. There are two ways to include workers: 1. Use the `--include <path>` flag to include the worker code. ```shell deno compile --include worker.ts main.ts ``` 2. Import the worker module using a statically analyzable import. ```ts // main.ts import "./worker.ts"; ``` ```shell deno compile main.ts ``` ## Code Signing ### macOS By default, on macOS, the compiled executable will be signed using an ad-hoc signature which is the equivalent of running `codesign -s -`: ```shell $ deno compile -o main main.ts $ codesign --verify -vv ./main ./main: valid on disk ./main: satisfies its Designated Requirement ``` You can specify a signing identity when code signing the executable just like you would do with any other macOS executable: ```shell codesign -s "Developer ID Application: Your Name" ./main ``` Refer to the [official documentation](https://developer.apple.com/documentation/security/notarizing-macos-software-before-distribution) for more information on codesigning and notarization on macOS. ### Windows On Windows, the compiled executable can be signed using the `SignTool.exe` utility.
```shell $ deno compile -o main.exe main.ts $ signtool sign /fd SHA256 main.exe ``` ## Unavailable in executables - [Web Storage API](/runtime/reference/web_platform_apis/#web-storage) - [Web Cache](/api/web/~/Cache) --- # deno completions > Generate shell completions for Deno URL: https://docs.deno.com/runtime/reference/cli/completions You can use the output script to configure autocompletion for `deno` commands. For example: `deno un` -> Tab -> `deno uninstall`. ## Examples ### Configure Bash shell completion ```bash deno completions bash > deno.bash if [ -d "/usr/local/etc/bash_completion.d/" ]; then sudo mv deno.bash /usr/local/etc/bash_completion.d/ source /usr/local/etc/bash_completion.d/deno.bash elif [ -d "/usr/share/bash-completion/completions/" ]; then sudo mv deno.bash /usr/share/bash-completion/completions/ source /usr/share/bash-completion/completions/deno.bash else echo "Please move deno.bash to the appropriate bash completions directory" fi ``` ### Configure PowerShell shell completion ```bash deno completions powershell | Out-String | Invoke-Expression ``` ### Configure zsh shell completion First add the following to your `.zshrc` file: ```bash fpath=(~/.zsh/completion $fpath) autoload -U compinit compinit ``` Then run the following commands: ```bash deno completions zsh > _deno mv _deno ~/.zsh/completion/_deno autoload -U compinit && compinit ``` ### Configure fish shell completion ```bash deno completions fish > completions.fish chmod +x ./completions.fish ``` --- # deno coverage > Generate a coverage report for your code URL: https://docs.deno.com/runtime/reference/cli/coverage ## Inclusions and Exclusions By default, coverage includes any of your code that exists on the local file system, and its imports. You can customize the inclusions and exclusions by using the `--include` and `--exclude` options.
You can expand the coverage to include files that are not on the local file system by using the `--include` option and customizing the regex pattern. ```bash deno coverage --include="^file:|https:" ``` The default inclusion pattern should be sufficient for most use cases, but you can customize it to be more specific about which files are included in your coverage report. Files that contain `test.js`, `test.mjs`, `test.ts`, `test.jsx`, or `test.tsx` in their name are excluded by default. This is equivalent to: ```bash deno coverage --exclude="test\.(js|mjs|ts|jsx|tsx)$" ``` This default setting prevents your test code from contributing to your coverage report. For a URL to match, it must match the include pattern and not match the exclude pattern. ## Ignoring Code Code can be ignored in generated coverage reports by adding coverage ignore comments. Branches and lines in ignored code will be excluded from the report. Ignored branches and lines do not count as covered lines. Instead, ignored lines of code are treated as empty lines. To ignore an entire file, add a `// deno-coverage-ignore-file` comment at the top of the file. ```ts // deno-coverage-ignore-file // all code in this file is ignored ``` Ignored files will not appear in the coverage report. To ignore a single line, add a `// deno-coverage-ignore` comment on the line above the code you want to ignore. ```ts // deno-coverage-ignore console.log("this line is ignored"); ``` To ignore multiple lines, add a `// deno-coverage-ignore-start` comment above the code you want to ignore and a `// deno-coverage-ignore-stop` comment below. ```ts // deno-coverage-ignore-start if (condition) { console.log("both the branch and lines are ignored"); } // deno-coverage-ignore-stop ``` All code after a `// deno-coverage-ignore-start` comment is ignored until a `// deno-coverage-ignore-stop` is reached.
Each `// deno-coverage-ignore-start` comment must be terminated by a `// deno-coverage-ignore-stop` comment, and ignored ranges may not be nested. When these requirements are not met, some lines may be unintentionally included in the coverage report. The `deno coverage` command will log warnings for any invalid comments. ```ts // deno-coverage-ignore-start if (condition) { // deno-coverage-ignore-start - A warning will be logged because the previous // coverage range is unterminated. console.log("this code is ignored"); // deno-coverage-ignore-stop } // deno-coverage-ignore-stop // ... // deno-coverage-ignore-start - This comment will be ignored and a warning will // be logged, because this range is unterminated. console.log("this code is not ignored"); ``` Only white space may precede the coverage directive in a coverage comment. However, any text may trail the directive. ```ts // deno-coverage-ignore Trailing text is allowed. console.log("This line is ignored"); // But leading text isn't. deno-coverage-ignore console.log("This line is not ignored"); ``` Coverage comments must start with `//`. Comments starting with `/*` are not valid coverage comments. ```ts // deno-coverage-ignore console.log("This line is ignored"); /* deno-coverage-ignore */ console.log("This line is not ignored"); ``` ## Output Formats By default, we support Deno's own coverage format, but you can also output coverage reports in the [lcov format](https://github.com/linux-test-project/lcov?tab=readme-ov-file) (a standard file format used to describe code coverage data), or in HTML. ```bash deno coverage --lcov --output=cov.lcov ``` This lcov file can be used with other tools that support the lcov format.
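For reference, an lcov trace is plain text; a minimal record for a single source file might look roughly like this (illustrative path and counts):

```
SF:/src/main.ts
DA:1,5
DA:2,0
LF:2
LH:1
end_of_record
```

`SF` names the source file, each `DA:<line>,<count>` entry records how many times a line executed, and `LF`/`LH` summarize the lines found and lines hit.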
```bash deno coverage --html ``` This will output a coverage report as an HTML file. ## Examples Generate a coverage report from the default coverage profile in your workspace ```bash deno test --coverage deno coverage ``` Generate a coverage report from a coverage profile with a custom name ```bash deno test --coverage=custom_profile_name deno coverage custom_profile_name ``` > Note: You can alternatively set the coverage directory with the `DENO_COVERAGE_DIR` env > var. > > ``` > DENO_COVERAGE_DIR=custom_profile_name deno test > deno coverage custom_profile_name > ``` Only include coverage that matches a specific pattern - in this case, only include tests from main.ts ```bash deno coverage --include="main.ts" ``` Export test coverage from the default coverage profile to an lcov file ```bash deno test --coverage deno coverage --lcov --output=cov.lcov ``` --- # `deno doc`, documentation generator > Generate documentation from your code URL: https://docs.deno.com/runtime/reference/cli/doc ## Examples `deno doc` followed by a list of one or more source files will print the JSDoc documentation for each of the module's **exported** members. For example, given a file `add.ts` with the contents: ```ts /** * Adds x and y. * @param {number} x * @param {number} y * @returns {number} Sum of x and y */ export function add(x: number, y: number): number { return x + y; } ``` Running the `deno doc` command prints the function's JSDoc comment to `stdout`: ```shell deno doc add.ts function add(x: number, y: number): number Adds x and y. @param {number} x @param {number} y @returns {number} Sum of x and y ``` ## Linting You can use the `--lint` flag to check for problems in your documentation while it's being generated. `deno doc` will point out three kinds of problems: 1. Error for an exported type from the root module referencing a non-exported type. - Ensures API consumers have access to all the types the API uses.
This can be suppressed by exporting the type from a root module (one of the files specified to `deno doc` on the command line) or by marking the type with an `@internal` jsdoc tag. 1. Error for a missing return type or property type on a **public** type. - Ensures `deno doc` displays the return/property type and helps improve type checking performance. 1. Error for a missing JSDoc comment on a **public** type. - Ensures the code is documented. Can be suppressed by adding a JSDoc comment, or via an `@ignore` jsdoc tag to exclude it from the documentation. Alternatively, add an `@internal` tag to keep it in the docs, but signify it's internal. For example: ```ts title="/mod.ts" interface Person { name: string; // ... } export function getName(person: Person) { return person.name; } ``` ```shell $ deno doc --lint mod.ts Type 'getName' references type 'Person' which is not exported from a root module. Missing JS documentation comment. Missing return type. at file:///mod.ts:6:1 ``` These lints are meant to help you write better documentation and speed up type-checking in your projects. If any problems are found, the program exits with a non-zero exit code and the output is reported to standard error. ## Supported JSDoc features and tags Deno implements a large set of JSDoc tags, but does not strictly adhere to the JSDoc standard; rather, it aligns with sensible standards and features provided by widely used tools and ecosystems in the same feature-space, like [TSDoc](https://tsdoc.org/) and [TypeDoc](https://typedoc.org/). Free-form text fields (i.e. the main description of a JSDoc comment, the description of a parameter, etc.) accept markdown. ### Supported Tags The following tags are supported, being a selection of tags used and specified by JSDoc, TSDoc and TypeDoc: - [`constructor`/`class`](https://jsdoc.app/tags-class): mark a function to be a constructor. - [`ignore`](https://jsdoc.app/tags-ignore): exclude a symbol from the output.
- `internal`: mark a symbol as internal-only. In the HTML generator, the symbol will not get a listed entry, however it will still be generated and can be reached if a non-internal symbol links to it. - [`public`](https://jsdoc.app/tags-public): treat a symbol as public API. Equivalent of the TypeScript `public` keyword. - [`private`](https://jsdoc.app/tags-private): treat a symbol as private API. Equivalent of the TypeScript `private` keyword. - [`protected`](https://jsdoc.app/tags-protected): treat a property or method as protected API. Equivalent of the TypeScript `protected` keyword. - [`readonly`](https://jsdoc.app/tags-readonly): mark a symbol as readonly, meaning that it cannot be overwritten. - [`experimental`](https://tsdoc.org/pages/tags/experimental): mark a symbol as experimental, meaning that the API might change or be removed, or behaviour is not well-defined. - [`deprecated`](https://jsdoc.app/tags-deprecated): mark a symbol as deprecated, meaning that it is not supported anymore and might be removed in a future version. - [`module`](https://jsdoc.app/tags-module): this tag can be defined on a top-level JSDoc comment, which treats that comment as applying to the file instead of the subsequent symbol. A value can be specified, which will be used as an identifier for the module (i.e. for default exports). - `category`/`group`: mark a symbol to be of a specific category/group. This is useful for grouping various symbols together. - [`see`](https://jsdoc.app/tags-see): define an external reference related to the symbol. - [`example`](https://jsdoc.app/tags-example): define an example for the symbol. Unlike JSDoc, code examples need to be wrapped in triple backticks (markdown-style codeblocks), which aligns more with TSDoc than JSDoc. - `tags`: define additional custom labels for a symbol, via a comma-separated list. - [`since`](https://jsdoc.app/tags-since): define since when the symbol has been available.
- [`callback`](https://jsdoc.app/tags-callback): define a callback. - [`template`/`typeparam`/`typeParam`](https://tsdoc.org/pages/tags/typeparam): define a generic parameter. - [`prop`/`property`](https://jsdoc.app/tags-property): define a property on a symbol. - [`typedef`](https://jsdoc.app/tags-typedef): define a type. - [`param`/`arg`/`argument`](https://jsdoc.app/tags-param): define a parameter on a function. - [`return`/`returns`](https://jsdoc.app/tags-returns): define the return type and/or comment of a function. - [`throws`/`exception`](https://jsdoc.app/tags-throws): define what a function throws when called. - [`enum`](https://jsdoc.app/tags-enum): define an object to be an enum. - [`extends`/`augments`](https://jsdoc.app/tags-augments): define a type that a symbol extends. - [`this`](https://jsdoc.app/tags-this): define what the `this` keyword refers to in a function. - [`type`](https://jsdoc.app/tags-type): define the type of a symbol. - [`default`](https://jsdoc.app/tags-default): define the default value for a variable, property or field. ### Inline Linking Inline links let you specify links to other parts of the page, other symbols, or modules. Besides just supporting markdown-style links, [JSDoc style inline-links](https://jsdoc.app/tags-inline-link) are also supported. For example, you can do `{@link https://docs.deno.com}`, which will be rendered as the following: 'https://docs.deno.com'. `{@linkcode https://docs.deno.com}` can also be used to render it in a monospace font, roughly like this: '`https://docs.deno.com`'. You can also specify a replacement label, via `{@link https://docs.deno.com | Deno Docs}`, which will use the text after `|` as the text to display instead of the link. The previous example would render as '[Deno Docs](https://docs.deno.com)'. You can add inline links in your descriptions to other symbols via `{@link MySymbol}`.
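Putting these pieces together, a small sketch (with hypothetical symbol names) of inline links inside a doc comment:

```typescript
/**
 * Builds a profile string for a user.
 *
 * See {@link https://docs.deno.com | Deno Docs} for more on doc comments,
 * and {@linkcode getName} for a related helper.
 */
export function getProfile(id: number): string {
  return `profile-${id}`;
}

/** Returns an upper-cased display name. See {@link getProfile}. */
export function getName(profile: string): string {
  return profile.toUpperCase();
}

console.log(getProfile(42)); // "profile-42"
```

When rendered by `deno doc`, the `{@link}` tags become hyperlinks and `{@linkcode}` is shown in a monospace font.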
For module linking, the same applies, but you use the `{@link [myModule]}` syntax. You can also link to symbols in a different module via `{@link [myModule].mysymbol}`. ## HTML output Use the `--html` flag to generate a static site with documentation. ```console $ deno doc --html --name="My library" ./mod.ts $ deno doc --html --name="My library" --output=./documentation/ ./mod.ts $ deno doc --html --name="My library" ./sub1/mod.ts ./sub2/mod.ts ``` The generated documentation is a static site with multiple pages that can be deployed to any static site hosting service. A client-side search is included in the generated site, but it is not available if the user's browser has JavaScript disabled. ## JSON output Use the `--json` flag to output the documentation in JSON format. This JSON format is consumed by the [deno doc website](https://github.com/denoland/docland) and is used to generate module documentation. --- # Configuring Deno behavior URL: https://docs.deno.com/runtime/reference/cli/env_variables There are several environment variables which can impact the behavior of Deno: ### DENO_AUTH_TOKENS A list of authorization tokens which can be used to allow Deno to access remote private code. See the [Private modules and repositories](/runtime/reference/private_repositories) section for more details. ### DENO_TLS_CA_STORE A list of certificate stores which will be used when establishing TLS connections. The available stores are `mozilla` and `system`. You can specify one, both or none. Certificate chains attempt to resolve in the same order in which you specify them. The default value is `mozilla`. The `mozilla` store will use the bundled Mozilla certs provided by [`webpki-roots`](https://crates.io/crates/webpki-roots). The `system` store will use your platform's [native certificate store](https://crates.io/crates/rustls-native-certs). The exact set of Mozilla certs will depend on the version of Deno you are using.
If you specify no certificate stores, then no trust will be given to any TLS connection without also specifying `DENO_CERT` or `--cert` or specifying a specific certificate per TLS connection. ### DENO_CERT Load a certificate authority from a PEM encoded file. This "overrides" the `--cert` option. See the [Proxies](#proxies) section for more information. ### DENO_DIR This will set the directory where cached information from the CLI is stored. This includes items like cached remote modules, cached transpiled modules, language server cache information and persisted data from local storage. This defaults to the operating system's default cache location and then under the `deno` path. ### DENO_INSTALL_ROOT The directory where scripts installed via `deno install` are stored. This defaults to `$HOME/.deno/bin`. ### DENO_NO_PACKAGE_JSON Set to disable auto-resolution of package.json files. ### DENO_NO_PROMPT Set to disable permission prompts on access (alternative to passing `--no-prompt` on invocation). ### DENO_NO_UPDATE_CHECK Set to disable checking if a newer Deno version is available. ### DENO_WEBGPU_TRACE The directory to use for WebGPU traces. ### HTTP_PROXY The proxy address to use for HTTP requests. See the [Proxies](#proxies) section for more information. ### HTTPS_PROXY The proxy address to use for HTTPS requests. See the [Proxies](#proxies) section for more information. ### NO_COLOR If set, this will prevent the Deno CLI from sending ANSI color codes when writing to stdout and stderr. See the website [https://no-color.org](https://no-color.org/) for more information on this _de facto_ standard. The value of this flag can be accessed at runtime without permission to read the environment variables by checking the value of `Deno.noColor`. ### NO_PROXY Indicates hosts which should bypass the proxy set in the other environment variables. See the [Proxies](#proxies) section for more information.
### NPM_CONFIG_REGISTRY The npm registry to use when loading modules via [npm specifiers](/runtime/fundamentals/node/#using-npm-packages). ## Proxies Deno is able to handle network requests through a proxy server, useful for various reasons such as security, caching, or accessing resources behind a firewall. The runtime supports proxies for module downloads and the Web standard `fetch` API. Deno reads proxy configuration from environment variables: `HTTP_PROXY`, `HTTPS_PROXY` and `NO_PROXY`. On Windows, if environment variables are not found, Deno falls back to reading proxies from the registry. --- # deno eval > Evaluate JavaScript and TypeScript code in the command line URL: https://docs.deno.com/runtime/reference/cli/eval --- # `deno fmt`, code formatting > Format your code with Deno's built-in formatter URL: https://docs.deno.com/runtime/reference/cli/fmt ## Supported File Types Deno ships with a built-in code formatter that will auto-format the following files: | File Type | Extension | Notes | | -------------------- | ------------------------------------------------------ | -------------------------------------------------------------------------------------- | | JavaScript | `.js`, `.cjs`, `.mjs` | | | TypeScript | `.ts`, `.mts`, `.cts` | | | JSX | `.jsx` | | | TSX | `.tsx` | | | Markdown | `.md`, `.mkd`, `.mkdn`, `.mdwn`, `.mdown`, `.markdown` | | | JSON | `.json` | | | JSONC | `.jsonc` | | | CSS | `.css` | | | HTML | `.html` | | | [Nunjucks][Nunjucks] | `.njk` | | | [Vento][Vento] | `.vto` | | | YAML | `.yml`, `.yaml` | | | Sass | `.sass` | | | SCSS | `.scss` | | | LESS | `.less` | | | Jupyter Notebook | `.ipynb` | | | Astro | `.astro` | Requires `--unstable-component` flag or `"unstable": ["fmt-component"]` config option. | | Svelte | `.svelte` | Requires `--unstable-component` flag or `"unstable": ["fmt-component"]` config option. | | Vue | `.vue` | Requires `--unstable-component` flag or `"unstable": ["fmt-component"]` config option.
| | SQL | `.sql` | Requires `--unstable-sql` flag or `"unstable": ["fmt-sql"]` config option. | [Nunjucks]: https://mozilla.github.io/nunjucks/ [Vento]: https://github.com/ventojs/vento :::note **`deno fmt` can format code snippets in Markdown files.** Snippets must be enclosed in triple backticks and have a language attribute. ::: ## Ignoring Code ### JavaScript / TypeScript / JSONC Ignore formatting code by preceding it with a `// deno-fmt-ignore` comment: ```ts // deno-fmt-ignore export const identity = [ 1, 0, 0, 0, 1, 0, 0, 0, 1, ]; ``` Or ignore an entire file by adding a `// deno-fmt-ignore-file` comment at the top of the file. ### Markdown / HTML / CSS Ignore formatting the next item by preceding it with a `<!-- deno-fmt-ignore -->` comment: ```html
<!-- deno-fmt-ignore -->
<div>
    Hello there
</div>
``` To ignore a section of code, surround the code with `<!-- deno-fmt-ignore-start -->` and `<!-- deno-fmt-ignore-end -->` comments. Or ignore an entire file by adding a `<!-- deno-fmt-ignore-file -->` comment at the top of the file. ### YAML Ignore formatting the next item by preceding it with a `# deno-fmt-ignore` comment: ```yml # deno-fmt-ignore aaaaaa: bbbbbbb ``` ## More about linting and formatting For more information about linting and formatting in Deno, and the differences between these two utilities, visit the [Linting and Formatting](/runtime/fundamentals/linting_and_formatting/) page in our Fundamentals section. --- # Deno CLI Subcommands URL: https://docs.deno.com/runtime/reference/cli/ The Deno CLI (Command Line Interface) allows you to interact with the Deno runtime environment from your terminal or command prompt. The CLI has a number of subcommands that can be used to perform different tasks; check the links below for more information on each subcommand. ## Execution - [deno run](/runtime/reference/cli/run/) - run a script - [deno serve](/runtime/reference/cli/serve/) - run a web server - [deno task](/runtime/reference/cli/task/) - run a task - [deno repl](/runtime/reference/cli/repl/) - starts a read-eval-print-loop - [deno eval](/runtime/reference/cli/eval/) - evaluate provided script ## Dependency management - [deno add](/runtime/reference/cli/add) - add dependencies - deno cache - _(Deprecated.
Please use [deno install](/runtime/reference/cli/install/))_ - [deno install](/runtime/reference/cli/install/) - install a dependency or a script - [deno uninstall](/runtime/reference/cli/uninstall/) - uninstall a dependency or a script - [deno remove](/runtime/reference/cli/remove) - remove dependencies - [deno outdated](/runtime/reference/cli/outdated) - view or update outdated dependencies ## Tooling - [deno bench](/runtime/reference/cli/bench/) - benchmarking tool - [deno check](/runtime/reference/cli/check/) - type check your program without running it - [deno compile](/runtime/reference/cli/compile/) - compile a program into a standalone executable - [deno completions](/runtime/reference/cli/completions/) - generate shell completions - [deno coverage](/runtime/reference/cli/coverage/) - generate test coverage reports - [deno doc](/runtime/reference/cli/doc/) - generate documentation for a module - [deno fmt](/runtime/reference/cli/fmt/) - format your code - [deno info](/runtime/reference/cli/info/) - inspect an ES module and all of its dependencies - [deno init](/runtime/reference/cli/init/) - create a new project - [deno jupyter](/runtime/reference/cli/jupyter/) - run a Jupyter notebook - [deno lint](/runtime/reference/cli/lint/) - lint your code - [deno lsp](/runtime/reference/cli/lsp/) - language server protocol integration - [deno publish](/runtime/reference/cli/publish/) - publish a module to JSR - [deno test](/runtime/reference/cli/test/) - run your tests - [deno types](/runtime/reference/cli/types/) - print runtime types - [deno upgrade](/runtime/reference/cli/upgrade/) - upgrade Deno to the latest version ## Other - [Unstable feature flags](/runtime/reference/cli/unstable_flags/) - [Integrating the Deno LSP](/runtime/reference/lsp_integration/) --- # `deno info`, dependency inspector > Inspect the dependencies of your project URL: https://docs.deno.com/runtime/reference/cli/info ## Example ```shell $ deno info jsr:@std/http@1.0.0-rc.5/file-server
local: /home/lucacasonato/.cache/deno/deps/https/jsr.io/3a0e5ef03d2090c75c81daf771ed9a73009518adfe688c333dc11d8006dc3598 emit: /home/lucacasonato/.cache/deno/gen/https/jsr.io/3a0e5ef03d2090c75c81daf771ed9a73009518adfe688c333dc11d8006dc3598.js type: TypeScript dependencies: 40 unique size: 326.42KB https://jsr.io/@std/http/1.0.0-rc.5/file_server.ts (24.74KB) ├─┬ https://jsr.io/@std/path/1.0.1/posix/join.ts (862B) │ ├── https://jsr.io/@std/path/1.0.1/_common/assert_path.ts (307B) │ └─┬ https://jsr.io/@std/path/1.0.1/posix/normalize.ts (1.31KB) │ ├─┬ https://jsr.io/@std/path/1.0.1/_common/normalize.ts (263B) │ │ └── https://jsr.io/@std/path/1.0.1/_common/assert_path.ts * │ ├─┬ https://jsr.io/@std/path/1.0.1/_common/normalize_string.ts (2.25KB) │ │ └── https://jsr.io/@std/path/1.0.1/_common/constants.ts (1.97KB) │ └─┬ https://jsr.io/@std/path/1.0.1/posix/_util.ts (391B) │ └── https://jsr.io/@std/path/1.0.1/_common/constants.ts * ├── https://jsr.io/@std/path/1.0.1/posix/normalize.ts * ├─┬ https://jsr.io/@std/path/1.0.1/extname.ts (906B) │ ├── https://jsr.io/@std/path/1.0.1/_os.ts (736B) │ ├─┬ https://jsr.io/@std/path/1.0.1/posix/extname.ts (2.28KB) │ │ ├── https://jsr.io/@std/path/1.0.1/_common/constants.ts * │ │ ├── https://jsr.io/@std/path/1.0.1/_common/assert_path.ts * │ │ └── https://jsr.io/@std/path/1.0.1/posix/_util.ts * │ └─┬ https://jsr.io/@std/path/1.0.1/windows/extname.ts (2.5KB) │ ├── https://jsr.io/@std/path/1.0.1/_common/constants.ts * │ ├── https://jsr.io/@std/path/1.0.1/_common/assert_path.ts * │ └─┬ https://jsr.io/@std/path/1.0.1/windows/_util.ts (828B) │ └── https://jsr.io/@std/path/1.0.1/_common/constants.ts * ├─┬ https://jsr.io/@std/path/1.0.1/join.ts (926B) │ ├── https://jsr.io/@std/path/1.0.1/_os.ts * │ ├── https://jsr.io/@std/path/1.0.1/posix/join.ts * │ └─┬ https://jsr.io/@std/path/1.0.1/windows/join.ts (2.41KB) │ ├── https://jsr.io/@std/path/1.0.1/_common/assert_path.ts * │ ├── https://jsr.io/@std/path/1.0.1/windows/_util.ts * │ └─┬ 
https://jsr.io/@std/path/1.0.1/windows/normalize.ts (3.84KB) │ ├── https://jsr.io/@std/path/1.0.1/_common/normalize.ts * │ ├── https://jsr.io/@std/path/1.0.1/_common/constants.ts * │ ├── https://jsr.io/@std/path/1.0.1/_common/normalize_string.ts * │ └── https://jsr.io/@std/path/1.0.1/windows/_util.ts * ├─┬ https://jsr.io/@std/path/1.0.1/relative.ts (1.08KB) │ ├── https://jsr.io/@std/path/1.0.1/_os.ts * │ ├─┬ https://jsr.io/@std/path/1.0.1/posix/relative.ts (3.25KB) │ │ ├── https://jsr.io/@std/path/1.0.1/posix/_util.ts * │ │ ├─┬ https://jsr.io/@std/path/1.0.1/posix/resolve.ts (1.84KB) │ │ │ ├── https://jsr.io/@std/path/1.0.1/_common/normalize_string.ts * │ │ │ ├── https://jsr.io/@std/path/1.0.1/_common/assert_path.ts * │ │ │ └── https://jsr.io/@std/path/1.0.1/posix/_util.ts * │ │ └─┬ https://jsr.io/@std/path/1.0.1/_common/relative.ts (287B) │ │ └── https://jsr.io/@std/path/1.0.1/_common/assert_path.ts * │ └─┬ https://jsr.io/@std/path/1.0.1/windows/relative.ts (4.24KB) │ ├── https://jsr.io/@std/path/1.0.1/_common/constants.ts * │ ├─┬ https://jsr.io/@std/path/1.0.1/windows/resolve.ts (5.02KB) │ │ ├── https://jsr.io/@std/path/1.0.1/_common/constants.ts * │ │ ├── https://jsr.io/@std/path/1.0.1/_common/normalize_string.ts * │ │ ├── https://jsr.io/@std/path/1.0.1/_common/assert_path.ts * │ │ └── https://jsr.io/@std/path/1.0.1/windows/_util.ts * │ └── https://jsr.io/@std/path/1.0.1/_common/relative.ts * ├─┬ https://jsr.io/@std/path/1.0.1/resolve.ts (1.02KB) │ ├── https://jsr.io/@std/path/1.0.1/_os.ts * │ ├── https://jsr.io/@std/path/1.0.1/posix/resolve.ts * │ └── https://jsr.io/@std/path/1.0.1/windows/resolve.ts * ├─┬ https://jsr.io/@std/path/1.0.1/constants.ts (705B) │ └── https://jsr.io/@std/path/1.0.1/_os.ts * ├─┬ https://jsr.io/@std/media-types/1.0.2/content_type.ts (3.09KB) │ ├─┬ https://jsr.io/@std/media-types/1.0.2/parse_media_type.ts (3.54KB) │ │ └── https://jsr.io/@std/media-types/1.0.2/_util.ts (3.18KB) │ ├─┬ https://jsr.io/@std/media-types/1.0.2/get_charset.ts 
(1.45KB) │ │ ├── https://jsr.io/@std/media-types/1.0.2/parse_media_type.ts * │ │ ├── https://jsr.io/@std/media-types/1.0.2/_util.ts * │ │ └─┬ https://jsr.io/@std/media-types/1.0.2/_db.ts (1.34KB) │ │ ├── https://jsr.io/@std/media-types/1.0.2/vendor/db.ts (190.69KB) │ │ └── https://jsr.io/@std/media-types/1.0.2/_util.ts * │ ├─┬ https://jsr.io/@std/media-types/1.0.2/format_media_type.ts (2.45KB) │ │ └── https://jsr.io/@std/media-types/1.0.2/_util.ts * │ ├── https://jsr.io/@std/media-types/1.0.2/_db.ts * │ └─┬ https://jsr.io/@std/media-types/1.0.2/type_by_extension.ts (1.15KB) │ └── https://jsr.io/@std/media-types/1.0.2/_db.ts * ├─┬ https://jsr.io/@std/http/1.0.0-rc.5/etag.ts (6.46KB) │ └─┬ https://jsr.io/@std/encoding/1.0.1/base64.ts (3.18KB) │ └── https://jsr.io/@std/encoding/1.0.1/_validate_binary_like.ts (798B) ├── https://jsr.io/@std/http/1.0.0-rc.5/status.ts (13.39KB) ├── https://jsr.io/@std/streams/1.0.0-rc.4/byte_slice_stream.ts (2.57KB) ├── https://jsr.io/@std/cli/1.0.0/parse_args.ts (21.94KB) ├── https://jsr.io/@std/http/1.0.0-rc.5/deno.json (415B) ├── https://jsr.io/@std/fmt/1.0.0-rc.1/bytes.ts (5.3KB) └── https://jsr.io/@std/net/1.0.0-rc.2/get_network_address.ts (1.68KB) ``` Dependency inspector works with any local or remote ES modules. 
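The `unique` dependency count and total `size` in the header come from deduplicating the module graph: modules marked with `*` in the tree above were already visited and are counted only once. Conceptually the summary works like this sketch (a simplified illustration over a hypothetical graph, not Deno's actual implementation):

```typescript
// Sketch: deduplicate a module graph and total its sizes, as the
// `deno info` summary does conceptually. The graph below is hypothetical.
type Module = { url: string; size: number; deps: string[] };

const graph: Record<string, Module> = {
  "a.ts": { url: "a.ts", size: 100, deps: ["b.ts", "c.ts"] },
  "b.ts": { url: "b.ts", size: 50, deps: ["c.ts"] },
  "c.ts": { url: "c.ts", size: 25, deps: [] },
};

function summarize(root: string): { unique: number; size: number } {
  const seen = new Set<string>();
  const stack = [root];
  while (stack.length) {
    const url = stack.pop()!;
    if (seen.has(url)) continue; // a module reached twice counts once (the "*" entries)
    seen.add(url);
    stack.push(...graph[url].deps);
  }
  let size = 0;
  for (const url of seen) size += graph[url].size;
  return { unique: seen.size, size };
}

console.log(summarize("a.ts")); // c.ts is shared by a.ts and b.ts but counted once
```

Here `c.ts` appears twice in the tree but contributes to the totals only once, which is why the reported size is smaller than the sum of every tree entry.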
## Cache location

`deno info` can be used to display information about the cache location:

```shell
deno info
DENO_DIR location: "/Users/deno/Library/Caches/deno"
Remote modules cache: "/Users/deno/Library/Caches/deno/deps"
TypeScript compiler cache: "/Users/deno/Library/Caches/deno/gen"
```

---

# `deno init`, start a new project

> Scaffold a new Deno project with tests and configuration

URL: https://docs.deno.com/runtime/reference/cli/init

## Examples

```sh
$ deno init
✅ Project initialized
Run these commands to get started

// Run the program
deno run main.ts

// Run the program and watch for file changes
deno task dev

// Run the tests
deno test

$ deno run main.ts
Add 2 + 3 = 5

$ deno test
Check file:///dev/main_test.ts
running 1 test from main_test.ts
addTest ... ok (6ms)

ok | 1 passed | 0 failed (29ms)
```

The `init` subcommand will create two files (`main.ts` and `main_test.ts`). These files provide a basic example of how to write a Deno program and how to write tests for it. The `main.ts` file exports an `add` function that adds two numbers together, and the `main_test.ts` file contains a test for this function.

You can also specify an argument to `deno init` to initialize a project in a specific directory:

```sh
$ deno init my_deno_project
✅ Project initialized
Run these commands to get started

cd my_deno_project

// Run the program
deno run main.ts

// Run the program and watch for file changes
deno task dev

// Run the tests
deno test
```

## Init a JSR package

By running `deno init --lib` Deno will bootstrap a project that is ready to be published on [JSR](https://jsr.io/).

```sh
$ deno init --lib
✅ Project initialized
Run these commands to get started

# Run the tests
deno test

# Run the tests and watch for file changes
deno task dev

# Publish to JSR (dry run)
deno publish --dry-run
```

Inside `deno.json` you'll see that the entries for `name`, `exports` and `version` are prefilled.
```json
{
  "name": "my-lib",
  "version": "0.1.0",
  "exports": "./mod.ts",
  "tasks": {
    "dev": "deno test --watch mod.ts"
  },
  "imports": {
    "@std/assert": "jsr:@std/assert@1"
  }
}
```

## Initialize a web server

Running `deno init --serve` bootstraps a web server that works with [`deno serve`](./serve).

```sh
$ deno init --serve
✅ Project initialized
Run these commands to get started

# Run the server
deno serve -R main.ts

# Run the server and watch for file changes
deno task dev

# Run the tests
deno -R test
```

Your [`deno.json`](/runtime/fundamentals/configuration/) file will look like this:

```json
{
  "tasks": {
    "dev": "deno serve --watch -R main.ts"
  },
  "imports": {
    "@std/assert": "jsr:@std/assert@1",
    "@std/http": "jsr:@std/http@1"
  }
}
```

Now, you can start your web server, which [watches for changes](/runtime/getting_started/command_line_interface/#watch-mode), by running `deno task dev`.

```sh
$ deno task dev
Task dev deno serve --watch -R main.ts
Watcher Process started.
deno serve: Listening on http://0.0.0.0:8000/
```

## Generate a library project

You can append a `--lib` flag to add extra parameters to your `deno.json`, such as `name`, `version`, and `exports` fields.

```sh
$ deno init my_deno_project --lib
✅ Project initialized
```

The resulting `deno.json` will be as follows:

```jsonc
{
  "name": "my_deno_project",
  "version": "0.1.0",
  "exports": "./mod.ts",
  "tasks": {
    "dev": "deno test --watch mod.ts"
  },
  "license": "MIT",
  "imports": {
    "@std/assert": "jsr:@std/assert@1"
  }
}
```

---

# `deno install`

> Install and cache dependencies for your project

URL: https://docs.deno.com/runtime/reference/cli/install

## Examples

### deno install

Use this command to install all dependencies defined in `deno.json` and/or `package.json`. The dependencies will be installed in the global cache, but if your project has a `package.json` file, a local `node_modules` directory will be set up as well.
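Once installed, bare specifiers in your code resolve through the `imports` map in `deno.json`. The resolution rule can be sketched roughly like this (illustrative only — Deno performs this resolution internally, following the import-maps model; the mappings below are hypothetical):

```typescript
// Sketch: resolving a bare specifier through a deno.json-style import map.
type ImportMap = { imports: Record<string, string> };

const denoJson: ImportMap = {
  imports: {
    "@std/fmt": "jsr:@std/fmt@^1.0.0",
    "@std/fmt/": "jsr:@std/fmt@^1.0.0/", // trailing slash = prefix mapping
    "chalk": "npm:chalk@4",
  },
};

function resolve(specifier: string, map: ImportMap): string {
  // Exact matches win; otherwise try prefix entries ending with "/".
  if (specifier in map.imports) return map.imports[specifier];
  for (const [key, value] of Object.entries(map.imports)) {
    if (key.endsWith("/") && specifier.startsWith(key)) {
      return value + specifier.slice(key.length);
    }
  }
  return specifier; // unmapped specifiers are used as-is
}

console.log(resolve("chalk", denoJson)); // "npm:chalk@4"
console.log(resolve("@std/fmt/colors", denoJson)); // "jsr:@std/fmt@^1.0.0/colors"
```

This is why `import chalk from "chalk"` works after `deno install`: the bare name is rewritten to the registry specifier recorded in the configuration file.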
### deno install [PACKAGES]

Use this command to install particular packages and add them to `deno.json` or `package.json`.

```shell
$ deno install jsr:@std/testing npm:express
```

:::tip

You can also use `deno add`, which is an alias to `deno install [PACKAGES]`

:::

If your project has a `package.json` file, the packages coming from npm will be added to `dependencies` in `package.json`. Otherwise all packages will be added to `deno.json`.

### deno install --entrypoint [FILES]

Use this command to install all dependencies that are used in the provided files and their dependencies. This is particularly useful if you use `jsr:`, `npm:`, `http:` or `https:` specifiers in your code and want to cache all the dependencies before deploying your project.

```js title="main.js"
import * as colors from "jsr:@std/fmt/colors";
import express from "npm:express";
```

```shell
$ deno install -e main.js
Download jsr:@std/fmt
Download npm:express
```

:::tip

If you want to set up a local `node_modules` directory, you can pass the `--node-modules-dir=auto` flag. Some dependencies might not work correctly without a local `node_modules` directory.

:::

### deno install --global [PACKAGE_OR_URL]

Use this command to install the provided package or script as a globally available binary on your system.

This command creates a thin, executable shell script which invokes `deno` using the specified CLI flags and main module. It is placed in the installation root.

Example:

```shell
$ deno install --global --allow-net --allow-read jsr:@std/http/file-server
Download jsr:@std/http/file-server...

✅ Successfully installed file-server.
/Users/deno/.deno/bin/file-server
```

To change the executable name, use `-n`/`--name`:

```shell
deno install -g -N -R -n serve jsr:@std/http/file-server
```

The executable name is inferred by default:

- Attempt to take the file stem of the URL path. The above example would become 'file-server'.
- If the file stem is something generic like 'main', 'mod', 'index' or 'cli', and the path has no parent, take the file name of the parent path. Otherwise settle with the generic name.
- If the resulting name has an '@...' suffix, strip it.

To change the installation root, use `--root`:

```shell
deno install -g -N -R --root /usr/local/bin jsr:@std/http/file-server
```

The installation root is determined, in order of precedence:

- `--root` option
- `DENO_INSTALL_ROOT` environment variable
- `$HOME/.deno/bin`

These must be added to the path manually if required.

```shell
echo 'export PATH="$HOME/.deno/bin:$PATH"' >> ~/.bashrc
```

You must specify permissions that will be used to run the script at installation time.

```shell
deno install -g -N -R jsr:@std/http/file-server -- -p 8080
```

The above command creates an executable called `file-server` that runs with network and read permissions and binds to port 8080.

For good practice, use the [`import.meta.main`](/runtime/tutorials/module_metadata/) idiom to specify the entry point in an executable script.

Example:

```ts
// https://example.com/awesome/cli.ts
async function myAwesomeCli(): Promise<void> {
  // -- snip --
}

if (import.meta.main) {
  myAwesomeCli();
}
```

When you create an executable script, make sure to let users know by adding an example installation command to your repository:

```shell
# Install using deno install
$ deno install -n awesome_cli https://example.com/awesome/cli.ts
```

## Native Node.js addons

A lot of popular npm packages like [`npm:sqlite3`](https://www.npmjs.com/package/sqlite3) or [`npm:duckdb`](https://www.npmjs.com/package/duckdb) depend on ["lifecycle scripts"](https://docs.npmjs.com/cli/v10/using-npm/scripts#life-cycle-scripts), e.g. `preinstall` or `postinstall` scripts. Most often running these scripts is required for a package to work correctly. Unlike npm, Deno does not run these scripts by default, as they pose a potential security vulnerability.
You can still run these scripts by passing the `--allow-scripts=<packages>` flag when running `deno install`:

```shell
deno install --allow-scripts=npm:sqlite3
```

_Install all dependencies and allow the `npm:sqlite3` package to run its lifecycle scripts._

## --quiet flag

The `--quiet` flag suppresses diagnostic output when installing dependencies. When used with `deno install`, it will hide progress indicators, download information, and success messages.

```shell
$ deno install --quiet jsr:@std/http/file-server
```

This is useful for scripting environments or when you want cleaner output in CI pipelines.

## Uninstall

You can uninstall dependencies or a binary script with the `deno uninstall` command:

```shell
$ deno uninstall express
Removed express
```

```shell
$ deno uninstall -g file-server
deleted /Users/deno/.deno/bin/file-server
✅ Successfully uninstalled file-server
```

See the [`deno uninstall` page for more details](/runtime/reference/cli/uninstall/).

---

# Jupyter Kernel for Deno

> Write JavaScript and TypeScript in Jupyter notebooks thanks to Deno's built-in Jupyter kernel

URL: https://docs.deno.com/runtime/reference/cli/jupyter

Deno ships with a built-in Jupyter kernel that allows you to write JavaScript and TypeScript; use Web and Deno APIs and import `npm` packages straight in your interactive notebooks.

:::caution `deno jupyter` always runs with `--allow-all`

Currently all code executed in the Jupyter kernel runs with the `--allow-all` flag. This is a temporary limitation and will be addressed in the future.

:::

## Quickstart

Run `deno jupyter --unstable` and follow the instructions. You can run `deno jupyter --unstable --install` to force installation of the kernel. Deno assumes that the `jupyter` command is available in your `PATH`.
After completing the installation process, the Deno kernel will be available in the notebook creation dialog in JupyterLab and the classic notebook:

![Jupyter notebook kernel selection](../images/jupyter_notebook.png)

You can use the Deno Jupyter kernel in any editor that supports Jupyter notebooks.

### VS Code

- Install the [VSCode Jupyter extension](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.jupyter)
- Open or create a notebook file by opening the Command Palette (Ctrl+Shift+P) and selecting "Create: New Jupyter Notebook". This can also be done manually by creating a file with the ".ipynb" file extension.
- On a new or existing notebook, click the kernel picker, select "Jupyter kernels" and then select Deno

![Selecting Deno in VS Code](https://github.com/denoland/deno-docs/assets/836375/32f0ccc3-35f7-47e5-84f4-17c20a5b5732)

### JetBrains IDEs

Jupyter Notebooks are available right out of the box.

## Rich content output

The `Deno.jupyter` namespace provides helper functions for displaying rich content in your notebooks [using MIME types that Jupyter supports](https://docs.jupyter.org/en/latest/reference/mimetype.html).

---

The easiest way to provide a rich output is to return an object that has a `[Symbol.for("Jupyter.display")]` method. This method should return a dictionary mapping a MIME type to a value that should be displayed.

```ts
{
  [Symbol.for("Jupyter.display")]() {
    return {
      // Plain text content
      "text/plain": "Hello world!",
      // HTML output
      "text/html": "<b>Hello world!</b>",
    };
  }
}
```

_Example of an object that returns plain text and HTML output._

:::info

You can also use `Deno.jupyter.$display` instead of typing `Symbol.for("Jupyter.display")`

:::

This is a regular function, so you can use any library you want to format the output, e.g. use `@std/fmt/colors` to provide a colorful output:

```ts
import * as colors from "jsr:@std/fmt/colors";

{
  [Deno.jupyter.$display]() {
    return {
      "text/plain": colors.green("Hello world"),
    };
  }
}
```

You can also use the `Deno.jupyter.display` function to directly display the MIME bundle:

```js
await Deno.jupyter.display({
  "text/plain": "Hello, world!",
  "text/html": "<b>Hello, world!</b>",
  "text/markdown": "# Hello, world!",
}, { raw: true });
```

![`Deno.jupyter.display` API example](../images/jupyter-display.png)

Your notebook frontend will automatically select the "richest" MIME type to display based on its capabilities.

---

`Deno.jupyter` provides several helper methods for rich output of common media types.

`Deno.jupyter.html` is a tagged template that will render the provided string as HTML in the notebook.

```js
Deno.jupyter.html`<h3>Hello, world!</h3>
<p>From Deno kernel</p>
<p>Lorem ipsum <i>dolor</i> <b>sit</b> <u>amet</u></p>`;
```

![`Deno.jupyter.html` API example](../images/jupyter-html.png)

`Deno.jupyter.md` is a tagged template that will render the provided string as a Markdown document in the notebook.

```js
Deno.jupyter.md`# Notebooks in TypeScript via Deno

![Deno logo](https://github.com/denoland.png?size=32)

**Interactive compute with Jupyter _built into Deno_!**`;
```

![`Deno.jupyter.md` API example](../images/jupyter-md.png)

`Deno.jupyter.svg` is a tagged template that will render the provided string as an SVG figure in the notebook.

```js
Deno.jupyter.svg`<svg viewBox="0 0 300 100" xmlns="http://www.w3.org/2000/svg">
  <circle cx="50" cy="50" r="40" stroke="green" stroke-width="4" fill="yellow" />
</svg>`;
```

![`Deno.jupyter.svg` API example](../images/jupyter-svg.png)

`Deno.jupyter.image` is a function that will render a JPG or PNG image. You can pass a file path, or bytes you have already read:

```js
Deno.jupyter.image("./cat.jpg");

const data = Deno.readFileSync("./dog.png");
Deno.jupyter.image(data);
```

## prompt and confirm APIs

You can use the `prompt` and `confirm` Web APIs to wait for user input in your notebook.
_confirm and prompt APIs example_
## IO pub channel broadcasting

`Deno.jupyter.broadcast` allows you to publish messages to the IO pub channel, providing live updates as the cell is evaluated.

Consider this example that prints a message before we start a computation and another when the computation is finished:

```js
await Deno.jupyter.broadcast("display_data", {
  data: { "text/html": "Processing..." },
  metadata: {},
  transient: { display_id: "progress" },
});

// Pretend we're doing an expensive compute
await new Promise((resolve) => setTimeout(resolve, 1500));

await Deno.jupyter.broadcast("update_display_data", {
  data: { "text/html": "Done" },
  metadata: {},
  transient: { display_id: "progress" },
});
```
_Deno.jupyter.broadcast API example_
    ## Examples Here's an example of using `@observablehq/plot` to generate a chart: ```ts import { document, penguins } from "jsr:@ry/jupyter-helper"; import * as Plot from "npm:@observablehq/plot"; let p = await penguins(); Plot.plot({ marks: [ Plot.dot(p.toRecords(), { x: "culmen_depth_mm", y: "culmen_length_mm", fill: "species", }), ], document, }); ``` ![Example plot generated using `@observablehq/plot` library](../images/jupyter-plot.png) See https://github.com/rgbkrk/denotebooks for more advanced examples leveraging data analysis and visualisation libraries like Polars, Observable and d3. ## `jupyter console` integration You can also use Deno Jupyter kernel in the `jupyter console` REPL. To do that, you should launch your console with `jupyter console --kernel deno`. ![Using the Deno kernel in a CLI](../images/jupyter-cli.gif) --- # `deno lint`, linter > Run the Deno linter to check your code for errors and apply automated fixes URL: https://docs.deno.com/runtime/reference/cli/lint ## Available rules For a complete list of supported rules, visit [List of rules](/lint/) documentation page. ## Ignore directives ### File level To ignore a whole file use `// deno-lint-ignore-file` at the top of the file: ```ts // deno-lint-ignore-file function foo(): any { // ... } ``` You can also specify the reason for ignoring the file: ```ts // deno-lint-ignore-file -- reason for ignoring function foo(): any { // ... } ``` The ignore directive must be placed before the first statement or declaration: ```ts // Copyright 2018-2024 the Deno authors. All rights reserved. MIT license. /** * Some JS doc */ // deno-lint-ignore-file import { bar } from "./bar.js"; function foo(): any { // ... } ``` You can also ignore certain diagnostics in the whole file: ```ts // deno-lint-ignore-file no-explicit-any no-empty function foo(): any { // ... 
}
```

If there are multiple `// deno-lint-ignore-file` directives, all but the first one are ignored:

```ts
// This is effective
// deno-lint-ignore-file no-explicit-any no-empty

// But this is NOT effective
// deno-lint-ignore-file no-debugger

function foo(): any {
  debugger; // not ignored!
}
```

### Line level

To ignore specific diagnostics, use `// deno-lint-ignore <rules...>` on the line preceding the offending line.

```ts
// deno-lint-ignore no-explicit-any
function foo(): any {
  // ...
}

// deno-lint-ignore no-explicit-any explicit-function-return-type
function bar(a: any) {
  // ...
}
```

You must specify the names of the rules to be ignored. You can also specify the reason for ignoring the diagnostic:

```ts
// deno-lint-ignore no-explicit-any -- reason for ignoring
function foo(): any {
  // ...
}
```

## Ignore `ban-unused-ignore` itself

`deno lint` provides the [`ban-unused-ignore` rule](/lint/rules/ban-unused-ignore/), which will detect ignore directives that don't ever suppress certain diagnostics. This is useful when you want to discover ignore directives that are no longer necessary after refactoring the code.

In a few cases, however, you might want to ignore the `ban-unused-ignore` rule itself. One of the typical cases would be when working with auto-generated files; it makes sense to add file-level ignore directives for some rules, and there's almost no need for detecting unused directives via `ban-unused-ignore` in this case.

You can use `// deno-lint-ignore-file ban-unused-ignore` as always if you want to suppress the rule for a whole file:

```ts
// deno-lint-ignore-file ban-unused-ignore no-explicit-any

// `no-explicit-any` isn't used but you'll get no diagnostics because of ignoring
// `ban-unused-ignore`
console.log(42);
```

Do note that ignoring `ban-unused-ignore` itself only works via file-level ignore directives. This means that per-line directives, like `// deno-lint-ignore ban-unused-ignore`, don't work at all.
If you want to ignore `ban-unused-ignore` for some special reason, make sure to add it as a file-level ignore directive.

## More about linting and formatting

For more information about linting and formatting in Deno, and the differences between these two utilities, visit the [Linting and Formatting](/runtime/fundamentals/linting_and_formatting/) page in our Fundamentals section.

---

# deno lsp

URL: https://docs.deno.com/runtime/reference/cli/lsp

:::info

Usually humans do not use this subcommand directly. The 'deno lsp' subcommand can provide IDEs with go-to-definition support and automatic code formatting.

:::

Starts the Deno language server. The language server is used by editors to provide features like intellisense, code formatting, and more.

Read more about [integrating with the Deno LSP](/runtime/reference/lsp_integration/).

## Description

The 'deno lsp' subcommand provides a way for code editors and IDEs to interact with Deno using the Language Server Protocol.

Read more about [how to connect editors and IDEs to `deno lsp`](https://deno.land/manual@v1.42.4/getting_started/setup_your_environment#editors-and-ides).

---

# deno outdated

> Check for outdated dependencies in your project and safely update them with an interactive CLI

URL: https://docs.deno.com/runtime/reference/cli/outdated

## Checking for outdated dependencies

The `outdated` subcommand checks for new versions of NPM and JSR dependencies listed in `deno.json` or `package.json` files, and displays dependencies that could be updated. Workspaces are fully supported, including workspaces where some members use `package.json` and others use `deno.json`.

For example, take a project with a `deno.json` file:

```json
{
  "imports": {
    "@std/fmt": "jsr:@std/fmt@^1.0.0",
    "@std/async": "jsr:@std/async@1.0.1",
    "chalk": "npm:chalk@4"
  }
}
```

and a lockfile that has `@std/fmt` at version `1.0.0`.
```bash
$ deno outdated
┌────────────────┬─────────┬────────┬────────┐
│ Package        │ Current │ Update │ Latest │
├────────────────┼─────────┼────────┼────────┤
│ jsr:@std/fmt   │ 1.0.0   │ 1.0.3  │ 1.0.3  │
├────────────────┼─────────┼────────┼────────┤
│ jsr:@std/async │ 1.0.1   │ 1.0.1  │ 1.0.8  │
├────────────────┼─────────┼────────┼────────┤
│ npm:chalk      │ 4.1.2   │ 4.1.2  │ 5.3.0  │
└────────────────┴─────────┴────────┴────────┘
```

The `Update` column lists the newest semver-compatible version, while the `Latest` column lists the latest version. Notice that `jsr:@std/async` is listed, even though there is no semver-compatible version to update to.

If you would prefer to only show packages that have new compatible versions, you can pass the `--compatible` flag.

```bash
$ deno outdated --compatible
┌────────────────┬─────────┬────────┬────────┐
│ Package        │ Current │ Update │ Latest │
├────────────────┼─────────┼────────┼────────┤
│ jsr:@std/fmt   │ 1.0.0   │ 1.0.3  │ 1.0.3  │
└────────────────┴─────────┴────────┴────────┘
```

`jsr:@std/fmt` is still listed, since it could be compatibly updated to `1.0.3`, but `jsr:@std/async` is no longer shown.

## Updating dependencies

The `outdated` subcommand can also update dependencies with the `--update` flag. By default, it will only update dependencies to semver-compatible versions (i.e. it won't update to a breaking version).

```bash
$ deno outdated --update
Updated 1 dependency:
- jsr:@std/fmt 1.0.0 -> 1.0.3
```

To update to the latest versions (regardless of whether it's semver-compatible), pass the `--latest` flag.

```bash
$ deno outdated --update --latest
Updated 3 dependencies:
- jsr:@std/async 1.0.1 -> 1.0.8
- jsr:@std/fmt 1.0.0 -> 1.0.3
- npm:chalk 4.1.2 -> 5.3.0
```

## Selecting packages

The `outdated` subcommand also supports selecting which packages to operate on. This works with or without the `--update` flag.
```bash
$ deno outdated --update --latest chalk
Updated 1 dependency:
- npm:chalk 4.1.2 -> 5.3.0
```

Multiple selectors can be passed, and wildcards (`*`) or exclusions (`!`) are also supported. For instance, to update all packages with the `@std` scope, except for `@std/fmt`:

```bash
$ deno outdated --update --latest "@std/*" "!@std/fmt"
Updated 1 dependency:
- jsr:@std/async 1.0.1 -> 1.0.8
```

Note that if you use wildcards, you will probably need to surround the argument in quotes to prevent the shell from trying to expand them.

### Updating to specific versions

In addition to selecting packages to update, the `--update` flag also supports selecting the new _version_ by specifying it after `@`.

```bash
❯ deno outdated --update chalk@5.2 @std/async@1.0.6
Updated 2 dependencies:
- jsr:@std/async 1.0.1 -> 1.0.6
- npm:chalk 4.1.2 -> 5.2.0
```

## Workspaces

In a workspace setting, by default `outdated` will only operate on the _current_ workspace member. For instance, given a workspace:

```json
{
  "workspace": ["./member-a", "./member-b"]
}
```

Running

```bash
deno outdated
```

from the `./member-a` directory will only check for outdated dependencies listed in `./member-a/deno.json` or `./member-a/package.json`.

To include all workspace members, pass the `--recursive` flag (the `-r` shorthand is also accepted):

```bash
deno outdated --recursive
deno outdated --update --latest -r
```

---

# deno publish

> Publish your package or workspace to the JSR registry

URL: https://docs.deno.com/runtime/reference/cli/publish

## Package Requirements

Your package must have `name`, `version`, and `exports` fields in its `deno.json` or `jsr.json` file.

- The `name` field must be unique and follow the `@<scope>/<name>` convention.
- The `version` field must be a valid semver version.
- The `exports` field must point to the main entry point of the package.

The `exports` field can either be specified as a single string, or as an object mapping entrypoint names to paths in your package.
Example:

```json title="deno.json"
{
  "name": "@scope_name/package_name",
  "version": "1.0.0",
  "exports": "./main.ts"
}
```

Before you publish your package, you must create it in the registry by visiting [JSR - Publish a package](https://jsr.io/new).

## Examples

Publish your current workspace:

```bash
deno publish
```

Publish your current workspace with a specific token, bypassing interactive authentication:

```bash
deno publish --token c00921b1-0d4f-4d18-b8c8-ac98227f9275
```

Publish and check for errors in remote modules:

```bash
deno publish --check=all
```

Perform a dry run to simulate publishing:

```bash
deno publish --dry-run
```

Publish using settings from a specific configuration file:

```bash
deno publish --config custom-config.json
```

---

# deno remove

> Remove a dependency from your project

URL: https://docs.deno.com/runtime/reference/cli/remove

---

# `deno repl`, interactive scripting prompt

> Interact with Deno's runtime in a REPL environment

URL: https://docs.deno.com/runtime/reference/cli/repl

## Special variables

The REPL provides a couple of special variables that are always available:

| Identifier | Description                          |
| ---------- | ------------------------------------ |
| _          | Yields the last evaluated expression |
| _error     | Yields the last thrown error         |

```console
Deno 1.14.3
exit using ctrl+d or close()
> "hello world!"
"hello world!"
> _
"hello world!"
> const foo = "bar";
undefined
> _
undefined
```

## Special functions

The REPL provides several functions in the global scope:

| Function | Description                       |
| -------- | --------------------------------- |
| clear()  | Clears the entire terminal screen |
| close()  | Close the current REPL session    |

## `--eval` flag

The `--eval` flag allows you to run some code in the runtime before you are dropped into the REPL.
This is useful for importing some code you commonly use in the REPL, or modifying the runtime in some way:

```console
$ deno repl --allow-net --eval 'import { assert } from "jsr:@std/assert@1"'
Deno 1.45.3
exit using ctrl+d, ctrl+c, or close()
> assert(true)
undefined
> assert(false)
Uncaught AssertionError
    at assert (https://jsr.io/@std/assert/1.0.0/assert.ts:21:11)
    at <anonymous>:1:22
```

## `--eval-file` flag

The `--eval-file` flag allows you to run code from specified files before you are dropped into the REPL. Like the `--eval` flag, this is useful for importing code you commonly use in the REPL, or modifying the runtime in some way.

Files can be specified as paths or URLs. URL files are cached and can be reloaded via the `--reload` flag.

If `--eval` is also specified, then `--eval-file` files are run before the `--eval` code.

```console
$ deno repl --eval-file=https://docs.deno.com/examples/welcome.ts,https://docs.deno.com/examples/local.ts
Download https://docs.deno.com/examples/welcome.ts
Welcome to Deno!
Download https://docs.deno.com/examples/local.ts
Deno 1.45.3
exit using ctrl+d or close()
> local // this variable is defined locally in local.ts, but not exported
"This is a local variable inside of local.ts"
```

### Relative Import Path Resolution

If `--eval-file` specifies a code file that contains relative imports, then the runtime will try to resolve the imports relative to the current working directory. It will not try to resolve them relative to the code file's location. This can cause "Module not found" errors when `--eval-file` is used with module files:

```console
$ deno repl --eval-file=https://jsr.io/@std/encoding/1.0.0/ascii85.ts
error in --eval-file file https://jsr.io/@std/encoding/1.0.0/ascii85.ts. Uncaught TypeError: Module not found "file:///home/_validate_binary_like.ts".
    at async <anonymous>:2:13
Deno 1.45.3
exit using ctrl+d or close()
>
```

## Tab completions

Tab completions are a crucial feature for quick navigation in the REPL.
After hitting the `tab` key, Deno will show a list of all possible completions. ```console $ deno repl Deno 1.45.3 exit using ctrl+d or close() > Deno.read readTextFile readFile readDirSync readLinkSync readAll read readTextFileSync readFileSync readDir readLink readAllSync readSync ``` ## Keyboard shortcuts | Keystroke | Action | | --------------------- | ------------------------------------------------------------------------------------------------ | | Ctrl-A, Home | Move cursor to the beginning of line | | Ctrl-B, Left | Move cursor one character left | | Ctrl-C | Interrupt and cancel the current edit | | Ctrl-D | If line _is_ empty, signal end of line | | Ctrl-D, Del | If line is _not_ empty, delete character under cursor | | Ctrl-E, End | Move cursor to end of line | | Ctrl-F, Right | Move cursor one character right | | Ctrl-H, Backspace | Delete character before cursor | | Ctrl-I, Tab | Next completion | | Ctrl-J, Ctrl-M, Enter | Finish the line entry | | Ctrl-K | Delete from cursor to end of line | | Ctrl-L | Clear screen | | Ctrl-N, Down | Next match from history | | Ctrl-P, Up | Previous match from history | | Ctrl-R | Reverse Search history (Ctrl-S forward, Ctrl-G cancel) | | Ctrl-T | Transpose previous character with current character | | Ctrl-U | Delete from start of line to cursor | | Ctrl-V | Insert any special character without performing its associated action | | Ctrl-W | Delete word leading up to cursor (using white space as a word boundary) | | Ctrl-X Ctrl-U | Undo | | Ctrl-Y | Paste from Yank buffer (Meta-Y to paste next yank instead) | | Ctrl-Z | Suspend (Unix only) | | Ctrl-_ | Undo | | Meta-0, 1, ..., - | Specify the digit to the argument. `-` starts a negative argument.
| | Meta < | Move to first entry in history | | Meta > | Move to last entry in history | | Meta-B, Alt-Left | Move cursor to previous word | | Meta-Backspace | Kill from the start of the current word, or, if between words, to the start of the previous word | | Meta-C | Capitalize the current word | | Meta-D | Delete forwards one word | | Meta-F, Alt-Right | Move cursor to next word | | Meta-L | Lower-case the next word | | Meta-T | Transpose words | | Meta-U | Upper-case the next word | | Meta-Y | See Ctrl-Y | | Ctrl-S | Insert a new line | ## `DENO_REPL_HISTORY` By default, Deno stores REPL history in a `deno_history.txt` file within the `DENO_DIR` directory. The location of your `DENO_DIR` directory and other resources can be found by running `deno info`. You can use the `DENO_REPL_HISTORY` environment variable to control where Deno stores the REPL history file. If you set it to an empty value, Deno will not store the history file. --- # `deno run`, run a file > Run a JavaScript or TypeScript program from a file or URL with Deno's runtime URL: https://docs.deno.com/runtime/reference/cli/run ## Usage To run [this file](https://docs.deno.com/examples/scripts/hello_world.ts) use: ```console deno run https://docs.deno.com/examples/scripts/hello_world.ts ``` You can also run files locally. Ensure that you are in the correct directory and use: ```console deno run hello-world.ts ``` By default, Deno runs programs in a sandbox without access to disk, network, or the ability to spawn subprocesses. This is because the Deno runtime is [secure by default](/runtime/fundamentals/security/). You can grant or deny required permissions using the [`--allow-*` and `--deny-*` flags](/runtime/fundamentals/security/#permissions-list).
### Permissions examples Grant permission to read from disk and listen to network: ```console deno run --allow-read --allow-net server.ts ``` Grant permission to read allow-listed files from disk: ```console deno run --allow-read=/etc server.ts ``` Grant all permissions (_this is not recommended and should only be used for testing_): ```console deno run -A server.ts ``` If your project requires multiple security flags, you should consider using a [`deno task`](/runtime/reference/cli/task/) to execute them. ## Watch To watch for file changes and restart the process automatically, use the `--watch` flag. Deno's built-in application watcher will restart your application as soon as files are changed. _Be sure to put the flag before the file name_, e.g.: ```console deno run --allow-net --watch server.ts ``` Deno's watcher will notify you of changes in the console, and will warn in the console if there are errors while you work. ## Running a package.json script `package.json` scripts can be executed with the [`deno task`](/runtime/reference/cli/task/) command. ## Running code from stdin You can pipe code from stdin and run it immediately with: ```console curl https://docs.deno.com/examples/scripts/hello_world.ts | deno run - ``` ## Terminate run To stop the run command, use `ctrl + c`. --- # deno serve > A flexible and configurable HTTP server for Deno URL: https://docs.deno.com/runtime/reference/cli/serve ## Example Here's an example of how you can create a simple HTTP server with declarative fetch: ```typescript title="server.ts" export default { async fetch(_req) { return new Response("Hello world!"); }, } satisfies Deno.ServeDefaultExport; ``` The `satisfies Deno.ServeDefaultExport` type assertion ensures that your exported object conforms to the expected interface for Deno's HTTP server. This provides type safety and better editor autocomplete while allowing you to maintain the inferred types of your implementation.
You can then run the server using the `deno serve` command: ```bash deno serve server.ts ``` The logic inside the `fetch` function can be customized to handle different types of requests and serve content accordingly: ```typescript title="server.ts" export default { async fetch(request) { if (request.url.endsWith("/json")) { return Response.json({ hello: "world" }); } return new Response("Hello world!"); }, } satisfies Deno.ServeDefaultExport; ``` --- # `deno task` > A configurable task runner for Deno URL: https://docs.deno.com/runtime/reference/cli/task ## Description `deno task` provides a cross-platform way to define and execute custom commands specific to a codebase. To get started, define your commands in your codebase's [Deno configuration file](/runtime/fundamentals/configuration/) under a `"tasks"` key. For example: ```jsonc { "tasks": { "data": "deno task collect && deno task analyze", "collect": "deno run --allow-read=. --allow-write=. scripts/collect.js", "analyze": { "description": "Run analysis script", "command": "deno run --allow-read=. scripts/analyze.js" } } } ``` ## Specifying the current working directory By default, `deno task` executes commands with the directory of the Deno configuration file (ex. _deno.json_) as the current working directory. This allows tasks to use relative paths and continue to work regardless of where in the directory tree you happen to run `deno task` from. In some scenarios this may not be desired, and this behavior can be overridden with the `INIT_CWD` environment variable. `INIT_CWD` will be set to the full path of the directory the task was run from, if not already set. This aligns with the same behavior as `npm run`. For example, the following task changes its working directory to the directory the user ran the task from, and then prints that directory (remember, this works on Windows too, because `deno task` is cross-platform).
```json { "tasks": { "my_task": "cd $INIT_CWD && pwd" } } ``` ## Getting directory `deno task` was run from Since tasks are run using the directory of the Deno configuration file as the current working directory, it may be useful to know the directory the `deno task` was executed from instead. This is possible by using the `INIT_CWD` environment variable in a task or script launched from `deno task` (works the same way as in `npm run`, but in a cross-platform way). For example, to provide this directory to a script in a task, do the following (note the directory is surrounded in double quotes to keep it as a single argument in case it contains spaces): ```json { "tasks": { "start": "deno run main.ts \"$INIT_CWD\"" } } ``` ## Wildcard matching of tasks The `deno task` command can run multiple tasks in parallel by passing a wildcard pattern. A wildcard pattern is specified with the `*` character. ```json title="deno.json" { "tasks": { "build-client": "deno run -RW client/build.ts", "build-server": "deno run -RW server/build.ts" } } ``` Running `deno task "build-*"` will run both `build-client` and `build-server` tasks. :::note **When using a wildcard** make sure to quote the task name (eg. `"build-*"`), otherwise your shell might try to expand the wildcard character, leading to surprising errors. ::: ## Task dependencies You can specify dependencies for a task: ```json title="deno.json" { "tasks": { "build": "deno run -RW build.ts", "generate": "deno run -RW generate.ts", "serve": { "command": "deno run -RN server.ts", "dependencies": ["build", "generate"] } } } ``` In the above example, running `deno task serve` will first execute `build` and `generate` tasks in parallel, and once both of them finish successfully the `serve` task will be executed: ```bash $ deno task serve Task build deno run -RW build.ts Task generate deno run -RW generate.ts Generating data... Starting the build... 
Build finished Data generated Task serve deno run -RN server.ts Listening on http://localhost:8000/ ``` Dependency tasks are executed in parallel, with the default parallel limit being equal to the number of cores on your machine. To change this limit, use the `DENO_JOBS` environment variable. Dependencies are tracked, and if multiple tasks depend on the same task, that task will only be run once: ```jsonc title="deno.json" { // a // / \ // b c // \ / // d "tasks": { "a": { "command": "deno run a.js", "dependencies": ["b", "c"] }, "b": { "command": "deno run b.js", "dependencies": ["d"] }, "c": { "command": "deno run c.js", "dependencies": ["d"] }, "d": "deno run d.js" } } ``` ```bash $ deno task a Task d deno run d.js Running d Task c deno run c.js Running c Task b deno run b.js Running b Task a deno run a.js Running a ``` If a cycle between dependencies is discovered, an error will be returned: ```jsonc title="deno.json" { "tasks": { "a": { "command": "deno run a.js", "dependencies": ["b"] }, "b": { "command": "deno run b.js", "dependencies": ["a"] } } } ``` ```bash $ deno task a Task cycle detected: a -> b -> a ``` You can also specify a task that has `dependencies` but no `command`. This is useful to logically group several tasks together: ```json title="deno.json" { "tasks": { "dev-client": "deno run --watch client/mod.ts", "dev-server": "deno run --watch server/mod.ts", "dev": { "dependencies": ["dev-client", "dev-server"] } } } ``` Running `deno task dev` will run both `dev-client` and `dev-server` in parallel. ## Node and npx binary support By default, `deno task` will execute commands with the `deno` binary. If you need to ensure that a command is run with the `npm` or `npx` binary, you can do so by invoking the `npm` or `npx` `run` command respectively. For example: ```json { "tasks": { "test:node": "npm run test" } } ``` ## Workspace support `deno task` can be used in workspaces to run tasks from multiple member directories in parallel.
To execute `dev` tasks from all workspace members, use the `--recursive` flag: ```jsonc title="deno.json" { "workspace": [ "client", "server" ] } ``` ```jsonc title="client/deno.json" { "name": "@scope/client", "tasks": { "dev": "deno run -RN build.ts" } } ``` ```jsonc title="server/deno.json" { "name": "@scope/server", "tasks": { "dev": "deno run -RN server.ts" } } ``` ```bash $ deno task --recursive dev Task dev deno run -RN build.ts Task dev deno run -RN server.ts Bundling project... Listening on http://localhost:8000/ Project bundled ``` Tasks to run can be filtered based on the workspace members: ```bash $ deno task --filter "client" dev Task dev deno run -RN build.ts Bundling project... Project bundled ``` Note that the filter matches against the workspace member names as specified in the `name` field of each member's `deno.json` file. ## Syntax `deno task` uses a cross-platform shell that's a subset of sh/bash to execute defined tasks. ### Boolean lists Boolean lists provide a way to execute additional commands based on the exit code of the initial command. They separate commands using the `&&` and `||` operators. The `&&` operator provides a way to execute a command and if it _succeeds_ (has an exit code of `0`) it will execute the next command: ```sh deno run --allow-read=. --allow-write=. collect.ts && deno run --allow-read=. analyze.ts ``` The `||` operator is the opposite. It provides a way to execute a command and only if it _fails_ (has a non-zero exit code) it will execute the next command: ```sh deno run --allow-read=. --allow-write=. collect.ts || deno run play_sad_music.ts ``` ### Sequential lists Sequential lists are similar to boolean lists, but execute regardless of whether the previous command in the list passed or failed. Commands are separated with a semi-colon (`;`). ```sh deno run output_data.ts ; deno run --allow-net server.ts ``` ### Async commands Async commands provide a way to make a command execute asynchronously.
This can be useful when starting multiple processes. To make a command asynchronous, add an `&` to the end of it. For example the following would execute `sleep 1 && deno run --allow-net server.ts` and `deno run --allow-net client.ts` at the same time: ```sh sleep 1 && deno run --allow-net server.ts & deno run --allow-net client.ts ``` Unlike in most shells, the first async command to fail will cause all the other commands to fail immediately. In the example above, this would mean that if the server command fails then the client command will also fail and exit. You can opt out of this behavior by adding `|| true` to the end of a command, which will force a `0` exit code. For example: ```sh deno run --allow-net server.ts || true & deno run --allow-net client.ts || true ``` ### Environment variables Environment variables are defined like the following: ```sh export VAR_NAME=value ``` Here's an example of using one in a task with shell variable substitution and then with it being exported as part of the environment of the spawned Deno process (note that in the JSON configuration file the double quotes would need to be escaped with backslashes): ```sh export VAR=hello && echo $VAR && deno eval "console.log('Deno: ' + Deno.env.get('VAR'))" ``` Would output: ```console hello Deno: hello ``` #### Setting environment variables for a command To specify environment variable(s) before a command, list them like so: ```console VAR=hello VAR2=bye deno run main.ts ``` This will use those environment variables specifically for the following command. ### Shell variables Shell variables are similar to environment variables, but won't be exported to spawned commands. 
They are defined with the following syntax: ```sh VAR_NAME=value ``` If we use a shell variable instead of an environment variable in a similar example to what's shown in the previous "Environment variables" section: ```sh VAR=hello && echo $VAR && deno eval "console.log('Deno: ' + Deno.env.get('VAR'))" ``` We will get the following output: ```console hello Deno: undefined ``` Shell variables can be useful when we want to re-use a value, but don't want it available in any spawned processes. ### Exit status variable The exit code of the previously run command is available in the `$?` variable. ```sh # outputs 10 deno eval 'Deno.exit(10)' || echo $? ``` ### Pipelines Pipelines provide a way to pipe the output of one command to another. The following command pipes the stdout output "Hello" to the stdin of the spawned Deno process: ```sh echo Hello | deno run main.ts ``` To pipe stdout and stderr, use `|&` instead: ```sh deno eval 'console.log(1); console.error(2);' |& deno run main.ts ``` ### Command substitution The `$(command)` syntax provides a way to use the output of a command in other commands that get executed. For example, to provide the output of getting the latest git revision to another command you could do the following: ```sh deno run main.ts $(git rev-parse HEAD) ``` Another example using a shell variable: ```sh REV=$(git rev-parse HEAD) && deno run main.ts $REV && echo $REV ``` ### Negate exit code To negate the exit code, add an exclamation point and space before a command: ```sh # change the exit code from 1 to 0 ! deno eval 'Deno.exit(1);' ``` ### Redirects Redirects provide a way to pipe stdout and/or stderr to a file. 
For example, the following redirects _stdout_ of `deno run main.ts` to a file called `file.txt` on the file system: ```sh deno run main.ts > file.txt ``` To instead redirect _stderr_, use `2>`: ```sh deno run main.ts 2> file.txt ``` To redirect both stdout _and_ stderr, use `&>`: ```sh deno run main.ts &> file.txt ``` To append to a file, instead of overwriting an existing one, use two right angle brackets instead of one: ```sh deno run main.ts >> file.txt ``` Suppressing either stdout, stderr, or both of a command is possible by redirecting to `/dev/null`. This works in a cross-platform way, including on Windows. ```sh # suppress stdout deno run main.ts > /dev/null # suppress stderr deno run main.ts 2> /dev/null # suppress both stdout and stderr deno run main.ts &> /dev/null ``` Or redirecting stdout to stderr and vice-versa: ```sh # redirect stdout to stderr deno run main.ts >&2 # redirect stderr to stdout deno run main.ts 2>&1 ``` Input redirects are also supported: ```sh # redirect file.txt to the stdin of gzip gzip < file.txt ``` Note that multiple redirects are currently not supported. ### Cross-platform shebang Starting in Deno 1.42, `deno task` will execute scripts that start with `#!/usr/bin/env -S` the same way on all platforms. For example: ```ts title="script.ts" #!/usr/bin/env -S deno run console.log("Hello there!"); ``` ```json title="deno.json" { "tasks": { "hi": "./script.ts" } } ``` Then on a Windows machine: ```sh > pwd C:\Users\david\dev\my_project > deno task hi Hello there! ``` ### Glob expansion Glob expansion is supported in Deno 1.34 and above. This allows for specifying globs to match files in a cross-platform way. ```console # match .ts files in the current and descendant directories echo **/*.ts # match .ts files in the current directory echo *.ts # match files that start with "data", have a single number, then end with .csv echo data[0-9].csv ``` The supported glob characters are `*`, `?`, and `[`/`]`.
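The bracket-class pattern behaves the way it does in a POSIX shell, so you can try the `data[0-9].csv` pattern from the listing above in plain `sh`; the scratch directory and file names below are made up purely for the demo:

```shell
# create a scratch directory with some sample files
demo=$(mktemp -d)
cd "$demo"
touch data1.csv data7.csv notes.txt
# the shell expands the bracket pattern before echo runs,
# matching only the files with a single digit before .csv
echo data[0-9].csv
# prints: data1.csv data7.csv
```

The same expansion happens inside a `deno task` command string on every platform, which is the point of the cross-platform glob support described above.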
## Built-in commands `deno task` ships with several built-in commands that work the same out of the box on Windows, Mac, and Linux. - [`cp`](https://man7.org/linux/man-pages/man1/cp.1.html) - Copies files. - [`mv`](https://man7.org/linux/man-pages/man1/mv.1.html) - Moves files. - [`rm`](https://man7.org/linux/man-pages/man1/rm.1.html) - Remove files or directories. - Ex: `rm -rf [FILE]...` - Commonly used to recursively delete files or directories. - [`mkdir`](https://man7.org/linux/man-pages/man1/mkdir.1.html) - Makes directories. - Ex. `mkdir -p DIRECTORY...` - Commonly used to make a directory and all its parents with no error if it exists. - [`pwd`](https://man7.org/linux/man-pages/man1/pwd.1.html) - Prints the name of the current/working directory. - [`sleep`](https://man7.org/linux/man-pages/man1/sleep.1.html) - Delays for a specified amount of time. - Ex. `sleep 1` to sleep for 1 second, `sleep 0.5` to sleep for half a second, or `sleep 1m` to sleep a minute - [`echo`](https://man7.org/linux/man-pages/man1/echo.1.html) - Displays a line of text. - [`cat`](https://man7.org/linux/man-pages/man1/cat.1.html) - Concatenates files and outputs them on stdout. When no arguments are provided it reads and outputs stdin. - [`exit`](https://man7.org/linux/man-pages/man1/exit.1p.html) - Causes the shell to exit. - [`head`](https://man7.org/linux/man-pages/man1/head.1.html) - Output the first part of a file. - [`unset`](https://man7.org/linux/man-pages/man1/unset.1p.html) - Unsets environment variables. - [`xargs`](https://man7.org/linux/man-pages/man1/xargs.1p.html) - Builds arguments from stdin and executes a command. If you find a useful flag missing on a command or have any suggestions for additional commands that should be supported out of the box, then please [open an issue](https://github.com/denoland/deno_task_shell/issues) on the [deno_task_shell](https://github.com/denoland/deno_task_shell/) repo. 
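To illustrate how these built-ins compose into cross-platform tasks, here is a sketch; the task names and the `dist` and `static` directories are hypothetical, and the recursive `-r` flag for `cp` is assumed to follow its coreutils counterpart:

```jsonc title="deno.json"
{
  "tasks": {
    // recreate the output directory from scratch
    "clean": "rm -rf dist && mkdir -p dist",
    // clean, copy static assets, then report; the same
    // command string works on Windows, Mac, and Linux
    "assets": "deno task clean && cp -r static dist && echo Assets copied"
  }
}
```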
Note that if you wish to execute any of these commands in a non-cross-platform way on Mac or Linux, then you may do so by running it through `sh`: `sh -c <command>` (ex. `sh -c "cp source destination"`). ## package.json support `deno task` falls back to reading from the `"scripts"` entries in a package.json file if it is discovered. Note that Deno does not respect or support any npm life cycle events like `preinstall` or `postinstall`; you must explicitly run the script entries you want to run (ex. `deno install --entrypoint main.ts && deno task postinstall`). --- # deno test > Run tests for your project with Deno's built-in test runner URL: https://docs.deno.com/runtime/reference/cli/test ## Additional information It can be executed in watch mode (`--watch`), supports parallel execution (`--parallel`), and can be configured to run tests in a random order (`--shuffle`). Additionally, there is built-in support for code coverage (`--coverage`) and leak detection (`--trace-leaks`). ## Examples Run tests ```bash deno test ``` Run tests in specific files ```bash deno test src/fetch_test.ts src/signal_test.ts ``` Run tests where glob matches ```bash deno test src/*.test.ts ``` Run tests and skip type-checking ```bash deno test --no-check ``` Run tests, re-running on file change ```bash deno test --watch ``` --- # deno types > Generate TypeScript types from your code URL: https://docs.deno.com/runtime/reference/cli/types --- # deno uninstall > Remove a dependency from your project or from your global cache URL: https://docs.deno.com/runtime/reference/cli/uninstall ## `deno uninstall [PACKAGES]` Remove dependencies specified in `deno.json` or `package.json`: ```shell $ deno add npm:express Add npm:express@5.0.0 $ cat deno.json { "imports": { "express": "npm:express@5.0.0" } } ``` ```shell $ deno uninstall express Removed express $ cat deno.json { "imports": {} } ``` :::tip You can also use `deno remove` which is an alias to `deno uninstall [PACKAGES]` ::: You can remove multiple
dependencies at once: ```shell $ deno add npm:express jsr:@std/http Added npm:express@5.0.0 Added jsr:@std/http@1.0.7 $ cat deno.json { "imports": { "@std/http": "jsr:@std/http@^1.0.7", "express": "npm:express@^5.0.0" } } ``` ```shell $ deno remove express @std/http Removed express Removed @std/http $ cat deno.json { "imports": {} } ``` :::info While dependencies are removed from the `deno.json` and `package.json`, they still persist in the global cache for future use. ::: If your project contains `package.json`, `deno uninstall` can work with it too: ```shell $ cat package.json { "dependencies": { "express": "^5.0.0" } } $ deno remove express Removed express $ cat package.json { "dependencies": {} } ``` ## `deno uninstall --global [SCRIPT_NAME]` Uninstall `serve` ```bash deno uninstall --global serve ``` Uninstall `serve` from a specific installation root ```bash deno uninstall -g --root /usr/local/bin serve ``` --- # Unstable feature flags URL: https://docs.deno.com/runtime/reference/cli/unstable_flags New features of the Deno runtime are often released behind feature flags, so users can try out new APIs and features before they are finalized. Current unstable feature flags are listed on this page, and can also be found in the CLI help text by running: ```sh deno --help ``` ## Using flags at the command line You can enable a feature flag when you run a Deno program from the command line by passing in the flag as an option to the CLI. Here's an example of running a program with the `--unstable-node-globals` flag enabled: ```sh deno run --unstable-node-globals main.ts ``` ## Configuring flags in `deno.json` You can specify which unstable features you'd like to enable for your project using a [configuration option in `deno.json`](/runtime/fundamentals/configuration/). ```json title="deno.json" { "unstable": ["bare-node-builtins", "webgpu"] } ``` The possible values in the `unstable` array are the flag names with the `--unstable-` prefix removed.
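For example, to enable the features gated by `--unstable-kv` and `--unstable-cron` (both covered later on this page), the array entries drop that prefix:

```json title="deno.json"
{
  "unstable": ["kv", "cron"]
}
```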
## Configuration via environment variables Some flags can be enabled by setting a value (any value) for an environment variable of a given name, rather than being passed as a flag or `deno.json` configuration option. Flags that are settable via environment variables will be noted below. Here's an example of setting the `--unstable-bare-node-builtins` flag via environment variable: ```sh export DENO_UNSTABLE_BARE_NODE_BUILTINS=true ``` ## `--unstable-bare-node-builtins` **Environment variable:** `DENO_UNSTABLE_BARE_NODE_BUILTINS` This flag enables you to [import Node.js built-in modules](/runtime/fundamentals/node/#node-built-in-modules) without a `node:` specifier, as in the example below. You can also use this flag to enable npm packages without an `npm:` specifier if you are manually managing your Node.js dependencies ([see `byonm` flag](#--unstable-byonm)). ```ts title="example.ts" import { readFileSync } from "fs"; console.log(readFileSync("deno.json", { encoding: "utf8" })); ``` ## `--unstable-detect-cjs` **Environment variable:** `DENO_UNSTABLE_DETECT_CJS` Loads `.js`, `.jsx`, `.ts`, and `.tsx` modules as possibly being CommonJS in the following additional scenarios: 1. The _package.json_ has no `"type"` field. 1. No _package.json_ exists. By default, Deno only loads these modules as being possibly CommonJS when you're in a project with a _package.json_ and the closest _package.json_ has `{ "type": "commonjs" }`. Requires Deno >= 2.1.2 ## `--unstable-node-globals` This flag injects Node-specific globals into the global scope. The injected globals are: - `Buffer` - `global` - `setImmediate` - `clearImmediate` Note that `process` is already available as a global starting with Deno 2.0. Requires Deno >= 2.1.0 ## `--unstable-sloppy-imports` **Environment variable:** `DENO_UNSTABLE_SLOPPY_IMPORTS` This flag enables behavior which will infer file extensions from imports that do not include them.
Normally, the import statement below would produce an error: ```ts title="foo.ts" import { Example } from "./bar"; console.log(Example); ``` ```ts title="bar.ts" export const Example = "Example"; ``` Executing the script with sloppy imports enabled will remove the error, but provide guidance that a more performant syntax should be used. Sloppy imports will allow (but print warnings for) the following: - Omit file extensions from imports - Use incorrect file extensions (e.g. importing with a `.js` extension when the actual file is `.ts`) - Import a directory path, and automatically use `index.js` or `index.ts` as the import for that directory [`deno compile`](/runtime/reference/cli/compile/) does not support sloppy imports. ## `--unstable-unsafe-proto` Deno made a conscious decision to not support `Object.prototype.__proto__` for security reasons. However there are still many npm packages that rely on this property to work correctly. This flag enables this property. Note that it is not recommended to use this, but if you really need to use a package that relies on it, the escape hatch is now available to you. ## `--unstable-webgpu` Enable the [`WebGPU` API](https://developer.mozilla.org/en-US/docs/Web/API/WebGPU_API) in the global scope, as in the browser. Below is a simple example to get basic information about the GPU using this API: ```ts // Try to get an adapter from the user agent. const adapter = await navigator.gpu.requestAdapter(); if (adapter) { // Print out some basic details about the adapter. const adapterInfo = await adapter.requestAdapterInfo(); // On some systems this will be blank... console.log(`Found adapter: ${adapterInfo.device}`); // Print GPU feature list const features = [...adapter.features.values()]; console.log(`Supported features: ${features.join(", ")}`); } else { console.error("No adapter found"); } ``` Check out [this repository](https://github.com/denoland/webgpu-examples) for more examples using WebGPU. 
## `--unstable-broadcast-channel` Enabling this flag makes the [`BroadcastChannel`](https://developer.mozilla.org/en-US/docs/Web/API/BroadcastChannel) web API available for use in the global scope, as in the browser. ## `--unstable-worker-options` Enable unstable [Web Worker](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers) API options. Specifically, it enables you to specify permissions available to workers: ```ts new Worker(`data:application/javascript;base64,${btoa(`postMessage("ok");`)}`, { type: "module", deno: { permissions: { read: true, }, }, }).onmessage = ({ data }) => { console.log(data); }; ``` ## `--unstable-cron` Enabling this flag makes the [`Deno.cron`](/deploy/kv/manual/cron) API available on the `Deno` namespace. ## `--unstable-kv` Enabling this flag makes [Deno KV](/deploy/kv/manual) APIs available in the `Deno` namespace. ## `--unstable-net` Enable unstable net APIs. These APIs include: - [`WebSocketStream`](https://developer.mozilla.org/en-US/docs/Web/API/WebSocketStream) - [`Deno.DatagramConn`](https://docs.deno.com/api/deno/~/Deno.DatagramConn) ## `--unstable-otel` Enable the [OpenTelemetry integration for Deno](/runtime/fundamentals/open_telemetry). ## `--unstable` :::caution --unstable is deprecated - use granular flags instead The `--unstable` flag is no longer being used for new features, and will be removed in a future release. All unstable features that were available using this flag are now available as granular unstable flags, notably: - `--unstable-kv` - `--unstable-cron` Please use these feature flags instead moving forward. ::: Before more recent Deno versions (1.38+), unstable APIs were made available all at once using the `--unstable` flag. Notably, [Deno KV](/deploy/kv/manual) and other cloud primitive APIs are available behind this flag. 
To run a program with access to these unstable features, you would run your script with: ```sh deno run --unstable your_script.ts ``` It is recommended that you use the granular unstable flags instead; the `--unstable` flag is now deprecated and will be removed in Deno 2. --- # deno upgrade > Upgrade Deno to the latest, or any specific version URL: https://docs.deno.com/runtime/reference/cli/upgrade ## Examples ### Upgrade to the latest version Use this command without any options to upgrade Deno to the latest available version: ```shell $ deno upgrade Checking for latest version Version has been found Deno is upgrading to version 1.38.5 downloading https://github.com/denoland/deno/releases/download/v1.38.5/deno-x86_64-apple-darwin.zip downloading 100% Upgrade done successfully ``` ### Upgrade to a specific version You can specify a particular version to upgrade to: ```shell $ deno upgrade --version 1.37.0 Checking for version 1.37.0 Version has been found Deno is upgrading to version 1.37.0 downloading https://github.com/denoland/deno/releases/download/v1.37.0/deno-x86_64-apple-darwin.zip downloading 100% Upgrade done successfully ``` ### Check available upgrade without installing Use the `--dry-run` flag to see what would be upgraded without actually performing the upgrade: ```shell $ deno upgrade --dry-run Checking for latest version Version has been found Would upgrade to version 1.38.5 ``` ## --quiet flag The `--quiet` flag suppresses diagnostic output during the upgrade process. When used with `deno upgrade`, it will hide progress indicators, download information, and success messages. ```shell $ deno upgrade --quiet ``` This is useful for scripting environments or when you want cleaner output in CI pipelines. ## Canary build By default, Deno will upgrade from the official GitHub releases.
You can specify the `--canary` build flag for the latest canary build: ```shell # Upgrade to the latest canary build $ deno upgrade --canary ``` --- # Continuous integration > Guide to setting up continuous integration (CI) pipelines for Deno projects. Learn how to configure GitHub Actions workflows, run tests and linting in CI, handle cross-platform builds, and optimize pipeline performance with caching. URL: https://docs.deno.com/runtime/reference/continuous_integration Deno's built-in tools make it easy to set up Continuous Integration (CI) pipelines for your projects. [Testing](/runtime/fundamentals/testing), [linting and formatting](/runtime/fundamentals/linting_and_formatting/) your code can all be done with the corresponding commands `deno test`, `deno lint` and `deno fmt`. In addition, you can generate code coverage reports from test results with `deno coverage` in pipelines. ## Setting up a basic pipeline You can set up basic pipelines for Deno projects in GitHub Actions. The concepts explained on this page largely apply to other CI providers as well, such as Azure Pipelines, CircleCI or GitLab. Building a pipeline for Deno generally starts with checking out the repository and installing Deno: ```yaml name: Build on: push jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - uses: denoland/setup-deno@v2 with: deno-version: v2.x # Run with latest stable Deno. ``` To expand the workflow, add any of the `deno` subcommands that you might need: ```yaml # Check if the code is formatted according to Deno's default # formatting conventions. - run: deno fmt --check # Scan the code for syntax errors and style issues. If # you want to use a custom linter configuration you can add a configuration file with --config - run: deno lint # Run all test files in the repository and collect code coverage. The example # runs with all permissions, but it is recommended to run with the minimal permissions your program needs (for example --allow-read). 
      - run: deno test --allow-all --coverage=cov/

      # This generates a report from the coverage collected by `deno test --coverage`. It is
      # stored as a .lcov file which integrates well with services such as Codecov, Coveralls and Travis CI.
      - run: deno coverage --lcov cov/ > cov.lcov
```

## Cross-platform workflows

As a Deno module maintainer, you probably want to know that your code works on all of the major operating systems in use today: Linux, macOS and Windows. A cross-platform workflow can be achieved by running a matrix of parallel jobs, each one running the build on a different OS:

```yaml
jobs:
  build:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    steps:
      - run: deno test --allow-all --coverage=cov/
```

:::caution Note: GitHub Actions has a known [issue](https://github.com/actions/checkout/issues/135) with handling Windows-style line endings (CRLF). This may cause issues when running `deno fmt` in a pipeline with jobs that run on `windows`. To prevent this, configure the Actions runner to use Linux-style line endings before running the `actions/checkout@v4` step:

```sh
git config --system core.autocrlf false
git config --system core.eol lf
```

:::

If you are working with experimental or unstable Deno APIs, you can include a matrix job running the canary version of Deno. This can help to spot breaking changes early on:

```yaml
jobs:
  build:
    runs-on: ${{ matrix.os }}
    continue-on-error: ${{ matrix.canary }} # Continue in case the canary run does not succeed
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
        deno-version: [v1.x]
        canary: [false]
        include:
          - deno-version: canary
            os: ubuntu-latest
            canary: true
```

## Speeding up Deno pipelines

### Reducing repetition

In cross-platform runs, certain steps of a pipeline do not necessarily need to run for each OS. For example, generating the same test coverage report on Linux, macOS and Windows is a bit redundant.
You can use the `if` conditional keyword of GitHub Actions in these cases. The example below shows how to run code coverage generation and upload steps only on the `ubuntu` (Linux) runner:

```yaml
- name: Generate coverage report
  if: matrix.os == 'ubuntu-latest'
  run: deno coverage --lcov cov/ > cov.lcov

- name: Upload coverage to Coveralls.io
  if: matrix.os == 'ubuntu-latest'
  # Any code coverage service can be used, Coveralls.io is used here as an example.
  uses: coverallsapp/github-action@master
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }} # Generated by GitHub.
    path-to-lcov: cov.lcov
```

### Caching dependencies

As a project grows in size, more and more dependencies tend to be included. Deno will download these dependencies during testing, and if a workflow is run many times a day, this can become a time-consuming process. A common solution to speed things up is to cache dependencies so that they do not need to be downloaded anew.

Deno stores dependencies locally in a cache directory. In a pipeline, the cache can be preserved between workflows by turning on the `cache: true` option on `denoland/setup-deno`:

```yaml
steps:
  - uses: actions/checkout@v4
  - uses: denoland/setup-deno@v2
    with:
      cache: true
```

At first, when this workflow runs, the cache is still empty and commands like `deno test` will still have to download dependencies, but when the job succeeds the cached dependencies are saved and any subsequent runs can restore them from the cache instead of re-downloading.

To demonstrate, let's say you have a project that uses the logger from [`@std/log`](https://jsr.io/@std/log):

```json title="deno.json"
{
  "imports": {
    "@std/log": "jsr:@std/log@0.224.5"
  }
}
```

To update this version, you can update the dependency and then reload the cache and update the lockfile locally:

```console
deno install --reload --frozen=false
```

You should see changes in the lockfile's contents after running this.
When this is committed and run through the pipeline, you should then see a new cache being created and used in any runs that follow.

By default, the cache is automatically keyed by:

- the GitHub [job_id](https://docs.github.com/en/actions/writing-workflows/workflow-syntax-for-github-actions#jobsjob_id)
- the runner OS and architecture
- a hash of the `deno.lock` files in the project

It is possible to customize the default hash (`${{ hashFiles('**/deno.lock') }}`) used as part of the cache key via the `cache-hash` input:

```yaml
- uses: denoland/setup-deno@v2
  with:
    # setting `cache-hash` implies `cache: true` and will replace
    # the default cache-hash of `${{ hashFiles('**/deno.lock') }}`
    cache-hash: ${{ hashFiles('**/deno.json') }}
```

---

# Deno Namespace APIs

> A guide to Deno's built-in runtime APIs. Learn about file system operations, network functionality, permissions management, and other core capabilities available through the global Deno namespace.

URL: https://docs.deno.com/runtime/reference/deno_namespace_apis

The global `Deno` namespace contains APIs that are not web standard, including APIs for reading from files, opening TCP sockets, serving HTTP, executing subprocesses, and more.

Below we highlight some of the most important Deno APIs to know.

## File System

The Deno runtime comes with [various functions for working with files and directories](/api/deno/file-system). You will need to use the `--allow-read` and `--allow-write` permissions to gain access to the file system.

Refer to the links below for code examples of how to use the file system functions.

- [Reading files in streams](/examples/file_server_tutorial/)
- [Reading a text file (`Deno.readTextFile`)](/examples/reading_files/)
- [Writing a text file (`Deno.writeTextFile`)](/examples/writing_files/)

## Network

The Deno runtime comes with [built-in functions for dealing with connections to network ports](/api/deno/network). Refer to the links below for code examples for common functions.
- [Connect to the hostname and port (`Deno.connect`)](/api/deno/~/Deno.connect)
- [Announcing on the local transport address (`Deno.listen`)](/api/deno/~/Deno.listen)

For practical examples of networking functionality:

- [HTTP Server: Hello world](/examples/http_server/)
- [HTTP Server: Routing](/examples/http_server_routing/)
- [TCP Echo Server](/examples/tcp_echo_server/)
- [WebSockets example](/examples/http_server_websocket/)
- [Build a chat app with WebSockets tutorial](/examples/chat_app_tutorial/)

## Subprocesses

The Deno runtime comes with [built-in functions for spinning up subprocesses](/api/deno/subprocess). Refer to the links below for code samples of how to create a subprocess.

- [Creating a subprocess (`Deno.Command`)](/examples/subprocess_tutorial/)
- [Collecting output from subprocesses](/examples/subprocesses_output/)

## Errors

The Deno runtime comes with [20 error classes](/api/deno/errors) that can be raised in response to a number of conditions. Some examples are:

```ts
Deno.errors.NotFound;
Deno.errors.WriteZero;
```

They can be used as below:

```ts
try {
  const file = await Deno.open("./some/file.txt");
} catch (error) {
  if (error instanceof Deno.errors.NotFound) {
    console.error("the file was not found");
  } else {
    // otherwise re-throw
    throw error;
  }
}
```

## HTTP Server

Deno has two HTTP Server APIs:

- [`Deno.serve`](/api/deno/~/Deno.serve): native, _higher-level_, supports HTTP/1.1 and HTTP2; this is the preferred API to write HTTP servers in Deno.
- [`Deno.serveHttp`](/api/deno/~/Deno.serveHttp): native, _low-level_, supports HTTP/1.1 and HTTP2.

To start an HTTP server on a given port, use the `Deno.serve` function. This function takes a handler function that will be called for each incoming request, and is expected to return a response (or a promise resolving to a response).
For example:

```ts
Deno.serve((_req) => {
  return new Response("Hello, World!");
});
```

By default `Deno.serve` will listen on port `8000`, but this can be changed by passing a port number in the options bag as the first or second argument.

You can [read more about how to use the HTTP server APIs](/runtime/fundamentals/http_server/).

For practical examples of HTTP servers:

- [Simple file server tutorial](/examples/file_server_tutorial/)
- [HTTP server serving files](/examples/http_server_files/)
- [HTTP server with streaming](/examples/http_server_streaming/)
- [HTTP server WebSockets](/examples/http_server_websocket/)

## Permissions

Permissions are granted from the CLI when running the `deno` command. User code will often assume its own set of required permissions, but there is no guarantee during execution that the set of **granted** permissions will align with this. In some cases, ensuring a fault-tolerant program requires a way to interact with the permission system at runtime.

### Permission descriptors

On the CLI, read permission for `/foo/bar` is represented as `--allow-read=/foo/bar`. In runtime JS, it is represented as the following:

```ts
const desc = { name: "read", path: "/foo/bar" } as const;
```

Other examples:

```ts
// Global write permission.
const desc1 = { name: "write" } as const;

// Write permission to `$PWD/foo/bar`.
const desc2 = { name: "write", path: "foo/bar" } as const;

// Global net permission.
const desc3 = { name: "net" } as const;

// Net permission to 127.0.0.1:8000.
const desc4 = { name: "net", host: "127.0.0.1:8000" } as const;

// High-resolution time permission.
const desc5 = { name: "hrtime" } as const;
```

See [`PermissionDescriptor`](/api/deno/~/Deno.PermissionDescriptor) in the API reference for more details. Synchronous API counterparts (e.g. `Deno.permissions.querySync`) exist for all the APIs described below.

### Query permissions

Check, by descriptor, if a permission is granted or not.
```ts // deno run --allow-read=/foo main.ts const desc1 = { name: "read", path: "/foo" } as const; console.log(await Deno.permissions.query(desc1)); // PermissionStatus { state: "granted", partial: false } const desc2 = { name: "read", path: "/foo/bar" } as const; console.log(await Deno.permissions.query(desc2)); // PermissionStatus { state: "granted", partial: false } const desc3 = { name: "read", path: "/bar" } as const; console.log(await Deno.permissions.query(desc3)); // PermissionStatus { state: "prompt", partial: false } ``` If `--deny-read` flag was used to restrict some of the filepaths, the result will contain `partial: true` describing that not all subpaths have permissions granted: ```ts // deno run --allow-read=/foo --deny-read=/foo/bar main.ts const desc1 = { name: "read", path: "/foo" } as const; console.log(await Deno.permissions.query(desc1)); // PermissionStatus { state: "granted", partial: true } const desc2 = { name: "read", path: "/foo/bar" } as const; console.log(await Deno.permissions.query(desc2)); // PermissionStatus { state: "denied", partial: false } const desc3 = { name: "read", path: "/bar" } as const; console.log(await Deno.permissions.query(desc3)); // PermissionStatus { state: "prompt", partial: false } ``` ### Permission states A permission state can be either "granted", "prompt" or "denied". Permissions which have been granted from the CLI will query to `{ state: "granted" }`. Those which have not been granted query to `{ state: "prompt" }` by default, while `{ state: "denied" }` reserved for those which have been explicitly refused. This will come up in [Request permissions](#request-permissions). ### Permission strength The intuitive understanding behind the result of the second query in [Query permissions](#query-permissions) is that read access was granted to `/foo` and `/foo/bar` is within `/foo` so `/foo/bar` is allowed to be read. 
This holds true unless the CLI-granted permission is _partial_ to the queried permission (as an effect of using a `--deny-*` flag). We can also say that `desc1` is _[stronger than](https://www.w3.org/TR/permissions/#ref-for-permissiondescriptor-stronger-than)_ `desc2`. This means that for any set of CLI-granted permissions:

1. If `desc1` queries to `{ state: "granted", partial: false }` then so must `desc2`.
2. If `desc2` queries to `{ state: "denied", partial: false }` then so must `desc1`.

More examples:

```ts
const desc1 = { name: "write" } as const;
// is stronger than
const desc2 = { name: "write", path: "/foo" } as const;

const desc3 = { name: "net", host: "127.0.0.1" } as const;
// is stronger than
const desc4 = { name: "net", host: "127.0.0.1:8000" } as const;
```

### Request permissions

Request an ungranted permission from the user via CLI prompt.

```ts
// deno run main.ts

const desc1 = { name: "read", path: "/foo" } as const;
const status1 = await Deno.permissions.request(desc1);
// ⚠️ Deno requests read access to "/foo". Grant? [y/n (y = yes allow, n = no deny)] y
console.log(status1);
// PermissionStatus { state: "granted", partial: false }

const desc2 = { name: "read", path: "/bar" } as const;
const status2 = await Deno.permissions.request(desc2);
// ⚠️ Deno requests read access to "/bar". Grant? [y/n (y = yes allow, n = no deny)] n
console.log(status2);
// PermissionStatus { state: "denied", partial: false }
```

If the current permission state is "prompt", a prompt will appear on the user's terminal asking them if they would like to grant the request. The request for `desc1` was granted, so its new status is returned and execution will continue as if `--allow-read=/foo` was specified on the CLI. The request for `desc2` was denied, so its permission state is downgraded from "prompt" to "denied". If the current permission state is already either "granted" or "denied", the request will behave like a query and just return the current status.
This prevents prompts both for already granted permissions and previously denied requests. ### Revoke permissions Downgrade a permission from "granted" to "prompt". ```ts // deno run --allow-read=/foo main.ts const desc = { name: "read", path: "/foo" } as const; console.log(await Deno.permissions.revoke(desc)); // PermissionStatus { state: "prompt", partial: false } ``` What happens when you try to revoke a permission which is _partial_ to one granted on the CLI? ```ts // deno run --allow-read=/foo main.ts const desc = { name: "read", path: "/foo/bar" } as const; console.log(await Deno.permissions.revoke(desc)); // PermissionStatus { state: "prompt", partial: false } const cliDesc = { name: "read", path: "/foo" } as const; console.log(await Deno.permissions.revoke(cliDesc)); // PermissionStatus { state: "prompt", partial: false } ``` The CLI-granted permission, which implies the revoked permission, was also revoked. To understand this behavior, imagine that Deno stores an internal set of _explicitly granted permission descriptors_. Specifying `--allow-read=/foo,/bar` on the CLI initializes this set to: ```ts [ { name: "read", path: "/foo" }, { name: "read", path: "/bar" }, ]; ``` Granting a runtime request for `{ name: "write", path: "/foo" }` updates the set to: ```ts [ { name: "read", path: "/foo" }, { name: "read", path: "/bar" }, { name: "write", path: "/foo" }, ]; ``` Deno's permission revocation algorithm works by removing every element from this set which is _stronger than_ the argument permission descriptor. Deno does not allow "fragmented" permission states, where some strong permission is granted with exclusions of weak permissions implied by it. Such a system would prove increasingly complex and unpredictable as you factor in a wider variety of use cases and the `"denied"` state. This is a calculated trade-off of granularity for security. 
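To make the revocation rule above concrete, here is a small self-contained TypeScript sketch of the _stronger than_ relation for path-based descriptors and the set-filtering step it drives. This is an illustrative model only, not Deno's actual implementation — the names `PathDescriptor`, `isStrongerThan` and `revoke`, and the simple prefix-based path containment, are assumptions made for this sketch:

```typescript
// Illustrative model of Deno's permission revocation for path-based
// descriptors. NOT Deno's real implementation.

interface PathDescriptor {
  name: "read" | "write";
  path?: string; // absent means the global permission
}

// `a` is "stronger than" `b` if every access `b` allows, `a` also allows.
function isStrongerThan(a: PathDescriptor, b: PathDescriptor): boolean {
  if (a.name !== b.name) return false;
  if (a.path === undefined) return true; // global grant covers everything
  if (b.path === undefined) return false;
  // Path containment: "/foo" covers "/foo" itself and "/foo/bar".
  return b.path === a.path || b.path.startsWith(a.path + "/");
}

// Revoking `desc` removes every granted descriptor stronger than it.
function revoke(
  granted: PathDescriptor[],
  desc: PathDescriptor,
): PathDescriptor[] {
  return granted.filter((g) => !isStrongerThan(g, desc));
}

// Model of `deno run --allow-read=/foo,/bar`:
const granted: PathDescriptor[] = [
  { name: "read", path: "/foo" },
  { name: "read", path: "/bar" },
];

// Revoking the weaker "/foo/bar" also removes the stronger "/foo" grant,
// matching the behavior shown in the example above.
console.log(revoke(granted, { name: "read", path: "/foo/bar" }));
// only the "/bar" grant remains
```

This mirrors why Deno cannot keep "fragmented" states: once the `/foo` grant is filtered out, there is no remaining descriptor from which a `/foo`-minus-`/foo/bar` permission could be expressed.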
## import.meta

Deno supports a number of properties and methods on the [`import.meta`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/import.meta) API. It can be used to get information about the module, such as the module's URL.

### import.meta.url

Returns the URL of the current module.

```ts title="main.ts"
console.log(import.meta.url);
```

```sh
$ deno run main.ts
file:///dev/main.ts

$ deno run https://example.com/main.ts
https://example.com/main.ts
```

### import.meta.main

Returns whether the current module is the entry point to your program.

```ts title="main.ts"
import "./other.ts";

console.log(`Is ${import.meta.url} the main module?`, import.meta.main);
```

```ts title="other.ts"
console.log(`Is ${import.meta.url} the main module?`, import.meta.main);
```

```sh
$ deno run main.ts
Is file:///dev/other.ts the main module? false
Is file:///dev/main.ts the main module? true
```

### import.meta.filename

_This property is only available for local modules (modules that have a `file:///...` specifier) and returns `undefined` for remote modules._

Returns the fully resolved path to the current module. The value contains OS-specific path separators.

```ts title="main.ts"
console.log(import.meta.filename);
```

On Unix:

```sh
$ deno run main.ts
/dev/main.ts

$ deno run https://example.com/main.ts
undefined
```

On Windows:

```sh
$ deno run main.ts
C:\dev\main.ts

$ deno run https://example.com/main.ts
undefined
```

### import.meta.dirname

_This property is only available for local modules (modules that have a `file:///...` specifier) and returns `undefined` for remote modules._

Returns the fully resolved path to the directory containing the current module. The value contains OS-specific path separators.
```ts title="main.ts" console.log(import.meta.dirname); ``` On Unix: ```sh $ deno run main.ts /dev/ $ deno run https://example.com/main.ts undefined ``` On Windows: ```sh $ deno run main.ts C:\dev\ $ deno run https://example.com/main.ts undefined ``` ### import.meta.resolve Resolve specifiers relative to the current module. ```ts const worker = new Worker(import.meta.resolve("./worker.ts")); ``` The `import.meta.resolve` API takes into account the currently applied import map, which gives you the ability to resolve "bare" specifiers as well. With such import map loaded... ```json { "imports": { "fresh": "https://deno.land/x/fresh@1.0.1/dev.ts" } } ``` ...you can now resolve: ```js title="resolve.js" console.log(import.meta.resolve("fresh")); ``` ```sh $ deno run resolve.js https://deno.land/x/fresh@1.0.1/dev.ts ``` ## FFI The FFI (foreign function interface) API allows users to call libraries written in native languages that support the C ABIs (C/C++, Rust, Zig, V, etc.) using `Deno.dlopen`. Here's an example showing how to call a Rust function from Deno: ```rust // add.rs #[no_mangle] pub extern "C" fn add(a: isize, b: isize) -> isize { a + b } ``` Compile it to a C dynamic library (`libadd.so` on Linux): ```sh rustc --crate-type cdylib add.rs ``` In C you can write it as: ```c // add.c int add(int a, int b) { return a + b; } ``` And compile it: ```sh // unix cc -c -o add.o add.c cc -shared -W -o libadd.so add.o // Windows cl /LD add.c /link /EXPORT:add ``` Calling the library from Deno: ```typescript // ffi.ts // Determine library extension based on // your OS. 
let libSuffix = "";
switch (Deno.build.os) {
  case "windows":
    libSuffix = "dll";
    break;
  case "darwin":
    libSuffix = "dylib";
    break;
  default:
    libSuffix = "so";
    break;
}

const libName = `./libadd.${libSuffix}`;
// Open library and define exported symbols
const dylib = Deno.dlopen(
  libName,
  {
    "add": { parameters: ["isize", "isize"], result: "isize" },
  } as const,
);

// Call the symbol `add`
const result = dylib.symbols.add(35, 34); // 69

console.log(`Result from external addition of 35 and 34: ${result}`);
```

Run with the `--allow-ffi` and `--unstable` flags:

```sh
deno run --allow-ffi --unstable ffi.ts
```

### Non-blocking FFI

There are many use cases where users might want to run CPU-bound FFI functions in the background without blocking other tasks on the main thread. As of Deno 1.15, symbols can be marked `nonblocking` in `Deno.dlopen`. These function calls will run on a dedicated blocking thread and will return a `Promise` resolving to the desired `result`.

Example of executing expensive FFI calls with Deno:

```c
// sleep.c
#ifdef _WIN32
#include <windows.h>
#else
#include <time.h>
#endif

int sleep(unsigned int ms) {
#ifdef _WIN32
  Sleep(ms);
#else
  struct timespec ts;
  ts.tv_sec = ms / 1000;
  ts.tv_nsec = (ms % 1000) * 1000000;
  nanosleep(&ts, NULL);
#endif
  return 0;
}
```

Calling it from Deno:

```typescript
// nonblocking_ffi.ts
const library = Deno.dlopen(
  "./sleep.so",
  {
    sleep: {
      parameters: ["usize"],
      result: "void",
      nonblocking: true,
    },
  } as const,
);

library.symbols.sleep(500).then(() => console.log("After"));
console.log("Before");
```

Result:

```sh
$ deno run --allow-ffi --unstable nonblocking_ffi.ts
Before
After
```

### Callbacks

The Deno FFI API supports creating C callbacks from JavaScript functions for calling back into Deno from dynamic libraries.
An example of how callbacks are created and used is as follows:

```typescript
// callback_ffi.ts
const library = Deno.dlopen(
  "./callback.so",
  {
    set_status_callback: {
      parameters: ["function"],
      result: "void",
    },
    start_long_operation: {
      parameters: [],
      result: "void",
    },
    check_status: {
      parameters: [],
      result: "void",
    },
  } as const,
);

const callback = new Deno.UnsafeCallback(
  {
    parameters: ["u8"],
    result: "void",
  } as const,
  (success: number) => {},
);

// Pass the callback pointer to dynamic library
library.symbols.set_status_callback(callback.pointer);
// Start some long operation that does not block the thread
library.symbols.start_long_operation();

// Later, trigger the library to check if the operation is done.
// If it is, this call will trigger the callback.
library.symbols.check_status();
```

If an `UnsafeCallback`'s callback function throws an error, the error will get propagated up to the function that triggered the callback to be called (above, that would be `check_status()`) and can be caught there. If a callback returning a value throws, then Deno will return 0 (null pointer for pointers) as the result.

`UnsafeCallback` is not deallocated by default as that can cause use-after-free bugs. To properly dispose of an `UnsafeCallback`, its `close()` method must be called.

```typescript
const callback = new Deno.UnsafeCallback(
  { parameters: [], result: "void" } as const,
  () => {},
);

// After callback is no longer needed
callback.close();
// It is no longer safe to pass the callback as a parameter.
```

It is also possible for native libraries to set up interrupt handlers and have those directly trigger the callback. However, this is not recommended and may cause unexpected side effects and undefined behavior. Preferably, any interrupt handlers would only set a flag that can later be polled, similarly to how `check_status()` is used above.

### Supported types

Here's a list of the types currently supported by the Deno FFI API.
| FFI Type               | Deno                 | C                        | Rust                      |
| ---------------------- | -------------------- | ------------------------ | ------------------------- |
| `i8`                   | `number`             | `char` / `signed char`   | `i8`                      |
| `u8`                   | `number`             | `unsigned char`          | `u8`                      |
| `i16`                  | `number`             | `short int`              | `i16`                     |
| `u16`                  | `number`             | `unsigned short int`     | `u16`                     |
| `i32`                  | `number`             | `int` / `signed int`     | `i32`                     |
| `u32`                  | `number`             | `unsigned int`           | `u32`                     |
| `i64`                  | `bigint`             | `long long int`          | `i64`                     |
| `u64`                  | `bigint`             | `unsigned long long int` | `u64`                     |
| `usize`                | `bigint`             | `size_t`                 | `usize`                   |
| `isize`                | `bigint`             | `size_t`                 | `isize`                   |
| `f32`                  | `number`             | `float`                  | `f32`                     |
| `f64`                  | `number`             | `double`                 | `f64`                     |
| `void`[1]              | `undefined`          | `void`                   | `()`                      |
| `pointer`              | `{} \| null`         | `void *`                 | `*mut c_void`             |
| `buffer`[2]            | `TypedArray \| null` | `uint8_t *`              | `*mut u8`                 |
| `function`[3]          | `{} \| null`         | `void (*fun)()`          | `Option<extern "C" fn()>` |
| `{ struct: [...] }`[4] | `TypedArray`         | `struct MyStruct`        | `MyStruct`                |

As of Deno 1.25, the `pointer` type has been split into a `pointer` and a `buffer` type to ensure users take advantage of optimizations for Typed Arrays, and as of Deno 1.31 the JavaScript representation of `pointer` has become an opaque pointer object or `null` for null pointers.

- [1] `void` type can only be used as a result type.
- [2] `buffer` type accepts TypedArrays as parameters, but it always returns a pointer object or `null` when used as a result type, like the `pointer` type.
- [3] `function` type works exactly the same as the `pointer` type as a parameter and result type.
- [4] `struct` type is for passing and returning C structs by value (copy). The `struct` array must enumerate each of the struct's field types in order. The structs are padded automatically; packed structs can be defined by using an appropriate amount of `u8` fields to avoid padding. Only TypedArrays are supported as structs, and structs are always returned as `Uint8Array`s.
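The automatic padding in footnote [4] follows the standard C layout rules: each field is aligned to a multiple of its own size, and the struct's total size is rounded up to the largest field alignment. The sketch below models that rule in plain TypeScript to show why, for example, a struct of `["u8", "u32"]` occupies 8 bytes rather than 5. The `layout` function is a name invented for this illustration — it is not part of the FFI API:

```typescript
// Illustrative model of C struct padding (not part of Deno's FFI API).
// Field sizes in bytes for the scalar FFI types above.
const sizes: Record<string, number> = {
  u8: 1, i8: 1, u16: 2, i16: 2, u32: 4, i32: 4,
  u64: 8, i64: 8, f32: 4, f64: 8,
};

function layout(fields: string[]): { offsets: number[]; size: number } {
  let offset = 0;
  let maxAlign = 1;
  const offsets: number[] = [];
  for (const f of fields) {
    const size = sizes[f];
    // Each field starts at the next multiple of its own alignment.
    offset = Math.ceil(offset / size) * size;
    offsets.push(offset);
    offset += size;
    maxAlign = Math.max(maxAlign, size);
  }
  // Total size is rounded up to the largest field alignment
  // (trailing padding, so the struct tiles correctly in arrays).
  return { offsets, size: Math.ceil(offset / maxAlign) * maxAlign };
}

// struct { u8 a; u32 b; }: "b" is padded out to offset 4, total size 8.
console.log(layout(["u8", "u32"])); // { offsets: [ 0, 4 ], size: 8 }
```

This is also why the footnote suggests spelling out packed structs as runs of `u8` fields: every `u8` has alignment 1, so no padding is ever inserted between them.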
### deno_bindgen

[`deno_bindgen`](https://github.com/denoland/deno_bindgen) is the official tool to simplify glue code generation of Deno FFI libraries written in Rust. It is similar to [`wasm-bindgen`](https://github.com/rustwasm/wasm-bindgen) in the Rust Wasm ecosystem.

Here's an example showing its usage:

```rust
// mul.rs
use deno_bindgen::deno_bindgen;

#[deno_bindgen]
struct Input {
  a: i32,
  b: i32,
}

#[deno_bindgen]
fn mul(input: Input) -> i32 {
  input.a * input.b
}
```

Run `deno_bindgen` to generate bindings. You can now directly import them into Deno:

```ts
// mul.ts
import { mul } from "./bindings/bindings.ts";
mul({ a: 10, b: 2 }); // 20
```

Any issues related to `deno_bindgen` should be reported at https://github.com/denoland/deno_bindgen/issues

## Program Lifecycle

Deno supports browser-compatible lifecycle events:

- [`load`](https://developer.mozilla.org/en-US/docs/Web/API/Window/load_event): fired when the whole page has loaded, including all dependent resources such as stylesheets and images.
- [`beforeunload`](https://developer.mozilla.org/en-US/docs/Web/API/Window/beforeunload_event): fired when the event loop has no more work to do and is about to exit. Scheduling more asynchronous work (like timers or network requests) will cause the program to continue.
- [`unload`](https://developer.mozilla.org/en-US/docs/Web/API/Window/unload_event): fired when the document or a child resource is being unloaded.
- [`unhandledrejection`](https://developer.mozilla.org/en-US/docs/Web/API/Window/unhandledrejection_event): fired when a promise that has no rejection handler is rejected, i.e. a promise that has no `.catch()` handler or a second argument to `.then()`.
- [`rejectionhandled`](https://developer.mozilla.org/en-US/docs/Web/API/Window/rejectionhandled_event): fired when a `.catch()` handler is added to a promise that has already rejected. This event is fired only if there's an `unhandledrejection` listener installed that prevents propagation of the event (which would otherwise result in the program terminating with an error).

You can use these events to provide setup and cleanup code in your program.

Listeners for `load` events can be asynchronous and will be awaited; this event cannot be canceled. Listeners for `beforeunload` need to be synchronous and can be canceled to keep the program running. Listeners for `unload` events need to be synchronous and cannot be canceled.

```ts title="main.ts"
import "./imported.ts";

const handler = (e: Event): void => {
  console.log(`got ${e.type} event in event handler (main)`);
};

globalThis.addEventListener("load", handler);
globalThis.addEventListener("beforeunload", handler);
globalThis.addEventListener("unload", handler);

globalThis.onload = (e: Event): void => {
  console.log(`got ${e.type} event in onload function (main)`);
};

globalThis.onbeforeunload = (e: Event): void => {
  console.log(`got ${e.type} event in onbeforeunload function (main)`);
};

globalThis.onunload = (e: Event): void => {
  console.log(`got ${e.type} event in onunload function (main)`);
};

console.log("log from main script");
```

```ts title="imported.ts"
const handler = (e: Event): void => {
  console.log(`got ${e.type} event in event handler (imported)`);
};

globalThis.addEventListener("load", handler);
globalThis.addEventListener("beforeunload", handler);
globalThis.addEventListener("unload", handler);

globalThis.onload = (e: Event): void => {
  console.log(`got ${e.type} event in onload function (imported)`);
};

globalThis.onbeforeunload = (e: Event): void => {
  console.log(`got ${e.type} event in onbeforeunload function (imported)`);
};

globalThis.onunload = (e: Event): void => {
  console.log(`got ${e.type} event in onunload function (imported)`);
};

console.log("log from imported script");
```

A couple notes on this example:

- `addEventListener` and `onload`/`onunload` are prefixed with `globalThis`, but you could also use `self` or no prefix at all. [It is not recommended to use `window` as a prefix](https://docs.deno.com/lint/rules/no-window-prefix).
- You can use `addEventListener` and/or `onload`/`onunload` to define handlers for events. There is a major difference between them; let's run the example:

```shell
$ deno run main.ts
log from imported script
log from main script
got load event in event handler (imported)
got load event in event handler (main)
got load event in onload function (main)
got beforeunload event in event handler (imported)
got beforeunload event in event handler (main)
got beforeunload event in onbeforeunload function (main)
got unload event in event handler (imported)
got unload event in event handler (main)
got unload event in onunload function (main)
```

All listeners added using `addEventListener` were run, but the `onload`, `onbeforeunload` and `onunload` handlers defined in `main.ts` overrode the handlers defined in `imported.ts`. In other words, you can use `addEventListener` to register multiple `"load"` or `"unload"` event handlers, but only the last defined `onload`, `onbeforeunload` and `onunload` event handlers will be executed. It is preferable to use `addEventListener` when possible for this reason.
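The override behavior described above can be reproduced with the standard `EventTarget` API alone, without a Deno runtime. In this sketch (illustrative only — `loadedWith` and the local `onload` variable are names made up for the example), two `addEventListener` listeners both fire, while reassigning a handler slot, like reassigning `globalThis.onload` in `main.ts`, discards the earlier function:

```typescript
// Why addEventListener allows multiple handlers while an `onload`-style
// property only keeps the last assigned function. Illustrative sketch.
const target = new EventTarget();
const loadedWith: string[] = [];

// Multiple listeners: both are kept and run in registration order.
target.addEventListener("load", () => loadedWith.push("listener A"));
target.addEventListener("load", () => loadedWith.push("listener B"));

// Property-style handler: the second assignment replaces the first,
// just as main.ts's `onload` replaced imported.ts's `onload` above.
let onload = () => loadedWith.push("first onload");
onload = () => loadedWith.push("second onload");

target.dispatchEvent(new Event("load"));
onload();

console.log(loadedWith);
// [ "listener A", "listener B", "second onload" ]
```

`EventTarget` and `Event` are web-standard globals available in both Deno and recent Node.js, so this snippet runs anywhere; only the wiring of the real `load`/`unload` events to program startup and shutdown is Deno-specific.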
### beforeunload

```js
// beforeunload.js
let count = 0;

console.log(count);

globalThis.addEventListener("beforeunload", (e) => {
  console.log("About to exit...");
  if (count < 4) {
    e.preventDefault();
    console.log("Scheduling more work...");
    setTimeout(() => {
      console.log(count);
    }, 100);
  }

  count++;
});

globalThis.addEventListener("unload", (e) => {
  console.log("Exiting");
});

count++;
console.log(count);

setTimeout(() => {
  count++;
  console.log(count);
}, 100);
```

Running this program will print:

```sh
$ deno run beforeunload.js
0
1
2
About to exit...
Scheduling more work...
3
About to exit...
Scheduling more work...
4
About to exit...
Exiting
```

### unhandledrejection event

This event is fired when a promise that has no rejection handler is rejected, i.e. a promise that has no `.catch()` handler or a second argument to `.then()`.

```js
// unhandled_rejection.js
globalThis.addEventListener("unhandledrejection", (e) => {
  console.log("unhandled rejection at:", e.promise, "reason:", e.reason);
  e.preventDefault();
});
function Foo() {
  this.bar = Promise.reject(new Error("bar not available"));
}

new Foo();
Promise.reject();
```

Running this program will print:

```sh
$ deno run unhandled_rejection.js
unhandled rejection at: Promise { Error: bar not available
    at new Foo (file:///dev/unhandled_rejection.js:7:29)
    at file:///dev/unhandled_rejection.js:10:1 } reason: Error: bar not available
    at new Foo (file:///dev/unhandled_rejection.js:7:29)
    at file:///dev/unhandled_rejection.js:10:1
unhandled rejection at: Promise { undefined } reason: undefined
```

---

# Docker

> Complete guide to using Deno with Docker containers. Learn about official Deno images, writing Dockerfiles, multi-stage builds, workspace containerization, and Docker best practices for Deno applications.
URL: https://docs.deno.com/runtime/reference/docker

## Using Deno with Docker

Deno provides [official Docker files](https://github.com/denoland/deno_docker) and [images](https://hub.docker.com/r/denoland/deno). To use the official image, create a `Dockerfile` in your project directory:

```dockerfile
FROM denoland/deno:latest

# Create working directory
WORKDIR /app

# Copy source
COPY . .

# Cache and compile the main app's dependencies ahead of time
RUN deno cache main.ts

# Run the app
CMD ["deno", "run", "--allow-net", "main.ts"]
```

### Best Practices

#### Use Multi-stage Builds

For smaller production images:

```dockerfile
# Build stage
FROM denoland/deno:latest AS builder
WORKDIR /app
COPY . .
RUN deno cache main.ts

# Production stage
FROM denoland/deno:latest
WORKDIR /app
COPY --from=builder /app .
CMD ["deno", "run", "--allow-net", "main.ts"]
```

#### Permission Flags

Specify required permissions explicitly:

```dockerfile
CMD ["deno", "run", "--allow-net=api.example.com", "--allow-read=/data", "main.ts"]
```

#### Development Container

For development with hot-reload:

```dockerfile
FROM denoland/deno:latest
WORKDIR /app
COPY . .
CMD ["deno", "run", "--watch", "--allow-net", "main.ts"]
```

### Common Issues and Solutions

1. **Permission Denied Errors**
   - Use `--allow-*` flags appropriately
   - Consider using `deno.json` permissions
2. **Large Image Sizes**
   - Use multi-stage builds
   - Include only necessary files
   - Add proper `.dockerignore`
3.
**Cache Invalidation**
   - Separate dependency caching
   - Use proper layer ordering

### Example .dockerignore

```text
.git
.gitignore
Dockerfile
README.md
*.log
_build/
node_modules/
```

### Available Docker Tags

Deno provides several official tags:

- `denoland/deno:latest` - Latest stable release
- `denoland/deno:alpine` - Alpine-based smaller image
- `denoland/deno:distroless` - Google's distroless-based image
- `denoland/deno:ubuntu` - Ubuntu-based image
- `denoland/deno:1.x` - Specific version tags

### Environment Variables

Common environment variables for Deno in Docker:

```dockerfile
ENV DENO_DIR=/deno-dir/
ENV DENO_INSTALL_ROOT=/usr/local
ENV PATH=${DENO_INSTALL_ROOT}/bin:${PATH}

# Optional environment variables
ENV DENO_NO_UPDATE_CHECK=1
ENV DENO_NO_PROMPT=1
```

### Running Tests in Docker

```dockerfile
FROM denoland/deno:latest
WORKDIR /app
COPY . .

# Run tests
CMD ["deno", "test", "--allow-none"]
```

### Using Docker Compose

```yaml
# filepath: docker-compose.yml
version: "3.8"
services:
  deno-app:
    build: .
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    environment:
      - DENO_ENV=development
    command: ["deno", "run", "--watch", "--allow-net", "main.ts"]
```

### Health Checks

```dockerfile
HEALTHCHECK --interval=30s --timeout=3s \
  CMD deno eval "try { await fetch('http://localhost:8000/health'); } catch { Deno.exit(1); }"
```

### Common Development Workflow

For local development:

1. Build the image: `docker build -t my-deno-app .`
2.
Run with volume mount:

```bash
docker run -it --rm \
  -v ${PWD}:/app \
  -p 8000:8000 \
  my-deno-app
```

### Security Considerations

- Run as non-root user:

```dockerfile
# Create deno user
RUN addgroup --system deno && \
    adduser --system --ingroup deno deno

# Switch to deno user
USER deno

# Continue with rest of Dockerfile
```

- Use minimal permissions:

```dockerfile
CMD ["deno", "run", "--allow-net=api.example.com", "--allow-read=/app", "main.ts"]
```

- Consider using `--deny-*` flags for additional security

## Working with Workspaces in Docker

When working with Deno workspaces (monorepos) in Docker, there are two main approaches:

### 1. Full Workspace Containerization

Include the entire workspace when you need all dependencies:

```dockerfile
FROM denoland/deno:latest
WORKDIR /app

# Copy entire workspace
COPY deno.json .
COPY project-a/ ./project-a/
COPY project-b/ ./project-b/

WORKDIR /app/project-a
CMD ["deno", "run", "-A", "mod.ts"]
```

### 2. Minimal Workspace Containerization

For smaller images, include only required workspace members:

1. Create a build context structure:

```sh
project-root/
├── docker/
│   └── project-a/
│       ├── Dockerfile
│       ├── .dockerignore
│       └── build-context.sh
├── deno.json
├── project-a/
└── project-b/
```

2. Create a `.dockerignore`:

```text
# filepath: docker/project-a/.dockerignore
*
!deno.json
!project-a/**
# Only if needed:
!project-b/**
```

3. Create a build context script:

```bash
#!/bin/bash
# filepath: docker/project-a/build-context.sh

# Create temporary build context
BUILD_DIR="./tmp-build-context"
mkdir -p $BUILD_DIR

# Copy workspace configuration
cp ../../deno.json $BUILD_DIR/

# Copy main project
cp -r ../../project-a $BUILD_DIR/

# Copy only required dependencies
if grep -q "\"@scope/project-b\"" "../../project-a/mod.ts"; then
  cp -r ../../project-b $BUILD_DIR/
fi
```

4.
Create a minimal Dockerfile:

```dockerfile
# filepath: docker/project-a/Dockerfile
FROM denoland/deno:latest
WORKDIR /app

# Copy only necessary files
COPY deno.json .
COPY project-a/ ./project-a/

# Copy dependencies only if required
COPY project-b/ ./project-b/

WORKDIR /app/project-a
CMD ["deno", "run", "-A", "mod.ts"]
```

5. Build the container:

```bash
cd docker/project-a
./build-context.sh
docker build -t project-a -f Dockerfile tmp-build-context
rm -rf tmp-build-context
```

### Best Practices

- Always include the root `deno.json` file
- Maintain the same directory structure as development
- Document workspace dependencies clearly
- Use build scripts to manage context
- Include only required workspace members
- Update `.dockerignore` when dependencies change

---

# Documentation Tests

> Learn how to write and run documentation tests in Deno. This guide covers how to create testable code examples in documentation comments, type-checking documentation, and running doc tests with the Deno test runner.

URL: https://docs.deno.com/runtime/reference/documentation

Deno supports both type-checking and evaluating your documentation examples. This makes sure that examples within your documentation are up to date and working. The basic idea is this:

````ts
/**
 * # Examples
 *
 * ```ts
 * const x = 42;
 * ```
 */
````

The triple backticks mark the start and end of code blocks, and the language is determined by the language identifier attribute, which may be any of the following:

- `js`
- `javascript`
- `mjs`
- `cjs`
- `jsx`
- `ts`
- `typescript`
- `mts`
- `cts`
- `tsx`

If no language identifier is specified, the language is inferred from the media type of the source document that the code block is extracted from. Another attribute supported is `ignore`, which tells the test runner to skip type-checking the code block.
````ts
/**
 * # Does not pass type check
 *
 * ```typescript ignore
 * const x: string = 42;
 * ```
 */
````

If this example were in a file named `foo.ts`, running `deno test --doc foo.ts` will extract this example, and then both type-check and evaluate it as a standalone module living in the same directory as the module being documented.

To document your exports, import the module using a relative path specifier:

````ts
/**
 * # Examples
 *
 * ```ts
 * import { foo } from "./foo.ts";
 * ```
 */
export function foo(): string {
  return "foo";
}
````

For more guides on testing in Deno, check out:

- [Basic testing tutorial](/examples/testing_tutorial/)
- [Mocking data in tests tutorial](/examples/mocking_tutorial/)
- [Testing web applications tutorial](/examples/web_testing_tutorial/)

---

# Environment variables

> A guide to working with environment variables in Deno. Learn about Deno.env API, .env file support, CLI configuration, and special environment variables that control Deno's behavior.

URL: https://docs.deno.com/runtime/reference/env_variables

There are a few ways to use environment variables in Deno:

## Built-in Deno.env method

The Deno runtime offers built-in support for environment variables with [`Deno.env`](https://docs.deno.com/api/deno/~/Deno.env). `Deno.env` has getter and setter methods. Here is example usage:

```ts
Deno.env.set("FIREBASE_API_KEY", "examplekey123");
Deno.env.set("FIREBASE_AUTH_DOMAIN", "firebasedomain.com");

console.log(Deno.env.get("FIREBASE_API_KEY")); // examplekey123
console.log(Deno.env.get("FIREBASE_AUTH_DOMAIN")); // firebasedomain.com
console.log(Deno.env.has("FIREBASE_AUTH_DOMAIN")); // true
```

## .env file

Deno also supports `.env` files. You can tell Deno to read environment variables from `.env` with the `--env-file` flag, for example:

```sh
deno run --env-file main.ts
```

This will read the `.env` file from the current working directory or the first parent directory that contains one.
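Conceptually, `--env-file` parses simple `KEY=VALUE` lines and injects them into the process environment before your code runs. The sketch below illustrates that idea only; it is not Deno's actual implementation, which also handles quoting and other edge cases, and the `GREETING`/`PORT` variables are made-up examples:

```typescript
// Minimal sketch of .env parsing: one KEY=VALUE per line, `#` starts a comment.
function parseEnvFile(contents: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of contents.split("\n")) {
    const trimmed = line.trim();
    if (trimmed === "" || trimmed.startsWith("#")) continue; // skip blanks/comments
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue; // ignore malformed lines
    vars[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return vars;
}

const env = parseEnvFile("# local settings\nGREETING=hello\nPORT=8000\n");
console.log(env.GREETING, env.PORT); // hello 8000
```

With a real `.env` file containing those two lines, `deno run --env-file --allow-env main.ts` would make the same values available through `Deno.env.get("GREETING")` and `Deno.env.get("PORT")`.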
If you want to load environment variables from a different file, you can specify that file as a parameter to the flag. You can pass multiple `--env-file` flags (e.g., `deno run --env-file=.env.one --env-file=.env.two --allow-env