
docs: remove Hiro docs

friedger-patch-7
Patrick Gray 3 years ago
committed by Patrick Gray
parent
commit
459a39b964
  1. 110
      src/common/navigation.yaml
  2. 20
      src/pages/404.md
  3. 225
      src/pages/build-apps/collections/overview.md
  4. 157
      src/pages/build-apps/collections/types.md
  5. 216
      src/pages/build-apps/examples/angular.md
  6. 637
      src/pages/build-apps/examples/heystack.md
  7. 227
      src/pages/build-apps/examples/indexing.md
  8. 618
      src/pages/build-apps/examples/public-registry.md
  9. 280
      src/pages/build-apps/examples/todos.md
  10. 274
      src/pages/build-apps/guides/authentication.md
  11. 172
      src/pages/build-apps/guides/data-storage.md
  12. 202
      src/pages/build-apps/guides/integrate-stacking-delegation.md
  13. 332
      src/pages/build-apps/guides/integrate-stacking.md
  14. 455
      src/pages/build-apps/guides/transaction-signing.md
  15. 129
      src/pages/build-apps/indexing/collaboration.md
  16. 347
      src/pages/build-apps/indexing/models.md
  17. 93
      src/pages/build-apps/indexing/overview.md
  18. 163
      src/pages/build-apps/indexing/server.md
  19. 19
      src/pages/build-apps/overview.md
  20. 76
      src/pages/build-apps/references/authentication.md
  21. 3
      src/pages/build-apps/references/bns.md
  22. 18
      src/pages/index.md
  23. 2
      src/pages/references/bns-contract.md
  24. 2
      src/pages/references/faqs.md
  25. 46
      src/pages/references/stacks-cli.md
  26. 5
      src/pages/start-mining/mainnet.md
  27. 5
      src/pages/start-mining/testnet.md
  28. 161
      src/pages/storage-hubs/amazon-ec2-deploy.md
  29. 583
      src/pages/storage-hubs/digital-ocean-deploy.md
  30. 497
      src/pages/storage-hubs/gaia-admin.md
  31. 16
      src/pages/storage-hubs/overview.md
  32. 11
      src/pages/understand-stacks/accounts.md
  33. 131
      src/pages/understand-stacks/command-line-interface.md
  34. 231
      src/pages/understand-stacks/local-development.md
  35. 250
      src/pages/understand-stacks/managing-accounts.md
  36. 5
      src/pages/understand-stacks/microblocks.md
  37. 12
      src/pages/understand-stacks/network.md
  38. 24
      src/pages/understand-stacks/overview.md
  39. 36
      src/pages/understand-stacks/regtest.md
  40. 303
      src/pages/understand-stacks/running-api-node.md
  41. 2
      src/pages/understand-stacks/running-mainnet-node.md
  42. 251
      src/pages/understand-stacks/running-regtest-node.md
  43. 2
      src/pages/understand-stacks/running-testnet-node.md
  44. 225
      src/pages/understand-stacks/sending-tokens.md
  45. 151
      src/pages/understand-stacks/stacking-using-CLI.md
  46. 5
      src/pages/understand-stacks/stacking.md
  47. 326
      src/pages/understand-stacks/stacks-blockchain-api.md
  48. 9
      src/pages/understand-stacks/technical-specs.md
  49. 9
      src/pages/understand-stacks/testnet.md
  50. 564
      src/pages/understand-stacks/transactions.md
  51. 300
      src/pages/write-smart-contracts/billboard-tutorial.md
  52. 242
      src/pages/write-smart-contracts/clarinet.md
  53. 265
      src/pages/write-smart-contracts/counter-tutorial.md
  54. 156
      src/pages/write-smart-contracts/devnet.md
  55. 269
      src/pages/write-smart-contracts/hello-world-tutorial.md
  56. 24
      src/pages/write-smart-contracts/install-source.md
  57. 291
      src/pages/write-smart-contracts/nft-tutorial.md
  58. 10
      src/pages/write-smart-contracts/overview.md
  59. 143
      src/pages/write-smart-contracts/testing-contracts.md

110
src/common/navigation.yaml

@@ -13,27 +13,14 @@ sections:
- path: /network
- path: /microblocks
- path: /stacking
- path: /command-line-interface
- path: /local-development
- path: /technical-specs
- path: /stacks-blockchain-api
sections:
- title: Community
usePageTitles: true
pages:
- external:
href: 'https://github.com/friedger/awesome-stacks-chain'
title: 'Awesome Stacks'
- title: Tutorials
usePageTitles: true
pages:
- path: /managing-accounts
- path: /sending-tokens
- path: /running-mainnet-node
- path: /running-testnet-node
- path: /running-regtest-node
- path: /running-api-node
- path: /stacking-using-CLI
- path: /write-smart-contracts
usePageTitles: true
@@ -41,101 +28,47 @@ sections:
- path: /overview
- path: /principals
- path: /values
- path: /clarinet
- path: /devnet
- path: /tokens
sections:
- title: Tutorials
- title: Resources
pages:
- path: /hello-world-tutorial
- path: /counter-tutorial
- path: /nft-tutorial
- path: /billboard-tutorial
- path: /testing-contracts
- external:
href: 'https://stacks.org/clarity-universe'
title: Clarity Universe
- external:
href: 'https://docs.hiro.so/tutorials'
title: Hiro Clarity tutorials
- path: /build-apps
pages:
- path: /overview
- external:
href: https://github.com/blockstack/stacks.js/tree/master/migration-guide.md
title: Migrating from Blockstack.js
sections:
- title: Guides
usePageTitles: true
pages:
- path: /guides/authentication
- path: /guides/transaction-signing
- path: /guides/data-storage
- path: /guides/integrate-stacking
- path: /guides/integrate-stacking-delegation
- title: Example Apps
- title: References
usePageTitles: true
pages:
- path: /examples/todos
- path: /examples/heystack
- path: /examples/public-registry
- path: /examples/angular
- path: /references/authentication
- path: /references/bns
- path: /references/gaia
- title: Stacks.js References
- title: Community
usePageTitles: true
pages:
- external:
href: 'https://github.com/blockstack/connect#readme'
title: connect
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/auth.html'
title: auth
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/storage.html'
title: storage
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/transactions.html'
title: transactions
- external:
href: 'https://blockstack.github.io/stacks-blockchain-api/client/index.html'
title: blockchain-api-client
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/stacking.html'
title: 'stacking'
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/keychain.html'
title: 'keychain'
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/network.html'
title: 'network'
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/encryption.html'
title: 'encryption'
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/profile.html'
title: 'profile'
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/common.html'
title: common
href: 'https://github.com/friedger/awesome-stacks-chain'
title: Awesome Stacks
- external:
href: 'https://stacks-js-git-master-blockstack.vercel.app/modules/bns.html'
title: bns
href: 'https://docs.hiro.so/example-apps'
title: Hiro example apps
- title: Protocols
usePageTitles: true
pages:
- path: /references/bns
- path: /references/gaia
- path: /start-mining
pages:
- path: /mainnet
- path: /testnet
- path: /storage-hubs
pages:
- path: /overview
sections:
- title: Tutorials
usePageTitles: true
pages:
- path: /amazon-ec2-deploy
- path: /references
pages:
- path: /stacks-cli
- path: /faqs
- path: /glossary
- path: /deploy-tips
- external:
href: 'https://blockstack.github.io/stacks-blockchain-api/'
title: Stacks Blockchain API
@@ -143,9 +76,6 @@ sections:
- external:
href: 'https://blockstack.github.io/stacks.js/'
title: Stacks.js
- path: /faqs
- path: /glossary
- path: /deploy-tips
sections:
- title: Contracts
usePageTitles: true

20
src/pages/404.md

@@ -5,7 +5,11 @@ description: The page you're looking for isn't here.
## Whoops
Looks like the page you are looking for isn't here. Try out some of these popular pages:
Looks like the page you are looking for isn't here.
-> Developer content has recently moved to [docs.hiro.so](https://docs.hiro.so/). For more information on the content move, see [this post](https://forum.stacks.org/t/the-evolution-of-the-stacks-documentation-and-a-new-hiro-docs-site/12343) on the Stacks forum. Check for your content at [docs.hiro.so](https://docs.hiro.so/) or ask in Discord if you believe you have reached this page in error.
Try out some of these popular pages:
## Understand Stacks
@@ -15,19 +19,9 @@ Looks like the page you are looking for isn't here. Try out some of these popula
## Write smart contracts
[@page-reference | grid]
| /write-smart-contracts/overview, /write-smart-contracts/hello-world-tutorial
## Build apps
[@page-reference | grid]
| /build-apps/guides/authentication, /build-apps/guides/transaction-signing, /build-apps/guides/data-storage
| /write-smart-contracts/overview
## Start mining
[@page-reference | grid]
| /start-mining/mainnet, /start-mining/testnet, /understand-stacks/running-mainnet-node, /understand-stacks/running-testnet-node, /understand-stacks/running-regtest-node
## Ecosystem
[@page-reference | grid-small]
| /ecosystem/overview, /ecosystem/stacks-token, /ecosystem/contributing
| /start-mining/mainnet, /start-mining/testnet

225
src/pages/build-apps/collections/overview.md

@@ -1,225 +0,0 @@
---
title: Overview
description: Store data in standardized formats with Collections
---
~> Collections are an experimental feature not yet recommended for production. Please report issues and contribute through [the blockstack-collections repository](https://github.com/blockstack/blockstack-collections/).
## Introduction
Collections is the feature designed to make data portable among Stacks applications. Sharing is accomplished by
storing a user's data in a standardized format at a known, Gaia storage location. Collections associate user data with
a user's decentralized ID. When users move among apps, the same data is available to each application the user authorizes.
On this page, you learn what collections are and how to use them. You'll learn about the `Contact` collection in particular.
~> If you encounter problems with `blockstack.js` you can [file issues or request enhancements on its repo](https://github.com/blockstack/blockstack.js/issues/new).
## Understand how collections work
One of the goals of the Stacks ecosystem is to give users true data ownership by enabling _data portability_. Data portability allows
users to login with their digital ID on any app and have access to the same data. For example, if a user adds a photo of
a Hawaiian vacation in one app, that photo enters the user's data pool. Then, when the user opens a second app, that
same photo is available to the second app because the user data, including the photo, is shared via the user's
decentralized ID.
How do collections work? Stacks provides a library containing commonly used data schemes. Developers use these classes
and objects instead of creating their own, unique data schemes. Using a class from the collections library guarantees
class data is stored in Gaia in that format; And, when retrieved, guarantees the same format is returned. This
pre-release provides the `Contact` collection. A contact schema produces this structure:
```json
{
"lastName": "jeffries",
"firstName": "sally",
"blockstackID": "",
"email": "",
"website": "",
"telephone": "",
"identifier": "sally jeffries"
}
```
A collection schema is neither validated nor enforced. The goal is to incentivize collection use rather than enforce it.
Because malicious apps or apps with poor security controls may damage user data, Stacks collections should
include the ability for users to roll back changes. For this reason, Stacks supports an event log and rollback
mechanisms in collections. To support rollback in the pre-release, the collections data store is conceptually an event
log: every data write an app makes is stored as a separate file. Storing data in separate files ensures that data is never
lost and records can be returned to any previous state.
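The append-only design described above can be sketched in a few lines. This is an illustrative, in-memory model of the idea (not the actual `blockstack-collections` internals): every write becomes a new entry rather than overwriting the old one, so earlier versions of a record stay recoverable.

```typescript
// Illustrative sketch of an event-log data store: each write is kept as a
// separate entry, so any earlier state of a record can be recovered.
interface WriteRecord {
  path: string; // unique "file name" for this write
  payload: string; // serialized record data
}

class EventLogStore {
  private log: WriteRecord[] = [];

  // Every write appends a new entry instead of replacing the previous one.
  write(id: string, payload: string): void {
    this.log.push({ path: `${id}/${this.log.length}`, payload });
  }

  // Latest state of a record, or undefined if it was never written.
  latest(id: string): string | undefined {
    const entries = this.log.filter(r => r.path.startsWith(`${id}/`));
    return entries.length ? entries[entries.length - 1].payload : undefined;
  }

  // Roll back: read any earlier version of a record by index.
  version(id: string, n: number): string | undefined {
    return this.log.filter(r => r.path.startsWith(`${id}/`))[n]?.payload;
  }
}
```

Because nothing is overwritten, "rollback" is simply reading an older entry, which is why the pre-release stores every write as its own file.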
##### The Future of Collections Envisioned
Stacks collections should enable true data portability across applications for each decentralized ID.
The goal is to develop simple user interfaces that allow users to manage application access and permissions for collection
data. For example, in the future, users can roll back data to previous versions using management interfaces.
For developers, collections can incentivize user adoption by reducing user friction. Users can easily try new apps and
move to them without the overhead or barrier of re-entering data. You are [welcome to review and comment](https://forum.blockstack.org/t/feedback-wanted-collections-design/7752)
on the current design document.
## Build a Contact Manager demo app
Before adding collections to your DApp, you can try them out using the Contact Manager demo application. Blockstack Contacts is a simple contacts manager that allows users to add and manage their contacts. The data stored by this app can be used in another app that receives the contacts collection permissions.
The tutorial relies on the `npm` dependency manager. Before you begin, use the `which` command to verify that you have `npm` installed.
```bash
which npm
```
```bash
/usr/local/bin/npm
```
If you have `npm` installed, do the following to run the Contact Manager demo app:
1. If you have a local Blockstack installed, uninstall it.
2. Download and install the [Collections Alpha Build](https://github.com/blockstack/blockstack-browser/releases/tag/collections-alpha.1) of the Blockstack Browser client for your OS.
3. Launch the alpha build of the local Blockstack Browser client.
4. In your Internet browser, visit the [https://github.com/yknl/blockstack-contacts](https://github.com/yknl/blockstack-contacts) repository.
5. Download or clone the repository code to your local workstation.
6. In your workstation terminal, change to the directory where you downloaded the demo code.
7. Install the dependencies using `npm`.
```bash
npm install
```
8. Start the application running.
```bash
npm run start
```
The system starts the application and launches it in your browser at 127.0.0.1:3000
9. Choose **Sign In with Stacks Auth**.
The internet browser will display this pop-up
![](/images/contacts-manager.png)
10. Use the local browser by choosing **Open Blockstack.app**.
11. If you are not signed into an ID in the Blockstack Browser, choose **Create new ID** from the pop up.
If you are already signed in, choose an ID to sign in to the Contacts Manager app with.
The system should return you to the Contact Manager demo application.
### Test Contact data portability
1. Add a contact using your new Contact Manager application. In this example, the contact added is `Josephine Baker`.
When you have successfully created a contact, the Contact Manager displays that contact on the list. Here you can see that Josephine Baker was entered as a contact.
![](/images/added-contact.png)
2. Open the [collections page test](https://blockstack.github.io/blockstack-collections/page_test/) in your browser.
The page test is an entirely different application that also makes use of the Contacts collection.
3. Sign in using the same Blockstack ID you used to sign into the Contacts Manager.
4. Choose **List contacts**.
![](/images/test-contact.png)
## How to add the Contact collections to your DApp
In this section, you learn how to add `Contact` collection functionality to an existing application. Before beginning, make sure your application is using Stacks auth and is storing data with Gaia. To start using the `Contact` collection in your Stacks app, do the following:
1. Change to the root directory of your app project.
2. Install the preview branch of `blockstack.js`.
```
npm install blockstack@20.0.0-alpha.5
```
3. Add the `blockstack-collections` package to your app.
```
npm install blockstack-collections@0.1.8
```
4. Edit your code to import the `Contact` collection type.
```
import { Contact } from 'blockstack-collections';
```
5. Customize your sign in request to include the contacts collection scope `Contact.scope`.
This scope grants your app permission to read and write to the user’s `Contact` collection.
```jsx
import { UserSession, AppConfig, makeAuthRequest } from 'blockstack';
import { Contact } from 'blockstack-collections';
const scopes = ['store_write', 'publish_data', Contact.scope];
const appConfig = new AppConfig(scopes);
const userSession = new UserSession({ appConfig: appConfig });
userSession.redirectToSignIn();
```
## Collection storage operations
Collection storage was designed around an ORM-like interface. This approach ensures that you'll be working with typed objects instead of the lower-level `getFile` and `putFile` functions provided by blockstack.js.
### Example: Create and save a Contact object
```jsx
const newContact = {
lastName: 'Stackerson',
firstName: 'Blocky',
blockstackID: 'Blockstacker.id',
email: 'blockstacker@blockstack.org',
website: 'blockstack.org',
telephone: '123123123',
};
var contact = new Contact(newContact);
contact.save().then(contactID => {
// contact saved successfully
});
```
### Example: Read a Contact object
```jsx
let contactID = 'Blocky Stackerson';
Contact.get(contactID).then(contact => {
// Do something with the contact object
  console.log(`Hello ${contact.firstName}`);
});
```
### Example: List Contact objects
```jsx
let contacts = [];
Contact.list(contactID => {
// This callback is invoked for each contact identifier
// To get the actual object you'll need to use Contact.get
// Or you can add the IDs to an array for display
contacts.push(contactID);
// Return true to continue iterating, return false to stop
return true;
});
```
### Example: Delete a Contact
```jsx
var contact = new Contact(newContact);
contact.delete().then(() => {
// contact deleted successfully
});
```

157
src/pages/build-apps/collections/types.md

@@ -1,157 +0,0 @@
---
title: Types
description: Create new collection types
---
~> Collections are an experimental feature not yet recommended for production. Please report issues and contribute through [the blockstack-collections repository](https://github.com/blockstack/blockstack-collections/).
## Introduction
Collections support data portability between applications. Stacks supplies a `Contact` collection for use by Stacks applications. Developers can create additional collection types, use them in their own applications, and publish them so other developers can make use of them too.
In this section, you learn the coding guidelines for creating and publishing a new `Collection` type. The following topics are included:
## Before you begin
New collections rely on the `blockstack-collections` package. Before you code, make sure you have installed this package and it is available to your project.
```bash
npm install blockstack-collections
```
You should also familiarize yourself with the [Collection](https://github.com/blockstack/blockstack-collections/blob/master/src/types/collection.ts) class and review [the existing Collection types](https://github.com/blockstack/blockstack-collections/tree/master/src/types). Keep in mind, someone else may have already added a custom type similar to what you want to add.
Collection types can be written in JavaScript (`.js`) or TypeScript (`.ts`) files. TypeScript is a typed superset of JavaScript; you can [read the language documentation](https://www.typescriptlang.org/) to learn more.
## Essential steps for creating a Collection type
This section demonstrates how to create a new collection type using TypeScript. While this is written in TypeScript, the steps in JavaScript are the same. Follow these steps to create a new collection type:
1. Create a new `.ts` file and open it for editing.
2. Import the `Collection` class.
```js
import { Collection, Attrs, Serializable } from 'blockstack-collections';
```
3. Extend the abstract `Collection` class from the `blockstack-collections` package.
```js
export class Contact extends Collection implements Serializable {
...
}
```
4. Give your `Collection` a unique identifier.
The Stacks Collection framework uses this identifier to place Collection data into a corresponding Gaia storage bucket.
```js
static get collectionName(): string {
return 'contact'
}
```
!> While you must specify a unique identifier, the Stacks platform does not currently enforce uniqueness. If your `Collection` type shares the same identifier as another type, it will lead to data corruption for the user. In the future, the Stacks platform will enforce unique collection names.
5. Define a static `schema` constant.
This is your type's schema.
```jsx
static schema = {
identifier: String,
firstName: String,
lastName: String,
blockstackID: String,
email: String,
website: String,
address: String,
telephone: String,
organization: String
}
```
6. Determine if you need to set the `singleFile` storage flag.
By default, the `singleFile` flag is `false`. This setting causes every record in a collection to be stored in Gaia as a separate file. The default works well for larger types that describe data such as documents or photos. If your `Collection` type only has a few fields and is not expected to have a large number of records, set the `singleFile` data format flag to `true`.
```jsx
static singleFile = true
```
7. Define the `fromObject` and `fromData` serialization methods.
These methods serialize and deserialize your `Collection` type. You can use any serialization method you want. Data encryption is handled automatically by the parent `Collection` class, so you _should not_ perform any additional encryption.
In the following example code, data is converted to JSON string for storage.
```jsx
static fromObject(object: object) {
// Create from plain JavaScript object
return new Contact(object)
}
static fromData(data: string) {
// Deserialize JSON data
return new Contact(JSON.parse(data))
}
serialize() {
// Serialize to JSON string
return JSON.stringify(this.attrs)
}
```
8. Test and iterate development of your type in your application.
9. Publish your type for others to use.
## Add a listener (optional)
If you need to listen for changes to any of the object’s attributes, you can implement the `onValueChange` method. For example, in the `Contacts` Collection type, when the contact is renamed, the unique identifier for the object needs to be updated.
```jsx
onValueChange(key: string, value: any) {
if (key === 'firstName') {
this.previousIdentifier = this.attrs.identifier
this.attrs.identifier = this.constructIdentifier()
this.identifierChanged = true
}
else if (key === 'lastName') {
this.previousIdentifier = this.attrs.identifier
this.attrs.identifier = this.constructIdentifier()
this.identifierChanged = true
}
}
```
## Override processing methods (optional)
To perform additional processing of a collection, you can override the `get`, `save`, `list` and `delete` methods. For example, in the `Contact` type, the `save` method is overridden to also perform a `delete` if a contact is renamed. Deletion is necessary because identifiers for a `Contact` are generated from the contact name, so data stored under the previous identifier must be deleted after the record is written under the new identifier.
```jsx
async save(userSession?: UserSession) {
// Delete old file on save if object identifier changes
return super.save(userSession)
.then((result) => {
if (this.identifierChanged) {
return Contact.delete(this.previousIdentifier, userSession)
.then(() => {
this.identifierChanged = false
return result
})
} else {
return result
}
})
}
```
## Publish your new type for others to use
While you _can_ use your collection exclusively in your application, the Collections feature is intended to enable data portability between DApps. So, you should publish your new type so other developers can make use of it.
To publish your Collection type, do the following:
1. Clone or fork the [blockstack-collections](https://github.com/blockstack/blockstack-collections) repo.
2. Add your new type file to the `src/types` subdirectory.
3. Create a pull request back to the `blockstack-collections` repository.

216
src/pages/build-apps/examples/angular.md

@@ -1,216 +0,0 @@
---
title: Angular app
description: How to integrate authentication into an Angular app
experience: beginners
duration: 30 minutes
tags:
- tutorial
images:
large: /images/pages/hello-world.svg
sm: /images/pages/hello-world-sm.svg
---
# Building an app with Angular
## Getting started with Angular
In this tutorial, you'll learn how to work with Stacks Connect when using [Angular](https://angular.io/) as your framework of choice. It builds on what you've learnt in the [Authentication Overview](/build-apps/guides/authentication).
-> This article presumes some familiarity with [Angular](https://angular.io/), as well as [Reactive Extensions (RxJS)](https://rxjs.dev/).
### Prerequisites
We'll be using the [Angular CLI](https://cli.angular.io/) to scaffold the project, so make sure you've got the latest version installed. We're using version `10.2.0`.
```sh
npm install --global @angular/cli
```
## 1. Scaffold & Run
Use the `ng new` command to scaffold a new project. We've named ours `ng-stacks-connect`.
```sh
ng new ng-stacks-connect --minimal --inline-style --inline-template
```
You'll be asked to enter some preferences: whether your app will use routing, and whether you want to use a CSS preprocessor like Sass. For the sake of this tutorial, we're keeping things simple: no routing, no preprocessing.
Inside the newly created `ng-stacks-connect` directory, let's boot up the development server which defaults to [localhost:4200](http://localhost:4200).
```sh
cd ng-stacks-connect
ng serve
```
## 2. Add Stacks Connect
```sh
npm install --save @stacks/connect blockstack
```
-> We're also installing the `blockstack` package, as it's a [peer dependency](https://docs.npmjs.com/cli/v7/configuring-npm/package-json#peerdependencies) of Stacks Connect
## 3. Declare missing globals
Some dependencies of these packages were written for a Node.js environment. In a browser environment, tools such as Webpack (v4) often handle polyfilling Node.js-specific APIs automatically. With the Angular CLI, this must be done manually.
-> `Buffer`, for example, is a global class in a Node.js environment. In the browser it is `undefined`, so we must declare it to avoid runtime exceptions
Add the following snippet to your `src/polyfills.ts`
```typescript
(window as any).global = window;
(window as any).process = {
version: '',
env: {},
};
global.Buffer = require('buffer').Buffer;
```
This does 3 things:
1. Aliases `global` to `window`
2. Declares a global `process` object
3. Declares a global `Buffer` class
## 4. Authentication flow
Now everything's set up, we're ready to create our auth components
We can use the CLI's generator to scaffold components.
### 4.1 Sign In button
```sh
ng generate component
```
Enter the name: `stacks-sign-in-button`. You'll find the newly generated component in `src/app/stacks-sign-in-button/stacks-sign-in-button.component.ts`.
Replace the generated example with the following Sign In button component code.
```typescript
import { Component, OnInit, Output, EventEmitter } from '@angular/core';
@Component({
selector: 'app-stacks-sign-in-button',
template: ` <button (click)="onSignIn.emit()">Sign In</button> `,
})
export class StacksSignInButtonComponent {
@Output() onSignIn = new EventEmitter();
}
```
### 4.2 Connecting Stacks Connect
Let's add this button to our `app-root` component (`app.component.ts`) and wire up the `(onSignIn)` event. Make sure to import `Subject` from `rxjs`.
```typescript
@Component({
selector: 'app-root',
template: `<app-stacks-sign-in-button
(onSignIn)="stacksAuth$.next()"
></app-stacks-sign-in-button>`,
})
export class AppComponent {
stacksAuth$ = new Subject<void>();
}
```
Here we're using an RxJS `Subject` to represent a stream of sign-in events. `stacksAuth$` will emit when we should trigger the sign-in action.
### 4.3 Authentication
First, describe the auth options we need to pass to Connect. [Learn more about `AuthOptions` here](/build-apps/guides/authentication). Let's modify the default component to look like this:
```typescript
import { Component } from '@angular/core';
import { AuthOptions, FinishedData } from '@stacks/connect';
import { ReplaySubject, Subject } from 'rxjs';
import { switchMap } from 'rxjs/operators';
@Component({
selector: 'app-root',
template: `
<app-stacks-sign-in-button (onSignIn)="stacksAuth$.next()"></app-stacks-sign-in-button>
<code>
<pre>{{ authResponse$ | async | json }}</pre>
</code>
`,
})
export class AppComponent {
stacksAuth$ = new Subject<void>();
authResponse$ = new ReplaySubject<FinishedData>(1);
authOptions: AuthOptions = {
finished: response => this.authResponse$.next(response),
appDetails: { name: 'Angular Stacks Connect Demo', icon: 'http://placekitten.com/g/100/100' },
};
ngOnInit() {
this.stacksAuth$
.pipe(switchMap(() => import('@stacks/connect')))
.subscribe(connectLibrary => connectLibrary.showBlockstackConnect(this.authOptions));
}
}
```
Let's run through what's going on. In the `authOptions` field, we're using the `finished` handler to emit a value to the `authResponse$` which uses a `ReplaySubject` to persist the latest response.
-> A [`ReplaySubject`](https://rxjs.dev/api/index/class/ReplaySubject) is an Observable that starts without an initial value, but replays its latest *n* emissions when subscribed to
For initial load performance, we're using `import("@stacks/connect")` to only load the Stacks Connect library when it's needed. The `switchMap` operator "switches" out the `stacksAuth$` event for the library.
The output of `authResponse$` can be added to the template for debugging purposes. This uses Angular's `async` and `json` pipes.
### 4.4 Loading text
One problem with the current implementation is that there's a network delay while waiting to load the Connect library. Let's keep track of the loading state and display some text in the sign in button component. You'll need to `import { tap, switchMap } from 'rxjs/operators';` and `import { BehaviorSubject } from 'rxjs';`.
```typescript
// src/app/app.component.ts
isLoadingConnect$ = new BehaviorSubject(false);
ngOnInit() {
this.stacksAuth$
.pipe(
tap(() => this.isLoadingConnect$.next(true)),
switchMap(() => import("@stacks/connect")),
tap(() => this.isLoadingConnect$.next(false))
)
.subscribe(connectLibrary =>
connectLibrary.showBlockstackConnect(this.authOptions)
);
}
```
We can keep track of it with a [BehaviorSubject](https://rxjs.dev/api/index/class/BehaviorSubject), which always emits its initial value when subscribed to.
Let's add a `loading` input to the `StacksSignInButtonComponent` component.
```typescript highlight=3,6
@Component({
selector: 'app-stacks-sign-in-button',
template: ` <button (click)="onSignIn.emit()">{{ loading ? 'Loading' : 'Sign in' }}</button> `,
})
export class StacksSignInButtonComponent {
@Input() loading: boolean;
@Output() onSignIn = new EventEmitter();
}
```
Then, pass the `isLoadingConnect$` Observable into the component, and hide it when the user has already authenticated.
```html
<!-- Edit the template in src/app/app.component.ts -->
<app-stacks-sign-in-button
*ngIf="!(authResponse$ | async)"
(onSignIn)="stacksAuth$.next()"
[loading]="isLoadingConnect$ | async"
></app-stacks-sign-in-button>
```
### Next steps
This tutorial has shown you how to integrate Stacks Connect with an Angular application. You may want to consider abstracting the Stacks Connect logic behind an [Angular service](https://angular.io/guide/architecture-services), or using [Material Design](https://material.angular.io/) to theme your application.
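As a sketch of that service abstraction, the lazy-loading logic from `ngOnInit` could live behind a small, framework-agnostic class. The names below are illustrative (not part of Stacks Connect or Angular); the loader is injected so the heavy `import("@stacks/connect")` happens once and is reused.

```typescript
// Sketch of a reusable lazy-loader, assuming a service-style abstraction.
// The loader function is injected, so the real import (or a test fake)
// runs only on first use and its result is cached afterwards.
type Loader<T> = () => Promise<T>;

class LazyLibrary<T> {
  private cached: Promise<T> | null = null;

  constructor(private loader: Loader<T>) {}

  // Returns the library, loading it on first call and reusing it afterwards.
  load(): Promise<T> {
    if (this.cached === null) {
      this.cached = this.loader();
    }
    return this.cached;
  }
}

// Hypothetical usage in a service wrapping Stacks Connect:
//   const connect = new LazyLibrary(() => import('@stacks/connect'));
//   connect.load().then(lib => lib.showBlockstackConnect(authOptions));
```

Injecting the loader also makes the service easy to unit test: a fake loader can verify the library is fetched exactly once regardless of how many components request it.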

637
src/pages/build-apps/examples/heystack.md

@@ -1,637 +0,0 @@
---
title: Heystack app
description: Interacting with the wallet and smart contracts from a React application
tags:
- example-app
images:
large: /images/pages/heystack-app.svg
---
## Introduction
This example application demonstrates important features of the Stacks blockchain, and is a case study for how a frontend
web application can interact with a Clarity smart contract. The full source of the application is provided and
completely open source for you to use or modify. This page highlights important code snippets and design patterns to
help you learn how to develop your own Stacks application.
This app showcases the following platform features:
- Authenticating users with the web wallet
- Using a smart contract to store data on the blockchain
- Minting new fungible tokens with a [SIP-010][] compliant smart contract
- Creating and monitoring transactions on the Stacks blockchain using [Stacks.js][]
You can access the [online version of the Heystack app][heystack] to interact with it. The source for Heystack is also
available on [Github][heystack_gh]. This page assumes some familiarity with [React][].
## Heystack overview
Heystack is a web application for chatting with other Stacks users. The application uses the [Stacks web wallet][] to
authenticate users in the frontend. When a user logs in to Heystack, they're given a genesis amount of $HEY fungible
tokens, which allows them to send and like messages on the platform.
Heystack is powered by Clarity smart contracts so each message is a transaction on the Stacks blockchain. Each time a
user sends a message on the platform, they must sign the message with the [Stacks web wallet][] (or another compatible
wallet) and pay a small gas fee in STX. A user spends a $HEY token to send every message, and receives a $HEY token for
every like that their messages receive.
The following video provides a brief overview of the Heystack application:
<br /><iframe width="560" height="315" src="https://www.youtube.com/embed/2_xAIctJqGw" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
## Review smart contracts
Heystack depends on two smart contracts to execute the backend functions of the app on the Stacks blockchain:
- a contract for handling the messaging content
- a contract for minting and distributing the $HEY token
As a best practice, these two contracts are separate because of the different functionality they handle. This is an
exercise in [separation of concerns][].
### Content contract
The `hey.clar` contract provides two primary functions for the application, one to publish content to
the blockchain and another to like a piece of content based on its ID. This section reviews the implementation of
these primary functions, but is not a comprehensive discussion of the contract.
In order to accomplish these two primary functions, the contract relies on a data variable, `content-index`, and two [data maps][], `like-state` and `publisher-state`, which track the number of likes a piece of content has received and the principal address of the account that published the content.
Note that all variables are defined at the top of the contract, which is a requirement of the Clarity language. These
include constants such as the `contract-creator`, error codes, and a treasury address.
```clarity
;;
;; Data maps and vars
(define-data-var content-index uint u0)
(define-read-only (get-content-index)
(ok (var-get content-index))
)
(define-map like-state
{ content-index: uint }
{ likes: uint }
)
(define-map publisher-state
{ content-index: uint }
{ publisher: principal }
)
```
Read-only functions provide a method for getting the like count of a piece of content, and getting the principal address
of the message publisher.
```clarity
(define-read-only (get-like-count (id uint))
;; Checks map for like count of given id
;; defaults to 0 likes if no entry found
(ok (default-to { likes: u0 } (map-get? like-state { content-index: id })))
)
(define-read-only (get-message-publisher (id uint))
;; Checks map for the publisher of the given id
;; halts execution if no entry is found
(ok (unwrap-panic (get publisher (map-get? publisher-state { content-index: id }))))
)
```
The `get-like-count` method accepts a content ID and returns the number of likes associated with that content. The
method uses the [`default-to`][] function to return `0` if the content ID isn't found in the map of likes.
The `get-message-publisher` method accepts a content ID and returns the principal address of the content publisher. The
method uses the [`unwrap-panic`][] function to halt execution of the method if the principal address isn't found in
the map of publishers.
The two primary public methods are the `send-message` and `like-message` functions. These methods allow the contract
caller to store a message on the blockchain (creating entries in the data maps for the message sender and the number
of likes). Note that the message itself isn't stored in a contract variable; the frontend application reads the content
of the message directly from the transaction on the blockchain.
```clarity
;;
;; Public functions
(define-public (send-message (content (string-utf8 140)))
(let ((id (unwrap! (increment-content-index) (err u0))))
(print { content: content, publisher: tx-sender, index: id })
(map-set like-state
{ content-index: id }
{ likes: u0 }
)
(map-set publisher-state
{ content-index: id }
{ publisher: tx-sender }
)
(transfer-hey u1 HEY_TREASURY)
)
)
```
The `send-message` method accepts a utf-8 string with a maximum length of 140 characters. The method defines an internal variable `id` using the `let` function and assigns the next content ID to it by calling the contract's `increment-content-index` method. The value assignment is wrapped in the [`unwrap!`][] function, which returns an error and exits the control flow if the call to `increment-content-index` doesn't succeed.
The method then assigns `u0` likes to the content in the `like-state` data map, and adds the principal address to the
`publisher-state` data map using the [`map-set`][] function. Finally, the private method `transfer-hey` is called to
transfer 1 $HEY token from the message sender to the $HEY treasury address stored in the `HEY_TREASURY` constant.
```clarity
(define-public (like-message (id uint))
(begin
;; cannot like content that doesn't exist
(asserts! (>= (var-get content-index) id) (err ERR_CANNOT_LIKE_NON_EXISTENT_CONTENT))
;; transfer 1 HEY to the principal that created the content
(map-set like-state
{ content-index: id }
{ likes: (+ u1 (get likes (unwrap! (get-like-count id) (err u0)))) }
)
(transfer-hey u1 (unwrap-panic (get-message-publisher id)))
)
)
```
The `like-message` method accepts a content ID. The method checks that the ID doesn't exceed the current content index using the [`asserts!`][] function, verifying that the ID refers to existing content. If the [`asserts!`][] check is `false`,
the method returns an error code. If the ID is valid, the method performs a [`map-set`][] to look up the content in the
`like-state` data map and add a like to the value stored in the map. Once again, the [`unwrap!`][] function is used to
ensure that an invalid value isn't stored in the map.
The `hey.clar` contract provides some additional functions for working with the $HEY token contract, discussed in the
next section.
### Token contract
Heystack creates a native fungible token for use in the application. When a user authenticates with Heystack, they're
automatically eligible to claim 100 $HEY tokens to allow them to start messaging.
[SIP-010][] defines the fungible token standard on Stacks, which allows Stacks-compatible wallets to handle fungible tokens through a set of standardized methods. SIP-010 defines seven functions that a fungible token contract must implement in order to be compliant:
- `transfer`: method for transferring the token from one principal to another
- `get-name`: returns the human-readable name of the token
- `get-symbol`: returns the ticker symbol of the token
- `get-decimals`: returns the number of decimal places in the token
- `get-balance`: returns the balance of a given principal
- `get-total-supply`: returns the total supply of the token
- `get-token-uri`: returns an optional string that resolves to a valid URI for the token's metadata.
In Clarity, a contract can declare that it intends to implement a set of standard traits.
```clarity
;; Implement the `ft-trait` trait defined in the `ft-trait` contract
;; https://github.com/hstove/stacks-fungible-token
(impl-trait 'ST3J2GVMMM2R07ZFBJDWTYEYAR8FZH5WKDTFJ9AHA.ft-trait.sip-010-trait)
```
The [`impl-trait`][] function asserts that the smart contract is fully implementing a given set of traits defined by the
argument. Like variable definitions, `impl-trait` must be declared at the top of a smart contract definition.
-> The contract address for the SIP-010 trait definition differs depending on which network (mainnet, testnet, etc.) your contract is deployed on. See the standard for the current addresses of the standard traits.
The `hey-token.clar` contract implements the seven functions required by [SIP-010][], plus one additional method, `gift-tokens`, which allows a principal to request tokens from the contract.
```clarity
(define-public (gift-tokens (recipient principal))
(begin
(asserts! (is-eq tx-sender recipient) (err u0))
(ft-mint? hey-token u1 recipient)
)
)
```
## Authentication
Authentication is handled through the [`@stacks/connect-react`][] and [`@stacks/auth`][] packages, which respectively interact with compatible Stacks wallet extensions and provide methods for working with a user session. [Jotai][] provides application state management.
The [connect wallet button component][] implements the interface with the Stacks web wallet through the
[`@stacks/connect-react`][] package.
```tsx
import { Button } from '@components/button';
import React from 'react';
import { useConnect } from '@stacks/connect-react';
import { ButtonProps } from '@stacks/ui';
import { useLoading } from '@hooks/use-loading';
import { LOADING_KEYS } from '@store/ui';
export const ConnectWalletButton: React.FC<ButtonProps> = props => {
const { doOpenAuth } = useConnect();
const { isLoading, setIsLoading } = useLoading(LOADING_KEYS.AUTH);
return (
<Button
isLoading={isLoading}
onClick={() => {
void setIsLoading(true);
doOpenAuth();
}}
{...props}
>
Connect wallet
</Button>
);
};
```
Once connected, [`/src/store/auth.ts`][] populates the user session data into the Jotai store, allowing the application
to access the user information.
You can see in the [welcome panel component][] how the presence or absence of stored user data is used to display the
wallet connect button or the signed in view.
```tsx
...
const UserSection = memo((props: StackProps) => {
const { user } = useUser();
return (
<Stack
alignItems="center"
justifyContent="center"
flexGrow={1}
spacing="loose"
textAlign="center"
{...props}
>
{!user ? <SignedOutView /> : <SignedInView onClick={() => console.log('click')} />}
</Stack>
);
});
...
```
### Token faucet
The `use-claim-hey.ts` file provides a React hook for interacting with the token faucet of the Clarity smart contract.
```ts
import { useLoading } from '@hooks/use-loading';
import { LOADING_KEYS } from '@store/ui';
import { useConnect } from '@stacks/connect-react';
import { useNetwork } from '@hooks/use-network';
import { useCallback } from 'react';
import { useHeyContract } from '@hooks/use-hey-contract';
import { REQUEST_FUNCTION } from '@common/constants';
import { principalCV } from '@stacks/transactions/dist/clarity/types/principalCV';
import { useCurrentAddress } from '@hooks/use-current-address';
export function useHandleClaimHey() {
const address = useCurrentAddress();
const { setIsLoading } = useLoading(LOADING_KEYS.CLAIM_HEY);
const { doContractCall } = useConnect();
const [contractAddress, contractName] = useHeyContract();
const network = useNetwork();
const onFinish = useCallback(() => {
void setIsLoading(false);
}, [setIsLoading]);
const onCancel = useCallback(() => {
void setIsLoading(false);
}, [setIsLoading]);
return useCallback(() => {
void setIsLoading(true);
void doContractCall({
contractAddress,
contractName,
functionName: REQUEST_FUNCTION,
functionArgs: [principalCV(address)],
onFinish,
onCancel,
network,
stxAddress: address,
});
}, [setIsLoading, onFinish, network, onCancel, address, doContractCall]);
}
```
The [`@stacks/connect-react`][] package exports the `doContractCall` method, which interacts with the smart contract on the blockchain. There are more examples of transaction calls in the next section. Note that Javascript values must be converted to Clarity types using the helpers exported by the [`@stacks/transactions`][] package. Further discussion of this conversion is in the [Clarity types in Javascript][] section.
## Transactions
Since messages in Heystack are transactions against a Clarity smart contract, the application must be able to create
transactions and read their content from the blockchain. The following sections highlight code snippets that perform
Clarity transactions and read both completed and pending transactions from the Stacks blockchain.
### Issuing transactions
The two primary functions of the `hey.clar` smart contract are publishing a message and accepting a like on an already published message. The [`src/common/hooks/use-publish-hey.ts`][] file implements the frontend method for calling the smart contract on the blockchain with the appropriate values.
```ts
...
return useCallback(
(content: string, _onFinish: () => void) => {
void setShowPendingOverlay(true);
void setIsLoading(true);
void doContractCall({
contractAddress,
contractName,
functionName: MESSAGE_FUNCTION,
functionArgs: [
stringUtf8CV(content),
attachmentUri !== '' ? someCV(stringUtf8CV(attachmentUri)) : noneCV(),
],
onFinish: () => {
_onFinish();
onFinish();
},
postConditions: [
createFungiblePostCondition(
address,
FungibleConditionCode.Equal,
new BN(1),
createAssetInfo(contractAddress, 'hey-token', 'hey-token')
),
],
onCancel,
network,
stxAddress: address,
});
},
[setIsLoading, onFinish, network, onCancel, address, doContractCall]
);
...
```
The frontend uses the `doContractCall` function from the [`@stacks/connect-react`][] package to perform the call to the
Clarity smart contract. In order to support the mapping of [Javascript types to Clarity types][], helpers exported from
the [`@stacks/transactions`][] package are used as arguments to the contract call.
Note that the contract call also creates a post condition to verify that exactly one $HEY token is transferred by the execution of the contract call. Post conditions are a powerful feature of Stacks transactions that can be used to prevent rug pulls and other detrimental behavior by smart contracts.
### Reading transactions
Heystack achieves pseudo-real-time messaging by reading both confirmed and pending transactions from the blockchain.
Pending transactions are read from the mempool, whereas confirmed transactions are read directly from the chain. The
[`src/store/hey.ts`][] file contains the implementation of both.
```ts
...
export const heyTransactionsAtom = atomWithQuery<ContractCallTransaction[], string>(get => ({
queryKey: ['hey-txs'],
...(defaultOptions as any),
refetchInterval: 500,
queryFn: async (): Promise<ContractCallTransaction[]> => {
const client = get(accountsClientAtom);
const txClient = get(transactionsClientAtom);
const txs = await client.getAccountTransactions({
limit: 50,
principal: HEY_CONTRACT,
});
const txids = (txs as TransactionResults).results
.filter(
tx =>
tx.tx_type === 'contract_call' &&
tx.contract_call.function_name === MESSAGE_FUNCTION &&
tx.tx_status === 'success'
)
.map(tx => tx.tx_id);
const final = await Promise.all(txids.map(async txId => txClient.getTransactionById({ txId })));
return final as ContractCallTransaction[];
},
}));
...
```
The `getAccountTransactions` method from the `AccountsApi` object exported by [`@stacks/blockchain-api-client`][] is used to read confirmed transactions against the `hey.clar` contract from the Stacks API. The list of transactions returned by the API is filtered down to successful calls to the message function, and then mapped to an array of transaction IDs.
Finally, the array of IDs is used to read each full transaction from the blockchain using the `getTransactionById` method from the `TransactionsApi` object exported by the [`@stacks/blockchain-api-client`][] package.
Pending transactions are read from the mempool in a similar implementation.
```ts
export const pendingTxsAtom = atomWithQuery<Heystack[], string>(get => ({
queryKey: ['hey-pending-txs'],
refetchInterval: 1000,
...(defaultOptions as any),
queryFn: async (): Promise<Heystack[]> => {
const client = get(transactionsClientAtom);
const txs = await client.getMempoolTransactionList({ limit: 96 });
const heyTxs = (txs as MempoolTransactionListResponse).results
.filter(
tx =>
tx.tx_type === 'contract_call' &&
tx.contract_call.contract_id === HEY_CONTRACT &&
tx.contract_call.function_name === MESSAGE_FUNCTION &&
tx.tx_status === 'pending'
)
.map(tx => tx.tx_id);
const final = await Promise.all(heyTxs.map(async txId => client.getTransactionById({ txId })));
return (
(final as ContractCallTransaction[]).map(tx => {
const attachment = tx.contract_call.function_args?.[1].repr
.replace(`(some u"`, '')
.slice(0, -1);
return {
sender: tx.sender_address,
content: tx.contract_call.function_args?.[0].repr
.replace(`u"`, '')
.slice(0, -1) as string,
id: tx.tx_id,
attachment: attachment === 'non' ? undefined : attachment,
timestamp: (tx as any).receipt_time,
isPending: true,
};
}) || []
);
},
}));
```
Pending transactions are read from the mempool using the `getMempoolTransactionList` method from the `TransactionsApi`
exported by [`@stacks/blockchain-api-client`][]. Similar to confirmed transactions, the returned array is filtered to
a list of IDs, and then used to generate an array of full transactions.
Because of differences in the data structure of the pending transactions vs. confirmed transactions, the pending
transaction list must be standardized before being returned.
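That normalization can be sketched as a hypothetical helper (the store inlines equivalent string handling instead; note that replacing and slicing the repr `none` is what yields the `'non'` comparison in the code above):

```typescript
// Hypothetical helper: normalize the `repr` of an optional utf8 Clarity
// argument, e.g. `(some u"hello")` or `none`, into a string or undefined.
function parseOptionalUtf8Repr(repr: string): string | undefined {
  if (repr === 'none') return undefined;
  const match = repr.match(/^\(some u"([\s\S]*)"\)$/);
  return match ? match[1] : undefined;
}

parseOptionalUtf8Repr('none'); // → undefined
parseOptionalUtf8Repr('(some u"hello")'); // → "hello"
```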
Note that for the low stakes of a messaging app, pending transactions can be treated as likely-permanent state transitions. For applications implementing higher-stakes business logic (such as the transfer of representations of value), it would be more appropriate to display only confirmed transactions.
### Clarity types in Javascript
In order to create transactions to call functions in Clarity contracts, the [`@stacks/transactions`][] package exports
classes that make it easy to construct well-typed Clarity values in Javascript. According to the Clarity language
specification, Clarity has the following types:
- `(tuple (key-name-0 key-type-0) (key-name-1 key-type-1) ...)` - a typed tuple with named fields
- `(list max-len entry-type)` - a list of maximum length `max-len`, with entries of type `entry-type`
- `(response ok-type err-type)` - object used by public functions to commit their state changes or abort
- `(optional some-type)` - an option type for objects that can be either `(some-value)` or `none`
- `(buff max-len)` - byte buffer of maximum length
- `principal` - object representing a principal address (contract or standard)
- `bool` - boolean value (`true` or `false`)
- `int` - signed 128-bit integer
- `uint` - unsigned 128-bit integer
To support these types in Javascript, [`@stacks/transactions`][] exports the following helpers:
```ts
// construct boolean clarity values
const t = trueCV();
const f = falseCV();
// construct optional clarity values
const nothing = noneCV();
const something = someCV(t);
// construct a buffer clarity value from an existing Buffer
const buffer = Buffer.from('foo');
const bufCV = bufferCV(buffer);
// construct signed and unsigned integer clarity values
const i = intCV(-10);
const u = uintCV(10);
// construct principal clarity values
const address = 'SP2JXKMSH007NPYAQHKJPQMAQYAD90NQGTVJVQ02B';
const contractName = 'contract-name';
const spCV = standardPrincipalCV(address);
const cpCV = contractPrincipalCV(address, contractName);
// construct response clarity values
const errCV = responseErrorCV(trueCV());
const okCV = responseOkCV(falseCV());
// construct tuple clarity values
const tupCV = tupleCV({
a: intCV(1),
b: trueCV(),
c: falseCV(),
});
// construct list clarity values
const l = listCV([trueCV(), falseCV()]);
```
You should use these helpers when calling Clarity contracts from Javascript to avoid failed contract calls due to badly typed arguments.
## Reading BNS names
An important feature of Stacks is the [Blockchain Naming System][] (BNS). BNS allows users to register a human-readable identity to their account that can act as both a username and a web address.
Names registered to a user can be read from a Stacks API endpoint, as demonstrated in [`src/store/names.ts`][].
-> Due to ecosystem limitations, it's currently uncommon for BNS names to be registered on any testnet. For the purpose
of demonstration, Heystack looks for BNS names against the user's mainnet wallet address.
```ts
export const namesAtom = atomFamily((address: string) =>
atom(async get => {
if (!address || address === '') return;
const network = get(mainnetNetworkAtom);
if (!network) return null;
const local = getLocalNames(network.coreApiUrl, address);
if (local) {
const [names, timestamp] = local;
const now = Date.now();
const isStale = now - timestamp > STALE_TIME;
if (!isStale) return names;
}
try {
const names = await fetchNamesByAddress({
networkUrl: network.coreApiUrl,
address,
});
if (names?.length) {
setLocalNames(network.coreApiUrl, address, [names, Date.now()]);
}
return names || [];
} catch (e) {
console.error(e);
return [];
}
})
);
```
In order to reduce network traffic, Heystack also caches names in the browser's local storage.
A common design pattern in Stacks 2.0 apps is to check if a user has a registered BNS name (only one name can be tied to an account) and display that name in the app where appropriate. If the user doesn't own a BNS name, the wallet address is used as a stand-in. Often, the wallet address is truncated to avoid displaying an overly long string.
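Middle truncation can be sketched as a small helper (hypothetical implementation; the app's actual `truncateMiddle` may use a different signature and defaults):

```typescript
// Hypothetical sketch of a middle-truncation helper for long addresses.
function truncateMiddle(input: string, visible = 4, separator = '…'): string {
  // Leave short strings untouched; truncating them would lose information.
  if (input.length <= visible * 2 + separator.length) return input;
  return input.slice(0, visible) + separator + input.slice(-visible);
}

truncateMiddle('SP2JXKMSH007NPYAQHKJPQMAQYAD90NQGTVJVQ02B');
// → "SP2J…Q02B"
```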
The account name component in [`src/components/user-area.tsx`][] demonstrates this design pattern:
```tsx
...
const AccountNameComponent = memo(() => {
const { user } = useUser();
const address = useCurrentMainnetAddress();
const names = useAccountNames(address);
const name = names?.[0];
return <Text mb="tight">{name || user?.username || truncateMiddle(address)}</Text>;
});
...
```
## Development walkthrough video
If you would like to learn more about the Heystack application and how it was developed, the following video presents
specific implementation details.
<br /><iframe width="560" height="315" src="https://www.youtube.com/embed/e-IfT5CI-Gw" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
[heystack]: https://heystack.xyz
[stacks.js]: https://github.com/blockstack/stacks.js
[stacks web wallet]: https://www.hiro.so/wallet/install-web
[react]: https://reactjs.org/
[heystack_gh]: https://github.com/blockstack/heystack
[data maps]: /references/language-functions#define-map
[`default-to`]: /references/language-functions#default-to
[`asserts!`]: /references/language-functions#asserts
[`unwrap-panic`]: /references/language-functions#unwrap-panic
[`unwrap!`]: /references/language-functions#unwrap
[`map-set`]: /references/language-functions#map-set
[sip-010]: https://github.com/hstove/sips/blob/feat/sip-10-ft/sips/sip-010/sip-010-fungible-token-standard.md
[`impl-trait`]: /references/language-functions#impl-trait
[`@stacks/connect-react`]: https://github.com/blockstack/connect#readme
[`@stacks/auth`]: https://github.com/blockstack/stacks.js/tree/master/packages/auth
[jotai]: https://github.com/pmndrs/jotai
[connect wallet button component]: https://github.com/blockstack/heystack/blob/main/src/components/connect-wallet-button.tsx
[welcome panel component]: https://github.com/blockstack/heystack/blob/63ce30f4f6de7a9c846fcdba3acbb6c7b82b83e3/src/components/welcome-panel.tsx#L102
[`/src/store/auth.ts`]: https://github.com/blockstack/heystack/blob/main/src/store/auth.ts
[clarity types in javascript]: /build-apps/examples/heystack#clarity-types-in-javascript
[`@stacks/transactions`]: https://github.com/blockstack/stacks.js/tree/master/packages/transactions#constructing-clarity-values
[blockchain naming system]: /build-apps/references/bns
[`src/store/names.ts`]: https://github.com/blockstack/heystack/blob/main/src/store/names.ts
[javascript types to clarity types]: /build-apps/examples/heystack#clarity-types-in-javascript
[`@stacks/blockchain-api-client`]: https://github.com/blockstack/stacks-blockchain-api/tree/master/client
[`src/common/hooks/use-publish-hey.ts`]: https://github.com/blockstack/heystack/blob/main/src/common/hooks/use-publish-hey.ts
[`src/store/hey.ts`]: https://github.com/blockstack/heystack/blob/main/src/store/hey.ts
[`src/components/user-area.tsx`]: https://github.com/blockstack/heystack/blob/22e4e9020f8bbb404e8c1e36f32f000050f90818/src/components/user-area.tsx#L62
[separation of concerns]: https://en.wikipedia.org/wiki/Separation_of_concerns

src/pages/build-apps/examples/indexing.md
---
title: Radiks
description: Learn how to setup Radiks with your app
icon: BlockstackIcon
duration: 1 hour
experience: intermediate
tags:
- tutorial
images:
large: /images/pages/radiks.svg
sm: /images/pages/radiks-sm.svg
---
## Introduction
Using Radiks with your application requires a Radiks server and a client application configured to use the server. In this article, you learn how to install, set up, and run a pre-packaged Radiks server that connects to MongoDB. You also learn how to establish your app as a client for that server.
## Task 1: Set up your Radiks server
Radiks-server is a `node.js` application that uses [MongoDB](https://www.mongodb.com/) as an underlying database.
### Install and configure MongoDB
In the future, Radiks-server may support other databases, but right now only MongoDB 3.6 or higher is supported; these versions contain fixes required for the naming patterns Radiks uses in keys.
-> The steps assume you want to install and run the MongoDB software locally on your workstation for testing and
development. If you are deploying for a production application, you will install MongoDB on your application
server or on a server connected to it.
#### Step 1: [Download and install MongoDB 3.6 or higher](https://docs.mongodb.com/manual/administration/install-community/) on your workstation.
You can also install MongoDB using your favorite package manager; for example, Homebrew is recommended for macOS.
If you are testing on a local workstation, you can use a `docker` image instead of installing locally.
#### Step 2: Start the MongoDB service and verify it is running.
#### Step 3: On your MongoDB instance, create a database for your application data.
You can use the [Mongo shell](https://docs.mongodb.com/manual/mongo/) to do this, or you can [install the MongoDB Compass software](https://www.mongodb.com/download-center/compass) to explore and work with MongoDB data.
#### Step 4: Create a username/password combination with `root` privileges on your new database.
### Install and start the Radiks server
The easiest way to run `radiks-server` is to use the pre-packaged `node.js` server.
#### Step 1: Install the `radiks-server` on a workstation or server.
```bash
npm install -g radiks-server
```
Or, if you prefer `yarn`:
```bash
yarn global add radiks-server
```
The default port for MongoDB is `27017`; your instance may be configured differently. By default, Radiks-server uses `'mongodb://localhost:27017/radiks-server'` as the `MONGODB_URI` value. This is suitable for local testing, but in production you'll want to change the hostname and possibly the database name.
#### Step 2: Start the `radiks-server` in the command line to confirm your installation.
```bash
radiks-server
```
```bash
(node:37750) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option `useUnifiedTopology: true` to the MongoClient constructor.
radiks-server is ready on http://localhost:1260
```
The `radiks-server` defaults to running on port `1260`. To change the default port, specify the `PORT` environment variable in your environment.
-> By default, the server is running at `http://localhost:1260`
#### Step 3: Stop the `radiks` server
Once you confirm it runs and your installation was a success, exit the `radiks-server` process.
## Task 2: Set up your application
You must set up your application to use Radiks. This requires installing the `radiks` client package and then configuring your application to connect to your Radiks server.
### Install the radiks client software
If you are using `blockstack.js` version 18 or earlier, use Radiks version 0.1.x; if you're using `blockstack.js` version 19 or higher, use Radiks 0.2.x.
1. Change directory to the root of your application code.
2. Install the `radiks` client package.
```bash
npm install radiks
```
Or, if you prefer `yarn`:
```bash
yarn add radiks
```
### Configure the MongoDB for your application
#### Step 1: Start the mongo shell application.
```bash
mongo
```
```bash
MongoDB shell version v4.2.0
connecting to: mongodb://127.0.0.1:27017/?compressors=disabled&gssapiServiceName=mongodb
Implicit session: session { "id" : UUID("8d43cf80-490d-4cac-8bd6-40eec5c128de") }
MongoDB server version: 4.2.0
....
To enable free monitoring, run the following command: db.enableFreeMonitoring()
To permanently disable this reminder, run the following command: db.disableFreeMonitoring()
>
```
#### Step 2: Create a new database for your application.
```bash
> show dbs
admin 0.000GB
config 0.000GB
local 0.000GB
> use test1
switched to db test1
```
#### Step 3: Add a user with administrative rights to the database.
```bash
> db.createUser({user: "admin", pwd: "foobar1", roles: ["readWrite", "dbAdmin"]});
Successfully added user: { "user" : "admin", "roles" : [ "readWrite", "dbAdmin" ] }
```
#### Step 4: Create a `MONGODB_URI` environment variable on the same machine where you are running the `radiks-server`.
Use the `mongodb://username:password@host:port/db_name` format for your variable. For example, to set this variable in a `bash` shell:
```bash
export MONGODB_URI="mongodb://admin:foobar1@localhost:27017/test1"
```
## Task 3: Add startup code and build your application
To set up `radiks`, you only need to configure the URL that your Radiks-server instance is running on. If you're using the pre-built Radiks server, this will be `http://localhost:1260`. If you're in production or are using a custom Radiks server, you'll need to specify the exact URL where it's available.
Radiks is also compatible with version 19 of blockstack.js, which requires you to configure a `UserSession` object to handle all user-data-related methods. You'll need to define this and pass it to your Radiks configuration so that Radiks knows how to fetch information about the currently logged-in user.
### Configure your application to use your `radiks-server`.
To configure your application as a `radiks` client, do the following:
#### Step 1: Configure your application so that its `UserSession` allows the app to both write and publish data:
```jsx
import { UserSession, AppConfig } from 'blockstack';
import { configure } from 'radiks';

const userSession = new UserSession({
  appConfig: new AppConfig(['store_write', 'publish_data']),
});

configure({
  apiServer: 'http://localhost:1260',
  userSession,
});
```
#### Step 2: Add authentication to your application
After your user logs in with Stacks Auth, you'll have some code to save the user's data in your application's `localStorage`. You'll want to use the same `UserSession` you configured with Radiks, which can be fetched from the `getConfig` method.
```jsx
import { User, getConfig } from 'radiks';

const handleSignIn = async () => {
  const { userSession } = getConfig();
  if (userSession.isSignInPending()) {
    await userSession.handlePendingSignIn();
    await User.createWithCurrentUser();
  }
};
```
Calling `User.createWithCurrentUser` does the following:
- Fetch user data that Blockstack.js stores in `localStorage`
- Save the user's public data (including their public key) in Radiks-server
- Find or create a signing key that is used to authorize writes on behalf of this user
- Cache the user's signing key (and any group-related signing keys) to make signatures and decryption happen quickly later on
### Build and run your application
After you have added Radiks to your application, build and run the application. Test the application by logging in with your Stacks ID. Create some data using the application. If you inspect the MongoDB database, you should see the encrypted data stored in the database.
You can specify the `mongoDBUrl` or the `maxLimit` option when initiating the Radiks server in your application.
```jsx
const { setup } = require('radiks-server');
setup({
...myOptions,
});
```
The `mongoDBUrl` option is the MongoDB URL for the Radiks server.
The `maxLimit` option is the maximum `limit` field used inside the Mongo queries; the default is 1000.
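For instance, both options can be passed together. This is a minimal sketch with placeholder values; the connection string and limit shown here are examples, not required values:

```javascript
// Example options object for radiks-server's setup() (values are placeholders)
const myOptions = {
  // MongoDB connection string for the Radiks server
  mongoDBUrl: 'mongodb://admin:foobar1@localhost:27017/test1',
  // cap on the `limit` field used inside Mongo queries (default: 1000)
  maxLimit: 500,
};

// const { setup } = require('radiks-server');
// setup(myOptions);
```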
## Where to go next
Creating models for your application's data is where Radiks truly becomes helpful. To learn how to use models, see the [Create and use models](/build-apps/indexing/models) section.

src/pages/build-apps/examples/public-registry.md
---
title: Public registry app
description: Learn how to write and read state from the Stacks blockchain
duration: 60 minutes
experience: intermediate
tags:
- tutorial
images:
large: /images/registry.svg
sm: /images/registry.svg
---
## Introduction
The [Stacks Blockchain API](/understand-stacks/stacks-blockchain-api) helps app developers view and use the state of the Stacks blockchain.
In this tutorial you will extend [the to-dos app](/build-apps/tutorials/todos) to share individual lists publicly using the Stacks blockchain.
The registry of shared to-dos lists is implemented by a Clarity smart contract named [`todo-registry`](https://github.com/friedger/blockstack-todos/blob/tut/public-registry/contracts/todo-registry.clar). Data from this contract will be shown in the to-dos app.
The final app will look like this:
![What you'll be creating in this tutorial](/images/todos-public-registry.png)
By the end of this tutorial, you will have:
- Observed transactions while they are added to the blockchain
- Consumed APIs to show recent transactions, data map entries and read-only functions
- Experienced serializing and deserializing Clarity values
## Prerequisites
### Check testnet status
To make sure you're not running into any challenges related to our network, please open up the [Status Checker](https://stacks-status.com/)
and confirm that all systems are operational. If some systems seem to have issues, it is best to wait until they are back up before you proceed with the next steps.
Furthermore, the to-dos app will interact with a smart contract deployed as `ST1234....todo-registry`. The contract source code is available at [GitHub](https://github.com/friedger/blockstack-todos/blob/tut/step1/contracts/todo-registry.clar).
There may already be a deployed version available on the testnet; the [Stacks Explorer](https://explorer.stacks.co/) can be used to search for it.
Alternatively, the contract can be deployed as described in the [hello world tutorial](/smart-contracts/hello-world-tutorial#step-5-deploy-the-contract). Then you have to use the corresponding contract address and name in this tutorial. Throughout this tutorial, we use `ST3YPJ6BBCZCMH71TV8BK50YC6QJTWEGCNDFWEQ15.todo-registry` as an example.
### Tutorials
You should have followed the instructions of the to-dos app tutorial. You should have the code ready on your local machine. It is also helpful to have a basic understanding of Clarity as explained in the counter tutorial. If you are using mocknet or a new, empty testnet you can create transactions following the tutorial about signing transactions.
[@page-reference | grid]
| , /build-apps/guides/transaction-signing, /build-apps/tutorials/todos, /write-smart-contracts/counter-tutorial
### Check your to-dos app
In your code repository of the to-dos app, launch the app by running the `start` command.
```bash
npm run start
```
In your browser, you should see the to-dos app.
![What the to-dos app looks like so far](/images/todos/landing.png)
## Registering a public URL
The Connect library (already used for authentication in the to-dos app) also provides methods to create, sign, and broadcast transactions to the Stacks blockchain, as explained in the transaction signing tutorial.
### Step 1: Define contract
The contract has already been written and deployed to the blockchain. Its address and name are used often in this tutorial, so define some constants in the `constants.js` file by adding the following two lines:
```js
// src/assets/constants.jsx
export const CONTRACT_ADDRESS = 'ST3YPJ6BBCZCMH71TV8BK50YC6QJTWEGCNDFWEQ15';
export const CONTRACT_NAME = 'todo-registry';
```
### Step 2: Create registration component
The user needs a UI component to conveniently create `register` transactions on the chain.
The contract function `register` takes two arguments:
```clarity
(define-public (register (name (buff 30)) (url (buff 255))) ...)
```
Create a new file named `PublicUrlRegistrar.jsx` in the `src/components` folder and add the `PublicUrlRegistrar` component:
```js
// src/components/PublicUrlRegistrar.jsx
import React from 'react';
import { Text } from '@blockstack/ui';
import { useConnect } from '@stacks/connect-react';
import { bufferCVFromString } from '@stacks/transactions';
import { CONTRACT_ADDRESS, CONTRACT_NAME } from '../assets/constants';
export const PublicUrlRegistrar = ({ userSession }) => {
const { doContractCall } = useConnect();
const { username } = userSession.loadUserData();
const url = `${document.location.origin}/todos/${username}`;
const register = () => {
// do the contract call
doContractCall({
contractAddress: CONTRACT_ADDRESS,
contractName: CONTRACT_NAME,
functionName: 'register',
functionArgs: [bufferCVFromString(username), bufferCVFromString(url)],
finished: data => {
console.log({ data });
},
});
};
return (
<>
<Text
color="blue"
cursor="pointer"
fontSize={1}
fontWeight="500"
onClick={() => {
// register the public URL
register();
}}
>
Register on-chain
</Text>
</>
);
};
```
It is a simple button that calls the `doContractCall` method of the Connect library when clicked. The method makes an API call to the Stacks authenticator. The authenticator creates a contract call transaction that is signed by the user and then broadcast to the Stacks blockchain, as explained in the [transaction signing tutorial](/build-apps/guides/transaction-signing).
Note how the arguments are created using `bufferCVFromString`. There are similar methods for all other Clarity types, like `uintCV` or `trueCV`. See the [documentation](https://github.com/blockstack/stacks.js/tree/master/packages/transactions#constructing-clarity-values) of the stacks-transactions library for more details.
### Step 3: Integrate the component in the app
To use the `PublicUrlRegistrar` component, open `Sharer.jsx` and add the following lines after the `Copy Link` text component of the `Sharer` component:
```js
// src/components/Sharer.jsx
<PublicUrlRegistrar userSession={userSession} />
```
Now, you should be able to register your public to-dos list on the blockchain when you click on "Register on-chain."
![How to register the public to-dos list](/images/todos-register-on-chain.png)
## Waiting for transactions
The method `doContractCall` has a `finished` callback that is called after the user confirms the transaction. This does not mean that the blockchain has accepted and included the transaction; it only means that the transaction was broadcast to the network. The transaction id is returned in the `finished` callback as `data.txId`. This id can be used to find the transaction and its processing status on the blockchain. The [Stacks Blockchain API client library](https://blockstack.github.io/stacks-blockchain-api/client/index.html) provides a convenient method to subscribe to the progress using web sockets.
### Step 1: Add dependency
Add the Stacks Blockchain API client library to `package.json` in the root folder of the to-dos list app:
```bash
npm add @stacks/blockchain-api-client
```
### Step 2: Store the transaction ID
Create a react state variable in the `PublicUrlRegistrar` component that holds the transaction id.
```js
// src/components/PublicUrlRegistrar.jsx
const [txId, setTxId] = useState();
```
and set the value in the `finished` callback
```js
// src/components/PublicUrlRegistrar.jsx
finished: data => {
console.log(data);
setTxId(data.txId);
},
```
### Step 3: Connect to web socket
Add an import for `connectWebSocketClient`:
```js
import { connectWebSocketClient } from '@stacks/blockchain-api-client';
```
Then subscribe to updates of the transaction status by creating a web socket client using `connectWebSocketClient`. Add a call to `client.subscribeTxUpdates` with the transaction id `txId` and a callback function. This callback function is called whenever the transaction status changes. The subscription needs to be set up only once; therefore, add it to an effect hook that depends only on `txId`.
```js
useEffect(() => {
  let sub;
  const subscribe = async txId => {
    const client = await connectWebSocketClient('ws://stacks-node-api.blockstack.org/');
    sub = await client.subscribeTxUpdates(txId, update => {
      console.log(update);
    });
    console.log({ client, sub });
  };
  // only subscribe once a transaction id is available
  if (txId) {
    subscribe(txId);
  }
}, [txId]);
```
You will see update logs in the console. The received object is a transaction status object and has a `tx_status` property. If the status is `success` the transaction was processed and added to the blockchain.
## Reading the registration details
Now that the transaction has been processed successfully, you can read information about it; in particular, the registry id that was returned by the transaction. The id (`registry-id`) is an unsigned integer.
```clarity
(define-public (register ...)
...
(ok registry-id)
)
```
This information should be shown in a new `Transaction` component using the `TransactionsApi` object provided by the client library.
### Step 1: Create a component representing the register transaction
Create a new file `Transaction.jsx` in folder `src/components` and add the following lines:
```jsx
import React, { useCallback, useEffect, useState } from 'react';
import { Text } from '@blockstack/ui';
import { TransactionsApi } from '@stacks/blockchain-api-client';
export const Transaction = ({ txId }) => {
const [transactionDetails, setTransactionDetails] = useState();
const fetchTransactionDetails = useCallback(async () => {
// fetch transaction from api
}, [txId]);
useEffect(() => {
void fetchTransactionDetails();
}, [fetchTransactionDetails]);
return transactionDetails ? (
<Text fontWeight="500" display="block" mb={0} fontSize={2}>
Registration: TODO
</Text>
) : null;
};
```
### Step 2: Use TransactionsApi
Information about transactions can be retrieved using the `TransactionsApi` object. Add a definition to the top of the `Transaction.jsx` file:
```js
const transactionsApi = new TransactionsApi();
```
-> Note: The constructor takes a configuration argument that can be used to set the server URL. By default, it is the URL of the node hosted by Hiro PBC.
Then in the `fetchTransactionDetails` method add a call to `getTransactionById`. The result is detailed data about the transaction: the status, when it was created, the type, the contract call details, the transaction result and many more.
```js
// fetch transaction from api
if (txId) {
const txDetails = await transactionsApi.getTransactionById({ txId });
setTransactionDetails(txDetails);
}
```
### Step 3: Display transaction details
The text of the transaction details can now be updated with real data. Replace the "TODO" with some result data. For now, let's just show the JSON. Handling Clarity values properly is shown in the next section.
```js
<Text fontWeight="500" display="block" mb={0} fontSize={2}>
Registration: Result {JSON.stringify(transactionDetails.tx_result)}
</Text>
```
### Step 4: Use transaction component
To bring the pieces together, add the `Transaction` component to the app. First, extend the `PublicUrlRegistrar` component so that it can hand over the transaction id once the register transaction has been successfully processed.
Add a react state variable `success` after the `txId` state variable:
```js
// src/components/PublicUrlRegistrar.jsx
const [success, setSuccess] = useState();
```
and set it to `true` in the update callback if the transaction update status is `success`:
```js
sub = await client.subscribeTxUpdates(txId, update => {
console.log(update);
setSuccess(update.tx_status === 'success');
});
```
Finally, add the `Transaction` component at the end of the `PublicUrlRegistrar` component if the transaction was successfully processed.
```js
{success && <Transaction txId={txId} />}
```
You should now be able to see an update in the UI if the transaction was successfully added to the blockchain. In real apps, the progress status could be indicated by colors or other UI elements.
## Show recent activities
Similar to the `TransactionsApi`, the `AccountsApi` provides easy access to account-related information. This API is used in this section to show recent activities for the to-dos list registry.
### Step 1: Create recent activities component
Create a new file `RecentActivities.jsx` in folder `src/components` and add the following lines:
```js
import React, { useCallback, useEffect, useState } from 'react';
import { Flex, Box, Text } from '@blockstack/ui';
import { AccountsApi } from '@stacks/blockchain-api-client';
import { CONTRACT_ADDRESS, CONTRACT_NAME } from '../assets/constants';
const accountsApi = new AccountsApi();
export const RecentActivities = () => {
const [activities, setActivities] = useState();
const fetchActivities = useCallback(async () => {
// fetch activities
}, []);
useEffect(() => {
void fetchActivities();
}, [fetchActivities]);
console.log({ activities });
return activities && activities.length > 0 ? (
<Flex
display="block"
position="absolute"
bottom="0"
width="100%"
justifyContent="space-between"
px={4}
py={3}
>
<Box px={3} background="#efefef">
<Text fontWeight="500" display="block" mb={0} fontSize={3}>
Public Todos List Registry
</Text>
<Text fontWeight="500" display="block" mb={0} fontSize={2}>
Recent Activities:{' '}
{activities.map((activity, key) => (
  <Text key={key}>{JSON.stringify(activity)}</Text>
))}
</Text>
</Box>
</Flex>
) : null;
};
```
Next, add the following line to `App.jsx` after the `SignIn` and `TodoList` fragment:
```js
<div className="site-wrapper-inner">
...
<RecentActivities />
</div>
```
### Step 2: Use AccountsApi
Now, add a call to the `getAccountTransactions` method. This method can be used for both users and contracts. It returns transactions related to the given account; by default, the last 20 transactions.
```js
const fetchActivities = useCallback(async () => {
// fetch activities
const response = await accountsApi.getAccountTransactions({
principal: `${CONTRACT_ADDRESS}.${CONTRACT_NAME}`,
});
console.log(response);
}, []);
```
-> For users, the principal argument is just the user's Stacks address.
### Step 3: Filter for successful contract calls
Contract calls can fail with an error, for example if the public URL is too long. Filter the transactions for successful contract calls:
```js
const contractCallsOnly = r => {
console.log(r);
return r.tx_status === 'success' && r.tx_type === 'contract_call';
};
```
Then apply the filter to the response result and store it in the `activities` variable.
The final callback `fetchActivities` looks like this:
```js
const fetchActivities = useCallback(async () => {
// fetch activities
const response = await accountsApi.getAccountTransactions({
principal: `${CONTRACT_ADDRESS}.${CONTRACT_NAME}`,
});
console.log(response);
setActivities(response.results.filter(contractCallsOnly));
}, []);
```
You should now see a list of hex strings representing the transactions.
![How recent activities as json look](/images/todos-recent-activities-json.png)
### Step 4: Extract registry IDs
For the last step, the returned details need to be decoded and formatted to be readable. The `register` function returns the registry id as its result. As mentioned above, it is an unsigned integer (`uint`) encoded as a hex string (`tx_result.hex`), like `0x0100000000000000000000000000000001`. The method `hexToCV` decodes the string and creates a `ClarityValue` object; in this case, an object of type `UIntCV`.
The client library defines types for all Clarity types. The corresponding type gives access to more properties. For example, `UIntCV` has a `value` property for its number value.
```js
hexToCV(tx_result.hex).value;
```
You can also use `cvToString` for a simple conversion of the `ClarityValue` object to a string
```js
cvToString(hexToCV(tx_result.hex));
```
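If you are curious what `hexToCV` is actually parsing here, the wire format for a `uint` is one type byte (`0x01`) followed by a 16-byte big-endian value. The following hand-rolled decoder is only an illustration of that format, not something the app needs; use `hexToCV` in real code:

```javascript
// Decode a hex-encoded Clarity uint by hand: 1 type byte (0x01 = uint)
// followed by a 16-byte big-endian value. Illustration only.
const decodeUintCV = hex => {
  const bytes = hex.replace(/^0x/, '');
  if (bytes.slice(0, 2) !== '01') throw new Error('not a uint ClarityValue');
  return BigInt('0x' + bytes.slice(2));
};

console.log(decodeUintCV('0x0100000000000000000000000000000001').toString()); // '1'
```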
### Step 5: Display register details
In addition to `tx_result`, the transaction object also contains a timestamp (`burn_block_time_iso`) and a `contract_call` object. The contract call object has properties like the function name (`contract_call.function_name`). Using this, the final representation for `register` transactions should contain the deserialized registry id and the timestamp. It could look like this:
```js
{
activities.map((activity, key) => {
if (activity.contract_call.function_name === 'register') {
const result = hexToCV(activity.tx_result.hex.substr(2));
return (
<React.Fragment key={key}>
Entry {result.value.toString()} was registered at {activity.burn_block_time_iso}.{' '}
</React.Fragment>
);
} else {
return null;
}
});
}
```
-> Note: The `AccountsApi` and other API objects provide parameters to page through the results using `limit` and `offset`. See [the docs](https://blockstack.github.io/stacks-blockchain-api/client/interfaces/getaccounttransactionsrequest.html) for more details.
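The paging loop itself is generic. Here is a hedged sketch in which `fetchPage` stands in for a call such as `accountsApi.getAccountTransactions({ principal, limit, offset })`; the helper and the in-memory page source are assumptions for illustration, not part of the client library:

```javascript
// Walk a paged endpoint until a page comes back shorter than `limit`.
const fetchAll = async (fetchPage, limit = 50) => {
  const all = [];
  for (let offset = 0; ; offset += limit) {
    const { results } = await fetchPage({ limit, offset });
    all.push(...results);
    if (results.length < limit) return all; // short page: no more data
  }
};

// In-memory stand-in for the API, for demonstration purposes
const data = Array.from({ length: 120 }, (_, i) => i);
const fakePage = async ({ limit, offset }) => ({
  results: data.slice(offset, offset + limit),
});

fetchAll(fakePage).then(all => console.log(all.length)); // 120
```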
=> Congratulations. You just implemented a list of recent activities that was fetched from the blockchain.
![How recent activities look like](/images/todos-recent-activities.png)
## Fetch the first to-dos list
There are two other ways to get state information from the blockchain: read-only functions and data map entries. Read-only functions were already discussed in the [Clarity counter tutorial](/write-smart-contracts/counter-tutorial). They do not require a transaction to complete. Data maps in Clarity are maps that can be read by any user. See the [Clarity reference](/references/language-functions#define-map) for more details.
The `todo-registry` contract defines a read-only function `owner-of?` that returns the owner of a registry entry and a data map for details about entries:
```clarity
(define-read-only (owner-of (registration-id uint)) ... )
(define-map registry
((registry-id uint))
(
(name (buff 30))
(url (buff 255))
)
)
```
Let's add the owner information and the details for the first ever registered to-dos list (with `registry-id` 1) to the `RecentActivities` component. The `SmartContractsApi` of the client library provides methods to read this data from the blockchain.
### Step 1: Add state variable for first registration
First, add a new state variable `firstRegistration`:
```js
// src/components/RecentActivities.jsx
const [firstRegistration, setFirstRegistration] = useState();
```
Then, define a new callback `fetchRegistration` that will contain the calls to the blockchain and update the state variable. Add the following code after the `fetchActivities` callback and update the effect hook.
```js
const fetchRegistration = useCallback(async () => {
// fetch newest registration
// fetch owner
// fetch public URL and name
}, []);
useEffect(() => {
void Promise.all([fetchActivities(), fetchRegistration()]);
}, [fetchActivities, fetchRegistration]);
```
### Step 2: Query owner of the first to-dos list
To query the read-only functions of the smart contract, a `SmartContractsApi` object needs to be created, in the same way as the `AccountsApi` object.
```js
const smartContractsApi = new SmartContractsApi();
```
Use that API object in the `fetchRegistration` callback to call the read-only function `owner-of?` like this:
```js
const fetchRegistration = useCallback(async () => {
const ownerResponse = await smartContractsApi.callReadOnlyFunction({
contractAddress: CONTRACT_ADDRESS,
contractName: CONTRACT_NAME,
functionName: 'owner-of?',
readOnlyFunctionArgs: ReadOnlyFunctionArgsFromJSON({
sender: CONTRACT_ADDRESS,
arguments: [cvToHex(uintCV(1))],
}),
});
console.log(ownerResponse);
}, []);
```
The arguments of the read-only function are provided as an array of hex encoded `ClarityValue`s.
The helper method `cvToHex` converts a `ClarityValue` into a hex encoded string. It is the reverse function of `hexToCV` that was used to decode the transaction response.
The `sender` can be any Stacks address; it is not relevant for the `owner-of?` function.
### Step 3: Deserialize the response
A read-only function call returns a response object with an `okay` property. If `okay` is `true`, the response contains a `result` property with the hex-encoded Clarity value; otherwise, the `cause` property describes the error. The helper method `hexToCV` is used again to decode the result. The result is an optional address of type `SomeCV`; it has a `value` property that contains the owner address. Using `cvToString` with `ownerCV.value` returns the address as a string:
```js
if (ownerResponse.okay) {
const ownerCV = hexToCV(ownerResponse.result);
const owner = cvToString(ownerCV.value);
}
```
### Step 4: Fetch a map entry
For the registry details, the data map `registry` of the contract can be queried to retrieve the username and the registered URL. The `SmartContractsApi.getContractDataMapEntry` method expects a hex-encoded string representing the key. The map key is always a Clarity tuple; for the `registry` map, the only value of the tuple is the `registry-id`. Therefore, the hex string is created like this:
```js
const key = cvToHex(tupleCV({ 'registry-id': uintCV(1) }));
```
With the encoded key, add the following lines to the `fetchRegistration` callback:
```js
const mapEntryResponse = await smartContractsApi.getContractDataMapEntry({
contractAddress: CONTRACT_ADDRESS,
contractName: CONTRACT_NAME,
mapName: 'registry',
key,
});
console.log({ mapEntryResponse });
```
### Step 5: Deserialize the data map entry
The response object has a `data` property that contains the hex-encoded result. If the key was invalid, the result is an object of type `OptionalNone`. Otherwise, the result is of type `OptionalSome` and its `value` property contains the map entry. The entry is always a tuple (type `TupleCV`); its `data` property contains the registered name and public URL as `ClarityValue`s.
The deserialization of the result therefore looks like this:
```js
const optionalMapEntry = hexToCV(mapEntryResponse.data);
if (optionalMapEntry.type === ClarityType.OptionalSome) {
const mapEntryCV = optionalMapEntry.value;
const registryData = mapEntryCV.data;
}
```
Add this to the `fetchRegistration` callback and then update the react state variable with all the gathered data. Instead of using `cvToString`, you can also access the buffer value directly and convert it to a string. This is useful for the `url` because `cvToString` puts the string in quotes.
```js
setFirstRegistration({
owner,
name: cvToString(registryData.name),
url: registryData.url.buffer.toString(),
});
```
### Step 6: Display registry entry
Update the UI to display the data of the first registry entry.
```jsx
{
firstRegistration && (
<>
<Text fontWeight="500" display="block" mb={0} fontSize={0}>
First registration in 'Public Todos List registry' by
</Text>
<Text fontSize={2}>
<a href={firstRegistration.url}>{firstRegistration.name}</a>{' '}
</Text>
<Text fontSize={0}>using address {firstRegistration.owner}</Text>
<br />
</>
)
}
```
=> Congratulations. You just called read-only functions and map entries from the public registry, without managing a server.
With the completion of this tutorial, you have:
- Observed the progress of transaction processing
- Consumed APIs to show recent transactions, data map entries and read-only functions
- Experienced serializing and deserializing Clarity values
The full source code and smart contract code for the public registry is available at [blockstack-todo-registry](https://github.com/friedger/blockstack-todos).

src/pages/build-apps/examples/todos.md
---
title: To-dos app
description: Review authentication and data storage integration
experience: beginner
duration: 30 minutes
tags:
- tutorial
images:
large: /images/pages/todo-app.svg
sm: /images/pages/todo-app-sm.svg
---
![What you'll be studying in this tutorial](/images/todos/home.png)
## Introduction
In this tutorial, you will learn about Stacks authentication and storage by installing,
running and reviewing the code for a "To-dos" app built with Stacks authentication and storage.
This app highlights the following platform functionality:
- Generate a _Secret Key_ with an associated BNS username to authenticate the app
- Add, edit and delete encrypted app data with Gaia
- Decrypt data on Gaia for public sharing by URL
- Deauthenticate and re-authenticate app with _Secret Key_
[Try the app](https://todos.blockstack.org) or [view its code on GitHub](https://github.com/blockstack/todos).
Existing familiarity with [React](https://reactjs.org/) is recommended for reviewing this app's code.
## Install and run the app
You must have recent versions of Git and [Node.js](https://nodejs.org/en/download/)
(v12.10.0 or greater) installed already.
### Step 1: Install the code and its dependencies
```bash
git clone https://github.com/blockstack/todos && cd todos
npm install
```
### Step 2: Run the application:
```bash
npm run start
```
You should see output similar to the following:
```bash
Compiled successfully
You can now view to-dos in the browser.
http://localhost:3000/
Note that the development build is not optimized.
To create a production build, use yarn build.
```
### Step 3: Open your local browser to [`http://localhost:3000`](http://localhost:3000) if it doesn't open automatically.
You should see the app's landing page:
!["To-dos" app landing screen](/images/todos/landing.png)
## Onboard into your first Stacks app
### Step 1: Choose **Get started** to start onboarding into the app.
The app displays a standardized introductory modal using the `@stacks/connect` library.
![Modal displayed by showConnect function](/images/todos/get-started.png)
This modal is displayed using the `authenticate` function exported by the `src/auth.js` module, which organizes all Stacks resources needed for authentication in the app:
```js
// src/auth.js
import { AppConfig, UserSession, showConnect } from '@stacks/connect';
import { Person } from '@stacks/profile';
const appConfig = new AppConfig(['store_write', 'publish_data']);
export const userSession = new UserSession({ appConfig });
export function authenticate() {
showConnect({
appDetails: {
name: 'Todos',
icon: window.location.origin + '/logo.svg',
},
redirectTo: '/',
finished: () => {
window.location.reload();
},
userSession: userSession,
});
}
export function getUserData() {
return userSession.loadUserData();
}
export function getPerson() {
return new Person(getUserData().profile);
}
```
The `authenticate` function calls the `showConnect` function imported from the `@stacks/connect` package of Stacks.js.
`showConnect` triggers the display of a modal that initiates the authentication process for users, one in which they'll authenticate with a _Secret Key_ that's used to encrypt their private data.
The `showConnect` function accepts a number of properties within a parameter object such as:
- The app's `name` and `icon`: provided as strings comprising the `appDetails` object property.
- The `redirectTo` string: used to provide a URL to which the user should be redirected upon successful authentication. The `onFinish` callback serves a similar purpose by handling successful authentication within the context of a popup window.
- The `userSession` object: used to pass the [scopes](/build-apps/guides/authentication#initiate-usersession-object) needed by the app.
Note how the `userSession` object is created at the beginning of this module by leveraging an `AppConfig` object that's first initiated with all relevant scopes.
The [`UserSession`](https://blockstack.github.io/stacks.js/classes/usersession.html) and [`AppConfig`](https://blockstack.github.io/stacks.js/classes/appconfig.html) classes are themselves imported from the `@stacks/auth` library.
In the separate `src/components/App.jsx` component, you can see how
`componentDidMount` loads the user's data into the app's state, whether upon redirect post-authentication with `userSession.handlePendingSignIn()` or upon detection of an existing session with `userSession.isUserSignedIn()`:
```jsx
// src/components/App.jsx
import { userSession } from '../auth';
...
componentDidMount() {
if (userSession.isSignInPending()) {
userSession.handlePendingSignIn().then((userData) => {
window.history.replaceState({}, document.title, "/")
this.setState({ userData: userData})
});
} else if (userSession.isUserSignedIn()) {
this.setState({ userData: userSession.loadUserData() });
}
}
```
### Step 2: Choose **Get started** to generate a _Secret Key_.
The app triggers a popup window in which [Stacks Authenticator](https://github.com/blockstack/ux/tree/master/packages/app)
loads from [`app.blockstack.org`](http://app.blockstack.org/) and begins generating a new _Secret Key_.
!["Secret Key generation" screen](/images/todos/secret-key-generation.png)
### Step 3: Choose **Copy Secret Key** to copy your _Secret Key_ to the clipboard.
The _Secret Key_ is a unique 12-word [mnemonic phrase](https://en.bitcoinwiki.org/wiki/Mnemonic_phrase) that not only empowers the user to access Stacks apps securely and independently, but is also used to encrypt all of the private data they create and manage with Stacks apps.
_Secret Keys_ are like strong passwords. However, they can never be recovered if lost or reset if stolen.
As such, it's paramount that users handle them with great care.
!["Copy Secret Key" screen](/images/todos/copy-secret-key.png)
### Step 4: Choose **I've saved it** to confirm you've secured your _Secret Key_ in a suitable place.
!["I've saved it" screen](/images/todos/saved-secret-key.png)
### Step 5: Enter a username value and choose **Continue**
The username will be used by the app to generate a URL for sharing your to-dos, should you choose to make them public.
It is registered on the Stacks blockchain with [BNS](/technology/naming-system) and associated with your _Secret Key_.
!["Choose username" screen](/images/todos/choose-username.png)
### Done: You've now completed onboarding into the app
## Add, edit and delete to-dos privately
Once you've authenticated the app, you can start adding to-dos by entering values into the "Write your to do"
field and hitting "Enter."
!["To-dos" app home screen](/images/todos/home.png)
The data for all to-dos are saved as JSON to the Gaia hub linked to your Secret Key using the [`putFile`](http://blockstack.github.io/stacks.js/classes/storage.html#putfile) method of the `storage` object in the `src/storage.js` module, which manages all data storage for the app:
```js
// src/storage.js
import { userSession } from './auth';
import { Storage } from '@stacks/storage';
const storage = new Storage({ userSession });
...
export const saveTasks = async (userSession, tasks, isPublic) => {
await storage.putFile(TASKS_FILENAME, JSON.stringify({ tasks, isPublic }), {
encrypt: !isPublic,
});
};
```
These to-dos are subsequently loaded using the [`getFile`](http://blockstack.github.io/stacks.js/globals.html#getfile)
method of the same object in the same module:
```js
// src/storage.js
import { userSession } from './auth';
import { Storage } from '@stacks/storage';
const storage = new Storage({ userSession });
...
export const fetchTasks = async (userSession, username) => {
const tasksJSON = await storage.getFile(TASKS_FILENAME, {
decrypt: false,
username: username || undefined,
});
...
};
```
The `storage` object is instantiated with the `Storage` class of the `@stacks/storage` library and `userSession` to ensure that all storage calls are made with the user's Gaia hub.
By default, the `putFile` and `getFile` methods automatically encrypt data when saved and decrypt it when retrieved,
using the user's _Secret Key_. This ensures that only the user has the ability to view this data.
When deleting a to-do, the same `putFile` method is used to save a new JSON array of to-dos that excludes the deleted to-do.
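This deletion step can be sketched as a pure helper plus a save call. The `removeTask` helper below is hypothetical (not from the app's source); only `saveTasks` comes from the `src/storage.js` module shown above:

```javascript
// Hypothetical helper: return a new array without the deleted to-do
const removeTask = (tasks, deletedId) => tasks.filter(task => task.id !== deletedId);

// Usage inside a component might look like (assumes `tasks`, `isPublic`,
// `userSession` and `saveTasks` from the modules shown above):
// const remaining = removeTask(tasks, deletedId);
// await saveTasks(userSession, remaining, isPublic);
```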
## Publish your to-dos publicly
Select "Make public" to make your to-dos accessible to the public for sharing via URL.
!["Public to-dos" screen](/images/todos/home-public.png)
This will call `saveTasks` with the `isPublic` parameter set to `true`, which is used to disable encryption when using `putFile`.
The app will now show all of your to-dos to anyone who visits the URL displayed with your Stacks username as a suffix.
## Sign out and see your public tasks
Select "Sign out" to deauthenticate the app with your Stacks account.
This calls the [`signUserOut`](https://blockstack.github.io/stacks.js/classes/usersession.html#signuserout) method
of the `userSession` object within `src/components/Header.jsx`.
Now visit the URL that was provided to you when you made your tasks public. This URL has the format `/todos/:username`, so if your username were `janedoe.id.blockstack`, the URL would be `localhost:3000/todos/janedoe.id.blockstack`.
When you visit this page, the `TodoList.jsx` component detects that there is a username in the URL.
When there is a username, it calls `fetchTasks`, this time providing the `username` argument. This `username`
option is then passed to `getFile`, which will look up where that user's tasks are stored.
## Sign back in
At this point, you will be logged out from the app, but you'll still have an active session with the Stacks
app itself on [app.blockstack.org](https://app.blockstack.org). Navigate to app.blockstack.org and select "Sign out" there if you want to deauthenticate the Stacks app as well.
Once signed out, select "Sign In" to sign back in with your _Secret Key_.
If you've previously deauthenticated the Stacks app, you'll see a prompt to enter your _Secret Key_:
!["Sign in" screen](/images/todos/sign-in.png)
The preceding screen is omitted if you have an active session with the Stacks app already.
Then you'll be presented with the option to select an existing username associated with your _Secret Key_ or
create a new one if you wish to authenticate the app with a different identity and data set:
!["Choose account" screen](/images/todos/choose-account.png)
You'll now see your to-dos as an authenticated user for the username you've chosen.
## Learn more
Read [the Stacks.js reference](https://blockstack.github.io/stacks.js/) to learn more about the
libraries used in this tutorial.

src/pages/build-apps/guides/authentication.md
---
title: Authentication
description: Register and sign in users with identities on the Stacks blockchain
images:
large: /images/pages/write-smart-contracts.svg
sm: /images/pages/write-smart-contracts-sm.svg
---
## Introduction
This guide explains how to authenticate users with the [`connect`](https://github.com/blockstack/ux/tree/master/packages/connect#stacksconnect) package of Stacks.js.
Authentication provides a way for users to identify themselves to an app while retaining complete control over their credentials and personal details. It can be integrated alone or used in conjunction with [transaction signing](/build-apps/tutorials/transaction-signing) and [data storage](/build-apps/tutorials/data-storage), for which it is a prerequisite.
Users who register for your app can subsequently authenticate to any other app with support for the [Blockchain Naming System](/build-apps/references/bns) and vice versa.
See the To-dos app tutorial for a concrete example of this feature in practice.
[@page-reference]
| /build-apps/tutorials/todos
## How it works
The authentication flow with Stacks is similar to the typical client-server flow used by centralized sign in services (for example, OAuth). However, with Stacks the authentication flow happens entirely client-side.
An app and authenticator, such as [the Stacks Wallet](https://www.hiro.so/wallet/install-web), communicate during the authentication flow by passing back and forth two tokens. The requesting app sends the authenticator an `authRequest` token. Once a user approves authentication, the authenticator responds to the app with an `authResponse` token.
These tokens are based on [a JSON Web Token (JWT) standard](https://tools.ietf.org/html/rfc7519) with additional support for the `secp256k1` curve used by Bitcoin and many other cryptocurrencies. They are passed via URL query strings.
See the [`authRequest`](#authrequest-payload-schema) and [`authResponse`](#authresponse-payload-schema) payload schemas below for more details about what data they contain.
When a user chooses to authenticate an app, it sends the `authRequest` token to the authenticator via a URL query string with an equally named parameter:
`https://wallet.hiro.so/...?authRequest=j902120cn829n1jnvoa...`
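For illustration, the token could be read back out of such a URL with a plain helper (this is not part of Stacks.js, which handles the query string internally):

```javascript
// Extract the authRequest token from an authenticator URL (illustrative only)
function getAuthRequestToken(url) {
  return new URL(url).searchParams.get('authRequest');
}
```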
When the authenticator receives the request, it generates an `authResponse` token for the app using an _ephemeral transit key_. The ephemeral transit key is used only for the particular instance of the app, in this case, to sign the `authRequest`.
The app stores the ephemeral transit key during request generation. The public portion of the transit key is passed in the `authRequest` token. The authenticator uses the public portion of the key to encrypt an _app private key_ which is returned via the `authResponse`.
The authenticator generates the app private key from the user's _identity address private key_ and the app's domain. The app private key serves three functions:
1. It is used to create credentials that give the app access to a storage bucket in the user's Gaia hub
2. It is used in the end-to-end encryption of files stored for the app in the user's Gaia storage.
3. It serves as a cryptographic secret that apps can use to perform other cryptographic functions.
Finally, the app private key is deterministic, meaning that the same private key will always be generated for a given Stacks address and domain.
The first two of these functions are particularly relevant to [data storage with Stacks.js](/build-apps/guides/data-storage).
[Learn more about keypairs](#key-pairs) used by authentication.
## Install dependency
The following dependency must be installed:
```
npm install @stacks/connect
```
## Initiate userSession object
Apps keep track of user authentication state with the `userSession` object, initiated with the `UserSession` and `AppConfig` classes:
```js
import { AppConfig, UserSession } from '@stacks/connect';
const appConfig = new AppConfig(['store_write', 'publish_data']);
const userSession = new UserSession({ appConfig });
```
The main thing to decide here is what permission scopes your app needs from the user during authentication.
Apps may request any of the following scopes:
| Scope | Definition |
| -------------- | ------------------------------------------------------------------------------- |
| `store_write` | Read and write data to the user's Gaia hub in an app-specific storage bucket. |
| `publish_data` | Publish data so other users of the app can discover and interact with the user. |
The default scopes are `['store_write']` if no `scopes` array is provided when initializing the `appConfig` object.
We recommend you initiate the `userSession` object just once in your app then reference it using imports where needed.
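One way to follow that recommendation is a small shared module; the `src/auth.js` filename below is an assumption for illustration, not from the original:

```javascript
// src/auth.js — instantiate the session once and export it
import { AppConfig, UserSession } from '@stacks/connect';

const appConfig = new AppConfig(['store_write', 'publish_data']);
export const userSession = new UserSession({ appConfig });
```

Other modules can then `import { userSession } from './auth';` instead of constructing their own instances.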
## Initiate authentication flow
Apps prompt both new and existing users to authenticate with the `showConnect` function:
```js
import { AppConfig, UserSession, showConnect } from '@stacks/connect';
const appConfig = new AppConfig(['store_write', 'publish_data']);
const userSession = new UserSession({ appConfig });
function authenticate() {
showConnect({
appDetails: {
name: 'My App',
icon: window.location.origin + '/my-app-logo.svg',
},
redirectTo: '/',
onFinish: () => {
let userData = userSession.loadUserData();
// Save or otherwise utilize userData post-authentication
},
userSession: userSession,
});
}
```
`showConnect` triggers the display of a modal that initiates the authentication process for users, one in which they'll authenticate with a _Secret Key_ that's used to encrypt their private data.
![Modal displayed by showConnect function](/images/todos/get-started.png)
The `showConnect` function accepts a number of properties within a parameter object such as:
- The app's `name` and `icon`: provided as strings comprising the `appDetails` object property.
- The `redirectTo` string: used to provide a URL to which the user should be redirected upon successful authentication. The `onFinish` callback serves a similar purpose by handling successful authentication within the context of a popup window.
- The `userSession` object initiated above.
Once the user selects the button presented in this modal, they are passed to the Stacks Wallet for authentication with the `authRequest` token as a GET parameter. From there they can confirm authentication, generating a new _Secret Key_ or Stacks identity first if needed, before being returned to the app.
## Handle pending authentication
Unless the user has confirmed authentication within the context of a popup window, they will get redirected back to the app via the `redirectTo` address provided above, at which point the app needs to handle the pending authentication state using the `authResponse` value provided as a GET parameter:
```jsx
import { AppConfig, UserSession, showConnect } from '@stacks/connect';
const appConfig = new AppConfig(['store_write', 'publish_data']);
const userSession = new UserSession({ appConfig });
window.onload = function () {
if (userSession.isSignInPending()) {
userSession.handlePendingSignIn().then(userData => {
// Save or otherwise utilize userData post-authentication
});
} else if (userSession.isUserSignedIn()) {
// Handle case in which user is already authenticated
}
};
```
The `isSignInPending` method of the `userSession` object is used to detect whether the user needs to handle a pending authentication state upon page load.
The `handlePendingSignIn` method is then used to handle that state, returning a `userData` object with all the data needed to save the user's information into their session.
The authenticated state can later be detected by the `isUserSignedIn` method in case any particular handling is needed then.
~> It's especially important to implement `handlePendingSignIn` within the context of mobile apps.
If the user has indeed confirmed authentication in the context of a popup window, the authenticator will resolve the pending authentication state automatically with the app within the parent window.
It will then trigger the `onFinish` function provided above, which can be used similarly to save the user's information into their session as retrieved with `userSession.loadUserData()`.
## Usage in React Apps
Import `useConnect` from the [`connect-react`](https://github.com/blockstack/ux/tree/master/packages/connect-react) package to integrate authentication more seamlessly into React apps.
```
npm install @stacks/connect-react
```
```jsx
import { useConnect } from '@stacks/connect-react';
const AuthButton = () => {
const { doOpenAuth } = useConnect();
return <Button onClick={() => doOpenAuth()}>Authenticate</Button>;
};
```
## Key pairs
Authentication with Stacks makes extensive use of public key cryptography generally and ECDSA with the `secp256k1` curve in particular.
The following sections describe the three public-private key pairs used, including how they're generated, where they're used and to whom private keys are disclosed.
### Transit private key
The transit private key is an ephemeral key that is used to encrypt secrets that
need to be passed from the authenticator to the app during the
authentication process. It is randomly generated by the app at the beginning of
the authentication request.
The public key that corresponds to the transit private key is stored in a single
element array in the `public_keys` key of the authentication request token. The
authenticator encrypts secret data such as the app private key using this
public key and sends it back to the app when the user signs in to the app. The
transit private key signs the app authentication request.
### Identity address private key
The identity address private key is derived from the user's keychain phrase and
is the private key of the Stacks username that the user chooses to use to sign in
to the app. It is a secret owned by the user and never leaves the user's
instance of the authenticator.
This private key signs the authentication response token for an app to indicate that the user approves sign in to that app.
### App private key
The app private key is an app-specific private key that is generated from the
user's identity address private key using the `domain_name` as input.
The app private key is securely shared with the app on each authentication, encrypted by the authenticator with the transit public key. Because the transit key is only stored on the client side, this prevents a man-in-the-middle attack where a server or internet provider could potentially snoop on the app private key.
## authRequest Payload Schema
```jsx
const requestPayload = {
jti, // UUID
iat, // JWT creation time in seconds
exp, // JWT expiration time in seconds
iss, // legacy decentralized identifier generated from transit key
public_keys, // single entry array with public key of transit key
domain_name, // app origin
manifest_uri, // url to manifest file - must be hosted on app origin
redirect_uri, // url to which the authenticator redirects user on auth approval - must be hosted on app origin
version, // version tuple
do_not_include_profile, // a boolean flag asking authenticator to send profile url instead of profile object
supports_hub_url, // a boolean flag indicating gaia hub support
scopes, // an array of string values indicating scopes requested by the app
};
```
## authResponse Payload Schema
```jsx
const responsePayload = {
jti, // UUID
iat, // JWT creation time in seconds
exp, // JWT expiration time in seconds
iss, // legacy decentralized identifier (string prefix + identity address) - this uniquely identifies the user
private_key, // encrypted private key payload
public_keys, // single entry array with public key
profile, // profile object
username, // Stacks username (if any)
core_token, // encrypted core token payload
email, // email if email scope is requested & email available
profile_url, // url to signed profile token
hubUrl, // url pointing to user's gaia hub
version, // version tuple
};
```
## Decode authRequest or authResponse
To decode a token and see what data it holds:
1. Copy the `authRequest` or `authResponse` string from the URL during authentication.
2. Navigate to [jwt.io](https://jwt.io/).
3. Paste the full token there.
The output should look similar to below:
```json
{
"jti": "f65f02db-9f42-4523-bfa9-8034d8edf459",
"iat": 1555641911,
"exp": 1555645511,
"iss": "did:btc-addr:1ANL7TNdT7TTcjVnrvauP7Mq3tjcb8TsUX",
"public_keys": ["02f08d5541bf611ded745cc15db08f4447bfa55a55a2dd555648a1de9759aea5f9"],
"domain_name": "http://localhost:8080",
"manifest_uri": "http://localhost:8080/manifest.json",
"redirect_uri": "http://localhost:8080",
"version": "1.3.1",
"do_not_include_profile": true,
"supports_hub_url": true,
"scopes": ["store_write", "publish_data"],
"private_key": "4447bfa55a55a2dd555648a1d02f08d759aea5f945cc15db08f"
}
```
The `iss` property is a decentralized identifier, or `did`. It identifies the user and the username to the app. The specific `did` type used here is `btc-addr`.
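As an alternative to jwt.io, a token's payload can be decoded locally. A minimal sketch using Node's `Buffer` (this performs no signature verification; use a proper JWT library when that matters):

```javascript
// Decode the middle (payload) segment of a JWT; JWTs use base64url encoding
function decodeJwtPayload(token) {
  const payloadSegment = token.split('.')[1];
  return JSON.parse(Buffer.from(payloadSegment, 'base64url').toString('utf8'));
}
```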

src/pages/build-apps/guides/data-storage.md
---
title: Data storage
description: Save and retrieve data for users with Gaia
images:
large: /images/gears.svg
sm: /images/gears.svg
---
## Introduction
This guide explains how to save and retrieve data for users with [Gaia](/build-apps/references/gaia) by implementing the [`connect`](https://github.com/blockstack/ux/tree/master/packages/connect#stacksconnect) and [`storage`](https://github.com/blockstack/ux/tree/master/packages/storage#stacksstorage) packages of Stacks.js.
Data storage provides a way for users to save both public and private data off-chain while retaining complete control over it.
Storing data off the Stacks blockchain ensures that apps can provide users with high performance and high availability for data reads and writes without the involvement of centralized parties that could compromise their privacy or accessibility.
See the To-dos app tutorial for a concrete example of this feature in practice.
[@page-reference]
| /build-apps/tutorials/todos
## Install dependencies
The following dependencies must be installed:
```
npm install @stacks/connect @stacks/storage
```
## Initiate session
Users must authenticate to an app before the `storage` package will work to save or retrieve data on their behalf.
See the authentication guide before proceeding to integrate the following data storage capabilities in cases where `userSession.isUserSignedIn()` returns `true`.
[@page-reference]
| /build-apps/guides/authentication
## Save data for session user
Gaia serves as a key-value store in which data is saved and retrieved as files to and from Gaia hubs owned by, or managed for, users.
The default Gaia hub for users who authenticate to apps with [the Stacks Wallet](https://www.hiro.so/wallet/install-web) is run by Hiro PBC at `https://gaia.blockstack.org/`. It supports files up to 25 megabytes in size.
-> We recommend breaking data instances greater than 25 MB into several files, saving them individually, and recomposing them on retrieval.
These files can comprise any type of data such as text, image, video or binary.
Files are often saved as strings that represent stringified JSON objects and contain a variety of properties for a particular model.
To save a file, first instantiate a `storage` object using the `userSession` object for an authenticated user. Then proceed to call its `putFile` method with relevant parameters:
```js
import { AppConfig, UserSession } from '@stacks/connect';
import { Storage } from '@stacks/storage';
const appConfig = new AppConfig(['store_write', 'publish_data']);
const userSession = new UserSession({ appConfig });
const storage = new Storage({ userSession });
let fileName = 'car.json';
let fileData = {
color: 'blue',
electric: true,
purchaseDate: '2019-04-03',
};
const options = {
encrypt: true,
};
const fileUrl = await storage.putFile(fileName, JSON.stringify(fileData), options);
// Handle any execution after data has been saved
```
The `options` parameter object contains an `encrypt` property that, when set to `true`, indicates that the data should be encrypted with the user's app private key before being saved to their Gaia hub. All data is encrypted this way by default if the `encrypt` property or the `options` object itself is omitted entirely.
If the `encrypt` property is set to `false`, the data will be saved completely unencrypted and available to everyone online with public access to the user's Gaia hub.
Whereas saving privately encrypted data is possible for all authenticated apps with the [`store_write`](https://blockstack.github.io/stacks.js/enums/authscope.html#store_write) scope, the user must have previously granted the [`publish_data`](https://blockstack.github.io/stacks.js/enums/authscope.html#publish_data) scope as well during authentication for the app to save publicly unencrypted data.
The `putFile` method returns the URL where the file can be retrieved from the user's Gaia hub, as used here to set the value of `fileUrl`.
-> You'll need to save an entirely new string of modified data using `putFile` with the same `fileName` every time you want to update a record. There is no separate update method.
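The update pattern can therefore be sketched as read-modify-write. The `updateCarColor` helper below is hypothetical, wrapping the `storage` object from the examples above; it is not an official API:

```javascript
// Read the existing record, change one field, and save the whole file again
async function updateCarColor(storage, newColor) {
  const raw = await storage.getFile('car.json', { decrypt: true });
  const car = JSON.parse(raw);
  car.color = newColor;
  await storage.putFile('car.json', JSON.stringify(car), { encrypt: true });
  return car;
}
```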
## Get data for session user
To retrieve data previously saved for a user with an app, call the `getFile` method available from the `storage` object:
```js
import { AppConfig, UserSession } from '@stacks/connect';
import { Storage } from '@stacks/storage';
const appConfig = new AppConfig(['store_write', 'publish_data']);
const userSession = new UserSession({ appConfig });
const storage = new Storage({ userSession });
let fileName = 'car.json';
const options = {
decrypt: true,
};
storage.getFile(fileName, options).then(fileData => {
// Handle any execution that uses decrypted fileData
});
```
Note how the `decrypt` property in the `options` object here should use the same boolean value as the `encrypt` property did when the data was initially saved with `putFile`. The `decrypt` property defaults to `true` if omitted.
Encrypted files need `decrypt` set to `true` so the app knows to decrypt the data with the user's app private key before it's made available in the callback here as `fileData`.
## Get data for other user
Apps can also retrieve public data saved by users other than the one with the active session, granted those users have registered usernames via the [Blockchain Naming System](/build-apps/references/bns).
Simply indicate the username of such a user in the `options` object:
```js
import { AppConfig, UserSession } from '@stacks/connect';
import { Storage } from '@stacks/storage';
const appConfig = new AppConfig(['store_write', 'publish_data']);
const userSession = new UserSession({ appConfig });
const storage = new Storage({ userSession });
let fileName = 'car.json';
const options = {
username: 'markmhendrickson.id.blockstack',
};
storage.getFile(fileName, options).then(fileData => {
// Handle any execution that uses decrypted fileData
});
```
This `getFile` call will retrieve data found at the given `fileName` path from the storage bucket of the Gaia hub that maps to the user who registered the given `username` and this particular app as hosted at the current domain.
Set an additional `app` property within `options` to retrieve data for a user as saved by an app hosted at a separate domain:
```js
const options = {
app: 'https://example.org',
username: 'markmhendrickson.id.blockstack',
};
```
This will cause the `getFile` call to retrieve data found in a separate storage bucket for the indicated app on the user's Gaia hub.
## Delete data for session user
Call the `deleteFile` method on `storage` to remove data found at a particular file path for the active session user:
```js
import { AppConfig, UserSession } from '@stacks/connect';
import { Storage } from '@stacks/storage';
const appConfig = new AppConfig(['store_write', 'publish_data']);
const userSession = new UserSession({ appConfig });
const storage = new Storage({ userSession });
let fileName = 'car.json';
storage.deleteFile(fileName).then(() => {
// Handle any execution after file has been deleted
});
```
-> Apps can save and delete data only for the active session user.

src/pages/build-apps/guides/integrate-stacking-delegation.md
---
title: Integrate Stacking delegation
description: Learn how to add Stacking delegation capabilities to your wallet or exchange
icon: TestnetIcon
experience: advanced
duration: 60 minutes
tags:
- tutorial
images:
sm: /images/pages/stacking-rounded.svg
---
## Introduction
In this tutorial, you'll learn how to integrate the Stacking delegation flow by interacting with the respective smart contract, as well as reading data from the Stacks blockchain.
This tutorial highlights the following functionality:
- As an account holder: delegate STX tokens
- As a delegator: Stack STX token on behalf of the account holder
- As a delegator: Commit to Stacking with all delegated STX tokens
## Requirements
First, you'll need to understand the [Stacking delegation mechanism](/understand-stacks/stacking).
You'll also need [NodeJS](https://nodejs.org/en/download/) `12.10.0` or higher to complete this tutorial. You can verify your installation by opening your terminal and running the following command:
```bash
node --version
```
Finally, you need to have access to at least two accounts (STX account holder and delegator). For testing purposes on the testnet, you can use the CLI to generate them:
```shell
stacks make_keychain -t > account.json
stacks make_keychain -t > delegator.json
```
You can use the faucet to obtain testnet STX tokens for the test account. Replace `<stxAddress>` below with your address:
```shell
curl -XPOST "https://stacks-node-api.testnet.stacks.co/extended/v1/faucets/stx?address=<stxAddress>&stacking=true"
```
## Step 1: Integrate libraries
Install the stacking, network, transactions libraries, and bn.js for large number handling:
```shell
npm install --save @stacks/stacking @stacks/network @stacks/transactions bn.js
```
-> See additional [Stacking library reference](https://github.com/blockstack/stacks.js/tree/master/packages/stacking)
## Step 2: Delegate STX tokens
To get started, delegate STX tokens as an account holder.
```js
import { getNonce } from '@stacks/transactions';
import { StacksTestnet, StacksMainnet } from '@stacks/network';
import { StackingClient } from '@stacks/stacking';
import BN from 'bn.js';
// for mainnet: const network = new StacksMainnet();
const network = new StacksTestnet();
// the stacker STX address
const address = 'ST3XKKN4RPV69NN1PHFDNX3TYKXT7XPC4N8KC1ARH';
const client = new StackingClient(address, network);
// how much to stack, in microSTX
const amountMicroStx = new BN(100000000000);
// STX address of the delegator
const delegateTo = 'ST2MCYPWTFMD2MGR5YY695EJG0G1R4J2BTJPRGM7H';
// burn height at which the delegation relationship should be revoked (optional)
const untilBurnBlockHeight = 5000;
// hash of bitcoin address that the delegator has to use to receive the pool's rewards (optional)
const poxAddress = undefined;
// private key of the account holder for transaction signing
const privateKey = 'd48f215481c16cbe6426f8e557df9b78895661971d71735126545abddcd5377001';
const delegetateResponse = await client.delegateStx({
amountMicroStx,
delegateTo,
untilBurnBlockHeight, // optional
poxAddress, // optional
privateKey,
});
// {
// txid: '0xf6e9dbf6a26c1b73a14738606cb2232375d1b440246e6bbc14a45b3a66618481',
// }
```
This method calls the [`delegate-stx`](/references/stacking-contract#delegate-stx) method of the Stacking contract. Note that the amount can be higher or lower than the current account balance. Delegation does not yet lock the STX tokens; users can still transfer them.
-> To avoid handling private keys, it is recommended to use the [Stacks Wallet](https://www.hiro.so/wallet) to sign the delegation transaction
**Congratulations!** With the completion of this step, you have successfully learned how to use the Stacking library to delegate STX tokens as an account holder.
## Optional: Revoke delegation rights
Delegators will be able to Stack STX tokens on the account holder's behalf until either the set burn height is reached or the account holder revokes the rights.
To revoke delegation rights, the account holder can call the `revokeDelegateStx` method.
```js
const revokeResponse = await client.revokeDelegateStx(privateKey);
// {
// txid: '0xf6e9dbf6a26c1b73a14738606cb2232375d1b440246e6bbc14a45b3a66618481',
// }
```
This method calls the [`revoke-delegate-stx`](/references/stacking-contract#revoke-delegate-stx) method of the Stacking contract.
-> To avoid handling private keys, it is recommended to use the [Stacks Wallet](https://www.hiro.so/wallet) to sign the revoke transaction
## Step 3: Stack delegated STX tokens
With an established delegation relationship, the delegator can stack STX tokens on behalf of the account holder. This usually happens in a different client app than the delegation.
```js
// block height at which to stack
const burnBlockHeight = 2000;
// the delegator initiates a different client
const delegatorAddress = 'ST22X605P0QX2BJC3NXEENXDPFCNJPHE02DTX5V74';
// number cycles to stack
const cycles = 3;
// delegator private key for transaction signing
const delegatorPrivateKey = 'd48f215481c16cbe6426f8e557df9b78895661971d71735126545abddcd5377001';
// the BTC address for reward payouts; either to the delegator or to the BTC address set by the account holder
// must start with "1" or "3". Native Segwit (starts with "bc1") is not supported
const delegatorBtcAddress = 'msiYwJCvXEzjgq6hDwD9ueBka6MTfN962Z';
// if you call this method multiple times in the same block, you need to increase the nonce manually
let nonce = await getNonce(delegatorAddress, network);
nonce = nonce.add(new BN(1));
const delegatorClient = new StackingClient(delegatorAddress, network);
const delegetateStackResponses = await delegatorClient.delegateStackStx({
stacker: address,
amountMicroStx,
poxAddress: delegatorBtcAddress,
burnBlockHeight,
cycles,
privateKey: delegatorPrivateKey,
nonce, // optional
});
// {
// txid: '0xf6e9dbf6a26c1b73a14738606cb2232375d1b440246e6bbc14a45b3a66618481',
// }
```
This function calls the [`delegate-stack-stx`](/references/stacking-contract#delegate-stack-stx) method of the Stacking contract to lock up the STX token from the account holder.
The delegator must call this method multiple times (once for each stacker) until enough tokens are locked up to participate in Stacking. This is the first part of delegated stacking for the delegator.
-> Reward slots are assigned based on the number of STX tokens locked up for a specific Bitcoin reward address
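Those repeated calls can be sketched as a loop over the pool's members. The `stackForPool` function and `poolMembers` list below are hypothetical illustrations; `delegateStackStx` and the other values come from the example above:

```javascript
// Sketch: lock up delegated tokens for every pool member in turn.
// `poolMembers` is a hypothetical array of { stxAddress, amountMicroStx }.
async function stackForPool(client, poolMembers, poxAddress, burnBlockHeight, cycles, privateKey) {
  const txids = [];
  for (const member of poolMembers) {
    const response = await client.delegateStackStx({
      stacker: member.stxAddress,
      amountMicroStx: member.amountMicroStx,
      poxAddress,
      burnBlockHeight,
      cycles,
      privateKey,
    });
    txids.push(response.txid);
  }
  return txids;
}
```

Note that each call is a separate transaction, so nonces may need manual handling as shown above when several calls land in the same block.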
## Step 4: Commit to Stacking
As soon as pooling completes (minimum STX token threshold reached), the delegator needs to confirm participation for the next cycle(s):
```js
// reward cycle id to commit to
const rewardCycle = 12;
const delegetateCommitResponse = await delegatorClient.stackAggregationCommit({
poxAddress: delegatorBtcAddress, // this must be the delegator bitcoin address
rewardCycle,
privateKey: delegatorPrivateKey,
});
// {
// txid: '0xf6e9dbf6a26c1b73a14738606cb2232375d1b440246e6bbc14a45b3a66618481',
// }
```
This method calls the [`stack-aggregation-commit`](/references/stacking-contract#stack-aggregation-commit) function of the Stacking contract. This call also includes locked Stacks from previous cycles. This is the second part of delegated stacking for the delegator.
This method has to be called once for each reward cycle, even if all account holders have already locked their STX tokens for several cycles in a row. If no new account holders are added to the pool, the call can even be made several cycles before the actual reward cycle.
Locking delegated STX tokens together with aggregation commits can be done several times before the cycle starts, as long as the minimum increment amount of locked STX tokens is met.
**Congratulations!** With the completion of this step, you have successfully learned how to use the Stacking library to:
- Stack STX token on behalf of an account holder
- Commit to Stacking with all delegated STX tokens

src/pages/build-apps/guides/integrate-stacking.md
---
title: Integrate Stacking
description: Learn how to add Stacking capabilities to your wallet or exchange
icon: TestnetIcon
experience: advanced
duration: 60 minutes
tags:
- tutorial
images:
sm: /images/pages/stacking-rounded.svg
---
> Try the [Stacks Wallet](https://www.hiro.so/wallet) to experience the Stacking flow as a token holder
## Introduction
In this tutorial, you'll learn how to integrate Stacking by interacting with the respective smart contract, as well as reading data from the Stacks blockchain.
This tutorial highlights the following functionality:
- Generate Stacks accounts
- Display stacking info
- Verify stacking eligibility
- Add stacking action
- Display stacking status
-> Alternatively to integration using JS libraries, you can use the [Rust CLI](https://gist.github.com/kantai/c261ca04114231f0f6a7ce34f0d2499b) or [JS CLI](/understand-stacks/stacking-using-CLI).
## Requirements
First, you'll need to understand the [Stacking mechanism](/understand-stacks/stacking).
You'll also need [NodeJS](https://nodejs.org/en/download/) `12.10.0` or higher to complete this tutorial. You can verify your installation by opening your terminal and running the following command:
```bash
node --version
```
## Overview
In this tutorial, we'll implement the Stacking flow laid out in the [Stacking guide](/understand-stacks/stacking#stacking-flow).
## Step 1: Integrate libraries
Install the stacking, network, and transactions libraries, as well as `bn.js` for large number handling:
```shell
npm install --save @stacks/stacking @stacks/network @stacks/transactions bn.js
```
-> See additional [Stacking library reference](https://github.com/blockstack/stacks.js/tree/master/packages/stacking)
## Step 2: Generate an account and initialize the client
To get started, let's create a new, random Stacks 2.0 account:
```js
import {
makeRandomPrivKey,
privateKeyToString,
getAddressFromPrivateKey,
TransactionVersion,
} from '@stacks/transactions';
import { StackingClient } from '@stacks/stacking';
import { StacksTestnet, StacksMainnet } from '@stacks/network';
import BN from 'bn.js';
// generate random key or use an existing key
const privateKey = privateKeyToString(makeRandomPrivKey());
// get Stacks address
// for mainnet, remove the TransactionVersion
const stxAddress = getAddressFromPrivateKey(privateKey, TransactionVersion.Testnet);
// instantiate the Stacker class for testnet
// for mainnet, use `new StacksMainnet()`
const client = new StackingClient(stxAddress, new StacksTestnet());
```
-> Review the [accounts guide](/understand-stacks/accounts) for more details
## Step 3: Display stacking info
To inform users about the upcoming reward cycle, you can use the following methods to obtain Stacking information. With the obtained info, you can display whether Stacking will be executed in the next cycle, when the next cycle begins, the duration of a cycle, and the minimum microstacks required to participate:
```js
// will Stacking be executed in the next cycle?
const stackingEnabledNextCycle = await client.isStackingEnabledNextCycle();
// true or false
// how long (in seconds) is a Stacking cycle?
const cycleDuration = await client.getCycleDuration();
// 120
// how much time is left (in seconds) until the next cycle begins?
const secondsUntilNextCycle = await client.getSecondsUntilNextCycle();
// 600000
```
-> Note: cycle duration and participation thresholds will differ between mainnet and testnet
You can also retrieve the raw PoX and core information using the methods below if required:
```js
const poxInfo = await client.getPoxInfo();
// poxInfo:
// {
// contract_id: 'ST000000000000000000002AMW42H.pox',
// first_burnchain_block_height: 0,
// min_amount_ustx: 83335083333333,
// prepare_cycle_length: 30,
// rejection_fraction: 3333333333333333,
// reward_cycle_id: 17,
// reward_cycle_length: 120,
// rejection_votes_left_required: 0,
// total_liquid_supply_ustx: 40000840000000000
// }
const coreInfo = await client.getCoreInfo();
// coreInfo:
// {
// peer_version: 385875968,
// pox_consensus: 'bb88a6e6e65fa7c974d3f6e91a941d05cc3dff8e',
// burn_block_height: 2133,
// stable_pox_consensus: '2284451c3e623237def1f8caed1c11fa46b6f0cc',
// stable_burn_block_height: 2132,
// server_version: 'blockstack-core 0.0.1 => 23.0.0.0 (HEAD:a4deb7a+, release build, linux [x86_64])',
// network_id: 2147483648,
// parent_network_id: 3669344250,
// stacks_tip_height: 1797,
// stacks_tip: '016df36c6a154cb6114c469a28cc0ce8b415a7af0527f13f15e66e27aa480f94',
// stacks_tip_consensus_hash: 'bb88a6e6e65fa7c974d3f6e91a941d05cc3dff8e',
// unanchored_tip: '6b93d2c62fc07cf44302d4928211944d2debf476e5c71fb725fb298a037323cc',
// exit_at_block_height: null
// }
const targetBlocktime = await client.getTargetBlockTime();
// targetBlocktime:
// 120
```
Users need to have sufficient Stacks (STX) tokens to participate in Stacking. This can be verified easily:
```js
const hasMinStxAmount = await client.hasMinimumStx();
// true or false
```
For testing purposes, you can use the faucet to obtain testnet STX tokens. Replace `<stxAddress>` below with your address:
```shell
curl -XPOST "https://stacks-node-api.testnet.stacks.co/extended/v1/faucets/stx?address=<stxAddress>&stacking=true"
```
You'll have to wait a few minutes for the transaction to complete.
Users can select how many cycles they would like to participate in. To help with that decision, the unlocking time can be estimated:
```js
// this would be provided by the user
let numberOfCycles = 3;
// the projected datetime for the unlocking of tokens
// getTime() is in milliseconds, secondsUntilNextCycle in seconds
const unlockingAt = new Date(new Date().getTime() + secondsUntilNextCycle * 1000);
unlockingAt.setSeconds(unlockingAt.getSeconds() + cycleDuration * numberOfCycles);
```
## Step 4: Verify stacking eligibility
At this point, your app shows Stacking details. If Stacking will be executed and the user has enough funds, the user should be asked to provide input for the amount of microstacks to lock up and a Bitcoin address to receive the reward payouts.
With this input, and the data from previous steps, we can determine the eligibility for the next reward cycle:
```js
// user supplied parameters
// BTC address must start with "1" or "3". Native Segwit (starts with "bc1") is not supported
let btcAddress = '1Xik14zRm29UsyS6DjhYg4iZeZqsDa8D3';
let numberOfCycles = 3;
const stackingEligibility = await client.canStack({
poxAddress: btcAddress,
cycles: numberOfCycles,
});
// stackingEligibility:
// {
// eligible: false,
// reason: 'ERR_STACKING_INVALID_LOCK_PERIOD',
// }
```
-> Note that the eligibility check assumes the user will be stacking the maximum balance available in the account.
-> The eligibility check is a read-only function call to the PoX smart contract which does not require broadcasting a transaction
If the user is eligible, the stacking action should be enabled on the UI. If not, the respective error message should be shown to the user.
## Step 5: Lock STX to stack
Next, the Stacking action should be executed.
```js
// set the amount to lock in microstacks
const amountMicroStx = new BN(100000000000);
// set the burnchain (BTC) block for stacking lock to start
// use the current burnchain block height from coreInfo above,
// plus 3 blocks to provide a buffer for the transaction to confirm
const burnBlockHeight = 2133 + 3;
// execute the stacking action by signing and broadcasting a transaction to the network
client
.stack({
amountMicroStx,
poxAddress: btcAddress,
cycles: numberOfCycles,
privateKey,
burnBlockHeight,
})
.then(response => {
// If successful, the response will contain the txid for the Stacking transaction
// otherwise an error will be returned
if (response.hasOwnProperty('error')) {
console.log(response.error);
throw new Error('Stacking transaction failed');
} else {
console.log(`txid: ${response}`);
// txid: f6e9dbf6a26c1b73a14738606cb2232375d1b440246e6bbc14a45b3a66618481
return response;
}
});
```
The transaction completion will take several minutes. Only one stacking transaction from each account/address is active at any time. Multiple/concurrent stacking actions from the same account will fail.
## Step 6: Confirm lock-up
The new transaction will not be completed immediately. It'll stay in the `pending` status for a few minutes. We need to poll the status and wait until the transaction status changes to `success`. We can use the [Stacks Blockchain API client library](/understand-stacks/stacks-blockchain-api#javascript-client-library) to check transaction status.
```js
const { TransactionsApi } = require('@stacks/blockchain-api-client');
// note: apiConfig (a Configuration instance) should be defined previously
const tx = new TransactionsApi(apiConfig);
const waitForTransactionSuccess = txId =>
new Promise((resolve, reject) => {
const pollingInterval = 3000;
const intervalID = setInterval(async () => {
const resp = await tx.getTransactionById({ txId });
if (resp.tx_status === 'success') {
// stop polling
clearInterval(intervalID);
// update UI to display stacking status
return resolve(resp);
}
}, pollingInterval);
});
// note: txId should be defined previously
const resp = await waitForTransactionSuccess(txId);
```
-> More details on the lifecycle of transactions can be found in the [transactions guide](/understand-stacks/transactions#lifecycle)
Alternatively to the polling, the Stacks Blockchain API client library offers WebSockets. WebSockets can be used to subscribe to specific updates, like transaction status changes. Here is an example:
```js
const { connectWebSocketClient } = require('@stacks/blockchain-api-client');
const client = await connectWebSocketClient('ws://stacks-node-api.blockstack.org/');
// note: txId should be defined previously
const sub = await client.subscribeTxUpdates(txId, event => {
console.log(event);
// update UI to display stacking status
});
await sub.unsubscribe();
```
## Step 7: Display Stacking status
With the completed transactions, Stacks tokens are locked up for the lockup duration. During that time, your application can display the following details: unlocking time, amount of Stacks locked, and bitcoin address used for rewards.
```js
const stackingStatus = await client.getStatus();
// If stacking is active for the account, you will receive the stacking details
// otherwise an error will be thrown
// stackingStatus:
// {
// stacked: true,
// details: {
// amount_microstx: '80000000000000',
// first_reward_cycle: 18,
// lock_period: 10,
// burnchain_unlock_height: 3020,
// pox_address: {
// version: '00',
// hashbytes: '05cf52a44bf3e6829b4f8c221cc675355bf83b7d'
// }
// }
// }
```
-> Note that the `pox_address` property is the PoX contract's internal representation of the reward BTC address.
To display the unlocking time, use the `first_reward_cycle` and `lock_period` fields from the stacking status.
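A rough wall-clock estimate can also be derived from the `burnchain_unlock_height` in the status, together with the current burn block height and the target block time obtained earlier. A sketch (the helper name and shape are illustrative, not part of the library):

```js
// Rough estimate: blocks remaining until unlock, times the target block time.
function estimateUnlockDate(currentBurnHeight, unlockBurnHeight, targetBlockTimeSeconds, now = new Date()) {
  const blocksRemaining = unlockBurnHeight - currentBurnHeight;
  return new Date(now.getTime() + blocksRemaining * targetBlockTimeSeconds * 1000);
}

// using the example values above: burn height 2133, unlock height 3020, 120s blocks
estimateUnlockDate(2133, 3020, 120);
```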
**Congratulations!** With the completion of this step, you successfully learned how to ...
- Generate Stacks accounts
- Display stacking info
- Verify stacking eligibility
- Add stacking action
- Display stacking status
## Optional: Rewards
Currently, the Stacking library does not provide methods to get the paid rewards for a set address. However, the [Stacks Blockchain API exposes endpoints](https://blockstack.github.io/stacks-blockchain-api/#tag/Burnchain) to get more details.
As an example, if you want to get the rewards paid to `btcAddress`, you can make the following API call:
```shell
# for mainnet, replace `testnet` with `mainnet`
curl 'https://stacks-node-api.testnet.stacks.co/extended/v1/burnchain/rewards/<btcAddress>'
```

src/pages/build-apps/guides/transaction-signing.md

---
title: Transaction signing
description: Prompt users to sign and broadcast transactions to the Stacks blockchain
images:
large: /images/transaction-signing.svg
sm: /images/transaction-signing.svg
---
## Introduction
This guide explains how to prompt users to sign [transactions](/understand-stacks/transactions) and broadcast them to the Stacks blockchain by implementing the [`connect`](https://github.com/blockstack/ux/tree/master/packages/connect#stacksconnect) package of Stacks.js.
Transaction signing provides a way for users to execute [Clarity smart contracts](/write-smart-contracts/overview) that are relevant to your app, and then handle the result as appropriate.
Users can sign transactions that exchange fungible or non-fungible tokens with upfront guarantees that help them retain control over their digital assets.
There are three types of transactions:
1. STX transfer
2. Contract deployment
3. Contract execution
See the public registry tutorial for a concrete example of this functionality in practice.
[@page-reference]
| /build-apps/tutorials/public-registry
## Install dependency
~> In order to utilize the latest transaction signing with the Stacks Wallet, use version 5 of the `@stacks/connect` NPM package.
The following dependency must be installed:
```
npm install @stacks/connect@^5
```
## Initiate session
Users must authenticate to an app with an authenticator such as [the Stacks Wallet](https://www.hiro.so/wallet/install-web) before the `connect` package can prompt them to sign and broadcast transactions to the Stacks blockchain.
See the authentication guide before proceeding to integrate the following transaction signing capabilities in cases where `userSession.isUserSignedIn()` returns `true`.
[@page-reference]
| /build-apps/guides/authentication
## Get the user's Stacks address
After your user has authenticated with their Stacks Wallet, you can get their Stacks address from their `profile`.
```ts
const stxAddresses = userSession.loadUserData().profile.stxAddress;
const mainnetAddress = stxAddresses.mainnet;
// "SP2K5SJNTB6YP3VCTCBE8G35WZBPVN6TDMDJ96QAH"
const testnetAddress = stxAddresses.testnet;
// "ST2K5SJNTB6YP3VCTCBE8G35WZBPVN6TDMFEVESR6"
```
## Prompt to transfer STX
Call the `openSTXTransfer` function provided by the `connect` package to trigger the display of a transaction signing prompt for transferring STX:
```tsx
import { openSTXTransfer } from '@stacks/connect';
import { StacksTestnet } from '@stacks/network';
openSTXTransfer({
recipient: 'ST2EB9WEQNR9P0K28D2DC352TM75YG3K0GT7V13CV',
amount: '100',
memo: 'Reimbursement',
network: new StacksTestnet(), // for mainnet, `new StacksMainnet()`
appDetails: {
name: 'My App',
icon: window.location.origin + '/my-app-logo.svg',
},
onFinish: data => {
console.log('Stacks Transaction:', data.stacksTransaction);
console.log('Transaction ID:', data.txId);
console.log('Raw transaction:', data.txRaw);
},
});
```
Several parameters are available for calling `openSTXTransfer`. Here's the exact interface for them:
```tsx
interface STXTransferOptions {
recipient: string;
amount: string;
memo?: string;
network: StacksNetwork;
fee: number | string;
appDetails: {
name: string;
icon: string;
};
onFinish: (data: FinishedTxData) => void;
}
```
| parameter | type | required | description |
| ---------- | ---------------- | -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| recipient | string | true | STX address for recipient of transfer |
| amount | string | true | Amount of microstacks (1 STX = 1,000,000 microstacks) to be transferred provided as string to prevent floating point errors. |
| appDetails | object | true | Dictionary that requires `name` and `icon` for app |
| onFinish | function | true | Callback executed by app when transaction has been signed and broadcasted. [Read more](#onFinish-option) |
| memo | string | false | Optional memo for inclusion with transaction |
| network | StacksNetwork | false | Specify the network that this transaction should be completed on. [Read more](#network-option) |
| fee | number \| string | false | Optional fee amount in microstacks (1 STX = 1,000,000 microstacks) for overwriting the wallet's default fee value. [Read more](https://forum.stacks.org/t/mempool-congestion-on-stacks-observations-and-next-steps-from-hiro/12325/5) |
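Since `amount` must be provided as a microstacks string, apps typically convert user-entered STX amounts without going through floating point. A minimal sketch using `BigInt` (this helper is not part of `@stacks/connect`):

```js
// Convert a decimal STX string to a microstacks string using BigInt,
// avoiding floating-point rounding (1 STX = 1,000,000 microstacks).
function stxToMicroStx(stx) {
  const [whole, frac = ''] = String(stx).split('.');
  if (frac.length > 6) throw new Error('STX has at most 6 decimal places');
  return (BigInt(whole || '0') * 1000000n + BigInt(frac.padEnd(6, '0') || '0')).toString();
}

stxToMicroStx('1.5'); // '1500000'
```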
## Prompt to deploy smart contract
Call the `openContractDeploy` function provided by the `connect` package to trigger the display of a transaction signing prompt for deploying a smart contract:
```tsx
import { openContractDeploy } from '@stacks/connect';
const codeBody = '(begin (print "hello, world"))';
openContractDeploy({
contractName: 'my-contract-name',
codeBody,
appDetails: {
name: 'My App',
icon: window.location.origin + '/my-app-logo.svg',
},
onFinish: data => {
console.log('Stacks Transaction:', data.stacksTransaction);
console.log('Transaction ID:', data.txId);
console.log('Raw transaction:', data.txRaw);
},
});
```
Several parameters are available for calling `openContractDeploy`. Here's the exact interface for them:
```tsx
interface ContractDeployOptions {
codeBody: string;
contractName: string;
network: StacksNetwork;
fee: number | string;
appDetails: {
name: string;
icon: string;
};
onFinish: (data: FinishedTxData) => void;
}
```
| parameter | type | required | description |
| ------------ | ---------------- | -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| codeBody | string | true | Clarity source code for contract |
| contractName | string | true | Name for contract |
| appDetails | object | true | Dictionary that requires `name` and `icon` for app |
| onFinish | function | true | Callback executed by app when transaction has been signed and broadcasted. [Read more](#onFinish-option) |
| network | StacksNetwork | false | Specify the network that this transaction should be completed on. [Read more](#network-option) |
| fee | number \| string | false | Optional fee amount in microstacks (1 STX = 1,000,000 microstacks) for overwriting the wallet's default fee value. [Read more](https://forum.stacks.org/t/mempool-congestion-on-stacks-observations-and-next-steps-from-hiro/12325/5) |
-> Contracts will deploy to the Stacks address of the authenticated user.
## Prompt to execute contract
Call the `openContractCall` function provided by the `connect` package to trigger the display of a transaction signing prompt for executing a contract.
As an example, consider this simple Clarity contract:
```clarity
(define-public
(my-func
(arg-uint uint)
(arg-int int)
(arg-buff (buff 20))
(arg-string-ascii (string-ascii 20))
(arg-string-utf8 (string-utf8 20))
(arg-principal principal)
(arg-bool bool)
)
(ok u0)
)
```
To execute this function, invoke the `openContractCall` method. Use the `ClarityValue` types from `@stacks/transactions` to construct properly formatted arguments.
```tsx
import { openContractCall } from '@stacks/connect';
import {
uintCV,
intCV,
bufferCV,
stringAsciiCV,
stringUtf8CV,
standardPrincipalCV,
trueCV,
} from '@stacks/transactions';
const functionArgs = [
uintCV(1234),
intCV(-234),
bufferCV(Buffer.from('hello, world')),
stringAsciiCV('hey-ascii'),
stringUtf8CV('hey-utf8'),
standardPrincipalCV('STB44HYPYAT2BB2QE513NSP81HTMYWBJP02HPGK6'),
trueCV(),
];
const options = {
contractAddress: 'ST22T6ZS7HVWEMZHHFK77H4GTNDTWNPQAX8WZAKHJ',
contractName: 'my-contract',
functionName: 'my-func',
functionArgs,
appDetails: {
name: 'My App',
icon: window.location.origin + '/my-app-logo.svg',
},
onFinish: data => {
console.log('Stacks Transaction:', data.stacksTransaction);
console.log('Transaction ID:', data.txId);
console.log('Raw transaction:', data.txRaw);
},
};
await openContractCall(options);
```
Several parameters are available for calling `openContractCall`. Here's the exact interface for them:
```tsx
interface ContractCallOptions {
contractAddress: string;
functionName: string;
contractName: string;
functionArgs?: ClarityValue[];
network: StacksNetwork;
fee: number | string;
appDetails: {
name: string;
icon: string;
};
onFinish: (data: FinishedTxData) => void;
}
```
| parameter | type | required | description |
| --------------- | ---------------- | -------- | ----------- |
| contractAddress | string | true | Stacks address to which contract is deployed |
| contractName | string | true | Name of contract to sign |
| functionName | string | true | Name of function for signing / execution, which needs to be a [public function](/references/language-functions#define-public). |
| functionArgs | `ClarityValue[]` | true | Arguments for calling the function. [Learn more about constructing clarity values](https://github.com/blockstack/stacks.js/tree/master/packages/transactions#constructing-clarity-values). Defaults to `[]`. |
| appDetails | object | true | Dictionary that requires `name` and `icon` for app |
| onFinish | function | true | Callback executed by app when transaction has been signed and broadcasted. [Read more](#onFinish-option) |
| network | StacksNetwork | false | Specify the network that this transaction should be completed on. [Read more](#network-option) |
| fee | number \| string | false | Optional fee amount in microstacks (1 STX = 1,000,000 microstacks) for overwriting the wallet's default fee value. [Read more](https://forum.stacks.org/t/mempool-congestion-on-stacks-observations-and-next-steps-from-hiro/12325/5) |
## Getting the signed transaction back after completion {#onFinish-option}
Each transaction signing method from `@stacks/connect` allows you to specify an `onFinish` callback. This callback will be triggered after the user has successfully broadcasted their transaction. The transaction will be broadcasted, but it will be pending until it has been mined on the Stacks blockchain.
You can access some information about this transaction via the arguments passed to `onFinish`. Your callback will be fired with a single argument, which is an object with the following properties:
```ts
interface FinishedTxData {
stacksTransaction: StacksTransaction;
txRaw: string;
txId: string;
}
```
The `StacksTransaction` type comes from the [`@stacks/transactions`](https://github.com/blockstack/stacks.js/tree/master/packages/transactions) library.
The `txId` property can be used to provide a link to view the transaction in the explorer.
```ts
const onFinish = data => {
  const explorerTransactionUrl = `https://explorer.stacks.co/txid/${data.txId}`;
console.log('View transaction in explorer:', explorerTransactionUrl);
};
```
## Specifying the network for a transaction {#network-option}
All of the methods included on this page accept a `network` option. By default, Connect uses a testnet network option. You can import a network configuration from the [`@stacks/network`](https://github.com/blockstack/stacks.js/tree/master/packages/network) package.
```ts
import { StacksTestnet, StacksMainnet } from '@stacks/network';
const testnet = new StacksTestnet();
const mainnet = new StacksMainnet();
// use this in your transaction signing methods:
openSTXTransfer({
network: mainnet,
// other relevant options
});
```
## Usage in React Apps
Import `useConnect` from the [`connect-react`](https://github.com/blockstack/ux/tree/master/packages/connect-react) package to integrate transaction signing more seamlessly into React apps.
```
npm install @stacks/connect-react
```
Each transaction signing method is itself available as a function returned by `useConnect` though prefixed with `do` for consistency with React action naming standards:
- `openContractCall` as `doContractCall`
- `openSTXTransfer` as `doSTXTransfer`
- `openContractDeploy` as `doContractDeploy`
Use these functions with the same parameters as outlined above. However, you don't have to specify `appDetails` since they are detected automatically if `useConnect` has been used already [for authentication](/build-apps/guides/authentication#usage-in-react-apps).
```tsx
import { useConnect } from '@stacks/connect-react';
const MyComponent = () => {
const { doContractCall } = useConnect();
const onClick = async () => {
const options = {
/** See examples above */
};
await doContractCall(options);
};
return <span onClick={onClick}>Call my contract</span>;
};
```
## Request testnet STX from faucet
You may find it useful to request testnet STX from [the Stacks Explorer sandbox](https://explorer.stacks.co/sandbox?chain=testnet) while developing your app with the Stacks testnet.
## Transaction request / response payload
Under the hood, `@stacks/connect` will serialize and deserialize data between your app and the Stacks Wallet.
These payloads are tokens that conform to the [JSON Web Token (JWT) standard](https://tools.ietf.org/html/rfc7519) with additional support for the `secp256k1` curve used by Bitcoin and many other cryptocurrencies.
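For debugging, you can inspect a request payload by base64url-decoding the token's middle segment. Note this sketch does not verify the secp256k1 signature and is not part of `@stacks/connect`:

```js
// Decode a JWT's payload segment without verifying its signature.
function decodeJwtPayload(token) {
  // base64url -> base64, then decode and parse
  const base64 = token.split('.')[1].replace(/-/g, '+').replace(/_/g, '/');
  return JSON.parse(Buffer.from(base64, 'base64').toString('utf8'));
}
```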
### Transaction Request Payload
When an application triggers a transaction from `@stacks/connect`, the options of that transaction are serialized into a `transactionRequest` payload. The `transactionRequest` is similar to the [authRequest](/build-apps/guides/authentication.md#authrequest-payload-schema) payload used for authentication.
The transaction request payload has the following schema, in addition to the standard JWT required fields:
```ts
interface TransactionRequest {
appDetails?: {
name: string;
icon: string;
};
// 1 = "allow", 2 = "deny".
postConditionMode?: PostConditionMode; // number
// Serialized version of post conditions
postConditions?: string[];
// JSON serialized version of `StacksNetwork`
// This allows the app to specify their default desired network.
// The user may switch networks before broadcasting their transaction.
network?: {
coreApiUrl: string;
chainID: ChainID; // number
};
// `AnchorMode` defined in `@stacks/transactions`
anchorMode?: AnchorMode; // number
// The desired default stacks address to sign with.
// There is no guarantee that the transaction is signed with this address;
stxAddress?: string;
txType: TransactionDetails; // see below
}
export enum TransactionTypes {
ContractCall = 'contract_call',
ContractDeploy = 'smart_contract',
STXTransfer = 'token_transfer',
}
interface ContractCallPayload extends TransactionRequest {
contractAddress: string;
contractName: string;
functionName: string;
// Serialized Clarity values to be used as arguments in the contract call
functionArgs: string[];
txType: TransactionTypes.ContractCall;
}
interface ContractDeployPayload extends TransactionRequest {
contractName: string;
// raw source code for this contract
codeBody: string;
txType: TransactionTypes.ContractDeploy;
}
interface StxTransferPayload extends TransactionRequest {
recipient: string;
// amount for this transaction, in microstacks
amount: string;
memo?: string;
txType: TransactionTypes.STXTransfer;
}
```
### Transaction Response payload
After the user signs and broadcasts a transaction, a `transactionResponse` payload is sent back to your app.
```ts
interface TransactionResponse {
txId: string;
// hex serialized version of this transaction
txRaw: string;
}
```
## StacksProvider injected variable
When users have the [Stacks Wallet for Web](https://www.hiro.so/wallet/install-web) browser extension installed, the extension will inject a global `StacksProvider` variable into the JavaScript context of your web application. This allows your JavaScript code to hook into the extension, and make authentication and transaction requests. `@stacks/connect` automatically detects and uses this global variable for you.
At the moment, only the Stacks Wallet for Web browser extension includes a `StacksProvider`. Ideally, more wallets (including mobile wallets) will support this format, so that your app can be compatible with any Stacks wallet that can embed web applications.
In your web application, you can check to see if the user has a compatible wallet installed by checking for the presence of `window.StacksProvider`.
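A minimal detection check might look like this (how your app reacts to a missing wallet is up to you):

```js
// Returns true when a wallet has injected the StacksProvider global.
function hasStacksProvider(win) {
  return typeof win !== 'undefined' && !!win.StacksProvider;
}

// in the browser:
// if (!hasStacksProvider(window)) { /* prompt the user to install a wallet */ }
```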
Here is the interface for the `StacksProvider` variable.
```ts
interface StacksProvider {
/**
* Make a transaction request
*
* @param payload - a JSON web token representing a transaction request
*/
transactionRequest(payload: string): Promise<TransactionResponse>;
/**
* Make an authentication request
*
* @param payload - a JSON web token representing an auth request
*
* @returns an authResponse string in the form of a JSON web token
*/
authenticationRequest(payload: string): Promise<string>;
getProductInfo:
| undefined
| (() => {
version: string;
name: string;
meta?: {
tag?: string;
commit?: string;
[key: string]: any;
};
[key: string]: any;
});
}
```

src/pages/build-apps/indexing/collaboration.md

---
title: Collaboration
description: Support private collaboration between multiple users with Radiks
---
## Introduction
A key feature of Radiks is support for private collaboration between multiple users. Supporting collaboration with
client-side encryption and user-owned storage can be complicated, but the patterns to implement it are generally the
same among most applications. Radiks supplies interfaces for collaboration, making it easy to build private,
collaborative apps.
You use the [`UserGroup`](https://github.com/blockstack/radiks/blob/master/src/models/user-group.ts) class to build a
collaborative group with Radiks. In this section, you learn about this class.
## Understand the UserGroup workflow
The key model behind a collaborative group is `UserGroup`. By default, it only has one attribute, `name`, which is
encrypted. You can subclass `UserGroup` with different attributes as needed.
The general workflow for creating a collaborative group that can share and edit encrypted models is as follows:
1. The admin of the group creates a new `UserGroup`.
This group acts as the 'hub' and controls the logic around inviting and removing users.
2. The admin invites one or more other users to a group:
- The admin specifies the username of the user they want to invite
- Radiks looks up the user's public key
- Radiks creates an 'invitation' that is encrypted with the user's public key, and contains information about the `UserGroup`
3. When the invited user 'activates' an invitation, they create a `GroupMembership`.
They use this membership instance to reference information (such as private keys and signing keys) related to the group.
As they participate in a group, the group's members can create and update models that are related to the group.
These models **must** contain a `userGroupId` attribute used to reference the group. This allows Radiks to know which
keys to use for encryption and signing.
When needed, the group admin can remove a user from a group. To remove a user from the group, the admin creates a
new private key for signing and encryption. Then, the admin updates the `GroupMembership` of all users _except_ the
user they just removed. This update-and-remove action is also known as rotating the key.
After a key is rotated, all new and updated models must use the new key for signing. Radiks-server validates all
group-related models to ensure that they're signed with the most up-to-date key.
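The rotation logic can be illustrated in isolation. This simulation is NOT the Radiks API; the encryption step is stubbed with a plain string purely to show the flow:

```js
// Simulate rotating the group key: every membership except the removed
// user's gets the new key; stale memberships can no longer sign updates.
function rotateGroupKey(memberships, removedUsername, newKey) {
  return memberships
    .filter(m => m.username !== removedUsername)
    .map(m => ({ ...m, encryptedGroupKey: `${newKey}:for:${m.username}` }));
}
```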
## Work with a UserGroup
This section details the methods on the [`UserGroup`](https://github.com/blockstack/radiks/blob/master/src/models/user-group.ts)
class you can use to create, add members to, and query a group.
### Create a UserGroup
To create a `UserGroup`, you must import the class into your application from `radiks`:
```jsx
import { UserGroup } from 'radiks';
// ...
```
Calling `create` on a new `UserGroup` will create the group and activate an invitation for the group's creator.
```jsx
const group = new UserGroup({ name: 'My Group Name' });
await group.create();
```
A group's creator is also the group's admin.
### Invite users to become members
Use the `makeGroupMembership` method on a `UserGroup` instance to invite a user. The only argument passed to this
method is the user's `username`.
```jsx
import { UserGroup } from 'radiks';
const group = await UserGroup.findById(myGroupId);
const usernameToInvite = 'hankstoever.id';
const invitation = await group.makeGroupMembership(usernameToInvite);
console.log(invitation._id); // the ID used to later activate an invitation
```
#### Generic invitation
You can also create a generic invitation that any user can activate, provided they have the randomly generated
secret key used to decrypt the invitation. The key is generated when the generic invitation
is created.
```jsx
import { GenericGroupInvitation, UserGroup } from 'radiks';
const group = await UserGroup.findById(myGroupId);
// Creating generic invitation
const genericInvitation = await GenericGroupInvitation.makeGenericInvitation(group);
console.log(genericInvitation._id); // the ID used to later activate an invitation
console.log(genericInvitation.secretCode); // the secretCode used to later activate an invitation
```
### Accept an invitation
Use the `activate` method on a `GroupInvitation` instance to activate an invitation on behalf of a user:
```jsx
import { GroupInvitation, GenericGroupInvitation } from 'radiks';
// For user-specific invitation
const invitation = await GroupInvitation.findById(myInvitationID);
await invitation.activate();
// For generic invitation
const genericInvitation = await GenericGroupInvitation.findById(myInvitationID);
await genericInvitation.activate(mySecretCode);
```
## View all activated UserGroups for the current user
Call `UserGroup.myGroups` to fetch all groups that the current user is a member of:
```jsx
import { UserGroup } from 'radiks';
const groups = await UserGroup.myGroups();
```
## Find a UserGroup
Use the `UserGroup.find(id)` method to fetch a specific UserGroup. This method includes extra logic to handle
decrypting the model, because the private keys may need to be fetched from different models.
```jsx
const group = await UserGroup.find('my-id-here');
```

347
src/pages/build-apps/indexing/models.md

@ -1,347 +0,0 @@
---
title: Models
description: Model and query application data with Radiks
---
## Introduction
Radiks allows you to model your client data. You can then query this data and display it for a
user in multi-player applications. A social application where users want to see the comments of
other users is an example of a multi-player application. This page explains how to create a model
in your distributed application using Radiks.
## Overview of Model class extension
Stacks Radiks provides a `Model` class you should extend to easily create, save, and fetch models.
To create a model class, import the `Model` class from `radiks` into your application.
```jsx
import { Model, User } from 'radiks';
```
Then, create a class that extends this model, and provide a schema. Refer to
[the `Model` class](https://github.com/blockstack/radiks/blob/master/src/model.ts) in the
`radiks` repo to get an overview of the class functionality.
Your new class must define a static `className` property. This property is used when
storing and querying information. If you fail to add a `className`, Radiks falls back to the
JavaScript class's own name, which can change when your code is minified, and your application will behave unpredictably.
The example class code extends `Model` to create a class named `Todo`:
```jsx
import { Model, User } from 'radiks';
class Todo extends Model {
static className = 'Todo';
static schema = {
// all fields are encrypted by default
title: String,
completed: Boolean,
};
}
// after authentication:
const todo = new Todo({ title: 'Use Radiks in an app' });
await todo.save();
todo.update({
completed: true,
});
await todo.save();
const incompleteTodos = await Todo.fetchOwnList({
// fetch todos that this user created
completed: false,
});
console.log(incompleteTodos.length); // 0
```
## How to create your own Model
The following sections guide you through the steps in defining your own class.
### Define a class schema
Every class must have a static `schema` property which defines the attributes of a model
using field/value pairs, for example:
```jsx
class Todo extends Model {
static className = 'Todo';
static schema = {
// all fields are encrypted by default
title: String,
completed: Boolean,
};
}
```
The key in this object is the field name, and the value is the type, for example, `String`, `Boolean`, or
`Number`. In this case, `title` is a `String` field. Alternatively, you can pass options instead of a type.
To define options, pass an object, with a mandatory `type` field. The only supported option right
now is `decrypted`. This defaults to `false`, meaning the field is encrypted before the data is stored
publicly. If you specify `true`, then the field is not encrypted.
Storing unencrypted fields is useful if you want to be able to query the field when fetching data.
A good use-case for storing decrypted fields is to store a `foreignId` that references a different model,
for a "belongs-to" type of relationship.
**Never add the `decrypted` option to fields that contain sensitive user data.** Stacks data is
stored in a decentralized Gaia storage and anyone can read the user's data. That's why encrypting it
is so important. If you want to filter sensitive data, then you should do it on the client-side,
after decrypting it.
### Include defaults
You may want to include an optional `defaults` static property for some field values. For example,
in the class below, the `likesDogs` field is a `Boolean`, and the default is `true`.
```jsx
import { Model } from 'radiks';
class Person extends Model {
static className = 'Person';
static schema = {
name: String,
age: Number,
isHuman: Boolean,
likesDogs: {
type: Boolean,
decrypted: true, // all users will know if this record likes dogs
},
};
static defaults = {
likesDogs: true,
};
}
```
If you wanted to add a default for `isHuman`, you would simply add it to the `defaults` as well.
Separate each field with a comma.
### Extend the User model
Radiks also supplies [a default `User` model](https://github.com/blockstack/radiks/blob/master/src/models/user.ts).
You can also extend this model to add your own attributes.
```jsx
import { User } from 'radiks';
// For example I want to add a public name on my user model
class MyAppUserModel extends User {
static schema = {
...User.schema,
displayName: {
type: String,
decrypted: true,
},
};
}
```
The default `User` model defines a `username`, but you can add a `displayName` to let users
set a unique name in your app.
## Use a model you have defined
In this section, you learn how to use a model you have defined.
### About the \_id attribute
All model instances have an `_id` attribute. An `_id` is used as a primary key when storing data and is used
for fetching a model. Radiks also creates a `createdAt` and `updatedAt` property when creating and saving models.
If, when constructing a model's instance, you don't pass an `_id`, Radiks creates an `_id` for you automatically.
This automatically created id uses the [`uuid/v4`](https://github.com/kelektiv/node-uuid) format. This
automatic `_id` is returned by the constructor.
### Construct a model instance
To create an instance, pass some attributes to the constructor of that class:
```jsx
const person = new Person({
name: 'Hank',
isHuman: false,
likesDogs: false, // just an example, I love dogs
});
```
### Fetch an instance
To fetch an existing instance of a model, you need the instance's `_id` property. Then, call the `findById()`
method or the `fetch()` method, both of which return a promise.
```jsx
const person = await Person.findById('404eab3a-6ddc-4ba6-afe8-1c3fff464d44');
```
After calling these methods, Radiks automatically decrypts all encrypted fields.
### Access attributes
Other than `_id`, all attributes are stored in an `attrs` property on the instance.
```jsx
const { name, likesDogs } = person.attrs;
console.log(`Does ${name} like dogs?`, likesDogs);
```
### Update attributes
To quickly update multiple attributes of an instance, pass those attributes to the `update` method.
```jsx
const newAttributes = {
likesDogs: false,
age: 30,
};
person.update(newAttributes);
```
Important: calling `update` does **not** save the instance.
### Save changes
To save an instance to Gaia and MongoDB, call the `save()` method, which returns a promise. This method encrypts
all attributes that do not have the `decrypted` option in their schema. Then, it saves a JSON representation of
the model in Gaia, as well as in MongoDB.
```jsx
await person.save();
```
### Delete an instance
To delete an instance, just call the `destroy` method on it.
```jsx
await person.destroy();
```
## Query a model
To fetch multiple records that match a certain query, use the class's `fetchList()` function. This method creates an
HTTP query to Radiks-server, which then queries the underlying database. Radiks-server uses the `query-to-mongo`
package to turn an HTTP query into a MongoDB query.
Here are some examples:
```jsx
const dogHaters = await Person.fetchList({ likesDogs: false });
```
Or, imagine a `Task` model with a `name`, a boolean for `completed`, and an `order` attribute.
```jsx
class Task extends Model {
static className = 'Task';
static schema = {
name: String,
completed: {
type: Boolean,
decrypted: true,
},
order: {
type: Number,
decrypted: true,
},
};
}
const tasks = await Task.fetchList({
completed: false,
sort: '-order',
});
```
You can read the [`query-to-mongo`](https://github.com/pbatey/query-to-mongo) package documentation to learn how
to do complex querying, sorting, limiting, and so forth.
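As a rough sketch of what `query-to-mongo` does with the queries above (greatly simplified; the real package also supports comparison operators, field selection, limits, and more):

```jsx
// Simplified illustration of turning an HTTP query string into a MongoDB
// filter and sort, in the spirit of the query-to-mongo package.
function toMongo(queryString) {
  const criteria = {};
  const options = {};
  for (const [k, v] of new URLSearchParams(queryString)) {
    if (k === 'sort') {
      // a leading '-' means descending order
      options.sort = v.startsWith('-') ? { [v.slice(1)]: -1 } : { [v]: 1 };
    } else {
      // coerce the booleans used in the examples above
      criteria[k] = v === 'true' ? true : v === 'false' ? false : v;
    }
  }
  return { criteria, options };
}

const { criteria, options } = toMongo('completed=false&sort=-order');
// criteria → { completed: false }, options → { sort: { order: -1 } }
```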
## Count models
You can also count matching records directly with the static `count` method.
```jsx
const dogHaters = await Person.count({ likesDogs: false });
// dogHaters is the count number
```
## Fetch models created by the current user
Use the `fetchOwnList` method to find instances that were created by the current user. This method preserves
privacy, because Radiks uses a `signingKey` that only the current user knows.
```jsx
const tasks = await Task.fetchOwnList({
completed: false,
});
```
## Manage relational data
It is common for applications to have multiple different models, where some reference another. For example,
imagine a task-tracking application where a user has multiple projects, and each project has multiple tasks.
Here's what those models might look like:
```jsx
class Project extends Model {
static className = 'Project';
static schema = { name: String }
}
class Task extends Model {
static className = 'Task';
static schema = {
name: String,
projectId: {
type: String,
decrypted: true,
},
completed: Boolean
}
}
```
Whenever you save a task, you should save a reference to the project it's in:
```jsx
const task = new Task({
name: 'Improve radiks documentation',
projectId: project._id,
});
await task.save();
```
Later, you can fetch all tasks for a given project:
```jsx
const tasks = await Task.fetchList({
projectId: project._id,
});
```
Radiks lets you define an `afterFetch` method. Use this method to automatically fetch child records when you
fetch the parent instance.
```jsx
class Project extends Model {
static className = 'Project';
static schema = { name: String };
async afterFetch() {
this.tasks = await Task.fetchList({
projectId: this._id,
});
}
}
const project = await Project.findById('some-id-here');
// will already have fetched and decrypted all related tasks
console.log(project.tasks);
```

93
src/pages/build-apps/indexing/overview.md

@ -1,93 +0,0 @@
---
title: Overview
description: Build multi-player apps that index, store, and query user data with Radiks
images:
large: /images/pages/radiks.svg
sm: /images/pages/radiks-sm.svg
---
## Introduction
The Stacks Radiks feature enables Stacks decentralized applications (DApps) to index and store data belonging
to multiple users. Radiks works with the Stacks Gaia storage system. Using Radiks, you can build
multi-player DApps that:
- index, store, and query application data
- query a user's publicly saved data
- display real-time updates that reflect in-progress changes
- support collaboration among sets of users
Want to jump right in and start integrating indexing into your app? [Try this tutorial](/build-apps/tutorials/indexing).
## Why use Radiks?
Many applications serve data that users create to share publicly with others. Facebook, Twitter, and Instagram are
examples of such applications. Decentralized applications that want to create comparable multi-user experiences must
ensure that anything a user creates for public sharing remains under the creator's control in the user's Gaia storage.
For example, if Twitter wanted to be a decentralized application while still having many different users creating their
own tweets, those tweets would be stored in each user's own Gaia storage. In such a situation, Twitter still needs a way
to keep track of everyone's tweets, display those tweets in user timelines, and perform searches across the platform.
Radiks exists to support these kinds of scenarios. It allows applications to run complex queries, such as text
search, joins, and filters, across multiple users' data.
Radiks allows applications to query data in a performant and flexible way. Each application that wishes to index and
query in this way requires its own Radiks server.
## How Radiks works with application data
Radiks consists of a database, a pre-built server, and a client. A developer adds the Radiks library to their application.
With this library, developers model their application data. The model defines an application data schema for the Radiks
server. Then, you can use calls to write and query data that use that schema. Whenever an application saves or updates
data on behalf of a user, Radiks follows this flow:
1. Encrypts private user data on the client-side.
2. Saves a raw JSON of this encrypted data in the user's Gaia storage.
3. Stores the encrypted data on the Radiks server.
Radiks can store both public and sensitive, non-public data since all data is encrypted by default before it leaves the
client. Your application can query Radiks for public data and then decrypt the sensitive information on the client.
Radiks servers can only return queries for unencrypted data.
## How Radiks authorizes writes
Radiks must ensure that users write only to their own data. To enforce this, Radiks creates and manages _signing keys_.
These keys sign all writes that a user performs. The Radiks server validates all signatures before performing a write,
which guarantees that a user cannot overwrite another user's data.
A Radiks server is also built to support writes in a collaborative but private situation. For example, consider a
collaborative document editing application, where users can create organizations and invite users to that organization.
All users in that organization should have read and write privileges to the organization data. Thus, these organizations
will have a single shared key that is used to sign and encrypt data.
When an organization administrator needs to remove a user from the group, they are expected to revoke the previous key
and create a new one. Radiks is aware of these relationships, and will only support writes that are signed with the
currently active key related to an organization.
## Is Radiks decentralized?
Although Radiks applications rely on a centrally hosted database, an application using Radiks remains fundamentally
decentralized. A DApp that uses Radiks has the following characteristics.
### Built on decentralized authentication
Radiks is deeply tied to Stacks authentication, which uses a blockchain and Gaia to give you full control over
your user data.
### No data lock-in
All user data is encrypted with the user's keys and stored in Gaia before being stored in Radiks. This means the
user controls their data for as long as they need to. If the application's Radiks server shuts down, the user can
still access their data, and without the user's signing keys, an application cannot decrypt it.
Users may also back up or migrate their application data from Gaia.
### Censorship resistance
All data is also stored in Gaia; no third-party can revoke access to this data.
### Maximum privacy
All data is encrypted on the client-side before being stored anywhere using Stacks authorization. The application
host cannot inspect, sell, or use user data in any way that a user doesn't explicitly authorize.
If you are not familiar with Gaia, [read the Gaia documentation](/build-apps/references/gaia).

163
src/pages/build-apps/indexing/server.md

@ -1,163 +0,0 @@
---
title: Server
description: Tips and tricks for working with Radiks server
---
## Access the MongoDB collection
Radiks-server keeps all models inside of a collection. You can use the `getDB` function to access this collection from inside your application.
```jsx
const { getDB } = require('radiks-server');
const mongo = await getDB(MONGODB_URL);
```
[See the MongoDB Collection reference](https://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html) for documentation about how you can interact with this collection.
## Run a custom Radiks-server
If you're using an [express.js](https://expressjs.com/) server to run your application, it's probably easiest to use the Radiks-server middleware. This way, you won't have to run a separate application server and Radiks server.
Radiks-server includes an easy-to-use middleware that you can include in your application:
```jsx
const express = require('express');
const { setup } = require('radiks-server');
const app = express();
setup().then(RadiksController => {
app.use('/radiks', RadiksController);
});
```
The `setup` method returns a promise, and that promise resolves to the actual middleware that your server can use. This is because it first connects to MongoDB, and then sets up the middleware with that database connection.
The `setup` function accepts an `options` object as the first argument. If you aren't using environment variables, you can explicitly pass in a MongoDB URL here:
```jsx
setup({
mongoDBUrl: 'mongodb://localhost:27017/my-custom-database',
}).then(RadiksController => {
app.use('/radiks', RadiksController);
});
```
Currently, only the `mongoDBUrl` option is supported.
## Migrate from Firebase (or anywhere else)
Migrating data from Firebase to Radiks-server is simple and painless. Create a script file that fetches all the Firebase data using their API. Then, use your `MONGODB_URI` config with the `mongodb` npm package.
```js
// Script for transferring users from Firebase to Radiks-server
const { getDB } = require('radiks-server');
const { mongoURI } = require('......'); // How you import/require your mongoURI is up to you
const migrate = async () => {
// `mongo` is a reference to the MongoDB collection that radiks-server uses.
// You can add or edit or update data as necessary.
const mongo = await getDB(mongoURI);
/**
* Call code to get your users from firebase
* const users = await getUsersFromFirebase();
* OR grab the Firebase JSON file and set users to that value
* How you saved your user data will probably be different than the example below
*/
const users = {
'-LV1HAQToANRvhysSClr': {
blockstackId: '1N1DzKgizU4rCEaxAU21EgMaHGB5hprcBM',
username: 'kkomaz.id',
},
};
const usersToInsert = Object.values(users).map(user => {
const { username } = user;
const doc = {
username,
_id: username,
radiksType: 'BlockstackUser',
};
const op = {
updateOne: {
filter: {
_id: username,
},
update: {
$setOnInsert: doc,
},
upsert: true,
},
};
return op;
});
await mongo.bulkWrite(usersToInsert);
};
migrate()
.then(() => {
console.log('Done!');
process.exit();
})
.catch(error => {
console.error(error);
process.exit();
});
```
## Streaming real-time changes
`Radiks-server` provides a websocket endpoint that will stream all new inserts and updates that it sees on the server. `Radiks` provides a helpful interface to poll for these changes on a model-by-model basis. For example, if you had a `Task` model, you could get real-time updates on all your tasks. This is especially useful in collaborative environments. As soon as a collaborator updates a model, you can get the change in real-time, and update your views accordingly.
Before you can use the websocket functionality, you must configure your Radiks-server with [express-ws](https://github.com/HenningM/express-ws):
```jsx
const app = express();
expressWS(app);
```
Here's an example for how to use the API:
```jsx
import Task from './models/task';
const streamCallback = task => {
// this callback will be called whenever a task is created or updated.
// `task` is an instance of `Task`, and all methods are defined on it.
// If the user has the necessary keys to decrypt encrypted fields on the model,
// the model will be decrypted before the callback is invoked.
if (task.projectId === myAppsCurrentProjectPageId) {
// update your view here with this task
}
};
Task.addStreamListener(streamCallback);
// later on, you might want to remove the stream listener (if the
// user changes pages, for example). When calling `removeStreamListener`,
// you MUST provide the exact same callback that you used with `addStreamListener`.
Task.removeStreamListener(streamCallback);
```
## Saving centralized user-related data
Sometimes, you need to save some data on behalf of the user that only the server is able to see. A common use case for this is when you want to notify a user, and you need to store, for example, their email. This should be updatable only by the user, and only the server (or that user) should be able to see it. Radiks provides the `Central` API to handle this:
```jsx
import { Central } from 'radiks';
const key = 'UserSettings';
const value = { email: 'myemail@example.com' };
await Central.save(key, value);
const result = await Central.get(key);
console.log(result); // { email: 'myemail@example.com' }
```

19
src/pages/build-apps/overview.md

@ -8,13 +8,11 @@ images:
## Introduction
Apps built with the Stacks blockchain give users control over their digital identities, assets and data.
Apps built with the Stacks blockchain give users control over their digital identities, assets, and data.
Unlike most cloud-based apps, they are "decentralized" since they don't depend on any centralized platform, server or database to function. Rather, they use the Stacks blockchain to authenticate users and facilitate read and write requests for them without any single point of failure or trust.
Unlike most cloud-based apps, they are "decentralized" since they don't depend on any centralized platform, server, or database to function. Rather, they use the Stacks blockchain to authenticate users and facilitate read and write requests for them without any single point of failure or trust.
This page provides information on how to build such apps using [Stacks.js](https://github.com/blockstack/stacks.js) and other libraries that make integration of the Stacks blockchain easy for front-end developers.
Three main integrations are available:
Stacks provides three main functions for building apps:
- **Authentication**: Register and sign users in with identities on the Stacks blockchain
- **Transaction signing**: Prompt users to sign and broadcast transactions to the Stacks blockchain
@ -22,14 +20,9 @@ Three main integrations are available:
All three of these integrations can be used together to create powerful new user experiences that rival or exceed those of traditional apps—all while protecting your users' digital rights.
While integration is possible for any type of app, most of the resources available here are for web developers experienced with JavaScript.
## Guides
[@page-reference | grid]
| /build-apps/guides/authentication, /build-apps/guides/transaction-signing, /build-apps/guides/data-storage
While integration is possible for any type of app, most of the resources available here are for web developers experienced with JavaScript. See [Hiro developer docs](https://docs.hiro.so) for more information on the available app development libraries for Stacks.
## Example apps
## References
[@page-reference | grid]
| /build-apps/examples/todos, /build-apps/examples/heystack, /build-apps/examples/public-registry, /build-apps/examples/angular
| /build-apps/references/authentication, /build-apps/references/bns, /build-apps/references/gaia

76
src/pages/build-apps/references/authentication.md

@ -0,0 +1,76 @@
---
title: Authentication
description: Register and sign in users with identities on the Stacks blockchain
images:
large: /images/pages/write-smart-contracts.svg
sm: /images/pages/write-smart-contracts-sm.svg
---
## Introduction
This guide explains how authentication is performed on the Stacks blockchain.
Authentication provides a way for users to identify themselves to an app while retaining complete control over their credentials and personal details. It can be integrated alone or used in conjunction with [transaction signing](/build-apps/guides/transaction-signing) and [data storage](/build-apps/guides/data-storage), for which it is a prerequisite.
Users who register for your app can subsequently authenticate to any other app with support for the [Blockchain Naming System](/build-apps/references/bns) and vice versa.
## How it works
The authentication flow with Stacks is similar to the typical client-server flow used by centralized sign in services (for example, OAuth). However, with Stacks the authentication flow happens entirely client-side.
An app and authenticator, such as [the Stacks Wallet](https://www.hiro.so/wallet/install-web), communicate during the authentication flow by passing back and forth two tokens. The requesting app sends the authenticator an `authRequest` token. Once a user approves authentication, the authenticator responds to the app with an `authResponse` token.
These tokens are based on [a JSON Web Token (JWT) standard](https://tools.ietf.org/html/rfc7519) with additional support for the `secp256k1` curve used by Bitcoin and many other cryptocurrencies. They are passed via URL query strings.
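A token in this format is three base64url-encoded segments joined by dots, and the payload segment can be decoded without verification. The payload below is invented for illustration; it is not a real `authRequest`:

```jsx
// header.payload.signature — decode the payload segment of a JWT-style token.
const b64url = obj => Buffer.from(JSON.stringify(obj)).toString('base64url');
const token = [
  b64url({ typ: 'JWT', alg: 'ES256K' }), // ES256K: ECDSA over secp256k1
  b64url({ iss: 'app.example.com', exp: 1700000000 }),
  'fake-signature', // a real token carries a secp256k1 signature here
].join('.');

const decoded = JSON.parse(Buffer.from(token.split('.')[1], 'base64url').toString('utf8'));
console.log(decoded.iss); // 'app.example.com'
```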
When a user chooses to authenticate an app, it sends the `authRequest` token to the authenticator via a URL query string with an equally named parameter:
`https://wallet.hiro.so/...?authRequest=j902120cn829n1jnvoa...`
When the authenticator receives the request, it generates an `authResponse` token for the app using an _ephemeral transit key_. The ephemeral transit key is used only for the particular instance of the app, in this case, to sign the `authRequest`.
The app stores the ephemeral transit key during request generation. The public portion of the transit key is passed in the `authRequest` token. The authenticator uses the public portion of the key to encrypt an _app private key_ which is returned via the `authResponse`.
The authenticator generates the app private key from the user's _identity address private key_ and the app's domain. The app private key serves three functions:
1. It is used to create credentials that give the app access to a storage bucket in the user's Gaia hub.
2. It is used in the end-to-end encryption of files stored for the app in the user's Gaia storage.
3. It serves as a cryptographic secret that apps can use to perform other cryptographic functions.
Finally, the app private key is deterministic, meaning that the same private key will always be generated for a given Stacks address and domain.
The first two of these functions are particularly relevant to [data storage with Stacks.js](/build-apps/guides/data-storage).
## Key pairs
Authentication with Stacks makes extensive use of public key cryptography generally and ECDSA with the `secp256k1` curve in particular.
The following sections describe the three public-private key pairs used, including how they're generated, where they're used and to whom private keys are disclosed.
### Transit private key
The transit private key is an ephemeral key used to encrypt secrets that
need to be passed from the authenticator to the app during the
authentication process. It is randomly generated by the app at the start of
the authentication flow.
The public key that corresponds to the transit private key is stored in a single
element array in the `public_keys` key of the authentication request token. The
authenticator encrypts secret data such as the app private key using this
public key and sends it back to the app when the user signs in to the app. The
transit private key signs the app authentication request.
### Identity address private key
The identity address private key is derived from the user's keychain phrase and
is the private key of the Stacks username that the user chooses to use to sign in
to the app. It is a secret owned by the user and never leaves the user's
instance of the authenticator.
This private key signs the authentication response token for an app to indicate that the user approves sign in to that app.
### App private key
The app private key is an app-specific private key that is generated from the
user's identity address private key using the `domain_name` as input.
The app private key is securely shared with the app on each authentication, encrypted by the authenticator with the transit public key. Because the transit key is only stored on the client side, this prevents a man-in-the-middle attack where a server or internet provider could potentially snoop on the app private key.

3
src/pages/build-apps/references/bns.md

@ -1,6 +1,9 @@
---
title: Blockchain Naming System
description: Binds Stacks usernames to off-chain state
images:
large: /images/nodes.svg
sm: /images/nodes.svg
---
Blockchain Naming System (BNS) is a network system that binds Stacks usernames

18
src/pages/index.md

@ -1,8 +1,10 @@
---
title: Stacks documentation
description: Write Clarity smart contracts, build apps, and starting mining with the Stacks blockchain
description: Learn about Stacks mining, the STX token, and the Clarity smart contract language
---
-> Content related to developer tools and app development has recently moved to [docs.hiro.so](https://docs.hiro.so/). For more information on the content move, see [this post](https://forum.stacks.org/t/the-evolution-of-the-stacks-documentation-and-a-new-hiro-docs-site/12343) on the Stacks forum.
## Understand Stacks
[@page-reference | grid]
@ -11,19 +13,9 @@ description: Write Clarity smart contracts, build apps, and starting mining with
## Write smart contracts
[@page-reference | grid]
| /write-smart-contracts/overview, /write-smart-contracts/hello-world-tutorial, /write-smart-contracts/tokens, /write-smart-contracts/nft-tutorial
## Build apps
[@page-reference | grid]
| /build-apps/guides/authentication, /build-apps/guides/transaction-signing, /build-apps/guides/data-storage
| /write-smart-contracts/overview, /write-smart-contracts/tokens
## Start mining
[@page-reference | grid]
| /start-mining/mainnet, /start-mining/testnet, /understand-stacks/running-mainnet-node, /understand-stacks/running-testnet-node, /understand-stacks/running-regtest-node
## Ecosystem
[@page-reference | grid-small]
| /ecosystem/overview, /ecosystem/stacks-token, /ecosystem/contributing
| /start-mining/mainnet, /start-mining/testnet

2
src/pages/references/bns-contract.md

@ -8,7 +8,7 @@ import { BNSErrorcodeReference, BNSFunctionReference } from '@components/bns-ref
## Introduction
The [Blockchain Naming System (BNS)](/technology/naming-system) is implemented as a smart contract using Clarity.
The [Blockchain Naming System (BNS)](/understand-stacks/protocols/bns) is implemented as a smart contract using Clarity.
Below is a list of public and read-only functions as well as error codes that can be returned by those methods.

2
src/pages/references/faqs.md

@ -13,7 +13,7 @@ Developers, get started building apps and contracts on the [developer page at st
## Stacks Network
Learn more about the network behind the user-owned internet on Bitcoin in the [Understand Stacks chapter](https://docs.blockstack.org/understand-stacks/overview) in the docs.
Learn more about the network behind the user-owned internet on Bitcoin in the [Understand Stacks chapter](/understand-stacks/overview) in the docs.
## Stacks Token

46
src/pages/references/stacks-cli.md

@ -1,46 +0,0 @@
---
title: Stacks CLI
description: Interacting with the Stacks 2.0 Blockchain via CLI
---
export { convertBlockstackCLIUsageToMdx as getStaticProps } from '@common/data/cli-ref'
import { CLIReferenceTable } from '@components/cli-reference'
## Introduction
The command line is intended for developers only. Developers can use the command
line to interact with the Stacks blockchain as well as test and debug Stacks applications:
- Make token transfer transactions
- Deploy smart contracts
- Call smart contract functions
- Generate new keychains
- Convert between testnet, mainnet keys
- Load, store, and list data in Gaia hubs
!> Many of the commands operate on unencrypted private keys. For this reason, **DO NOT** use this tool for day-to-day tasks as you risk the security of your keys.
You must install the command line before you can use the commands.
## How to install the command line
You must have [Node.js](https://nodejs.org/en/download/) v12 or higher (v14 recommended). macOS and Linux users can avoid `sudo` and [permissions problems](https://docs.npmjs.com/resolving-eacces-permissions-errors-when-installing-packages-globally) by using [`nvm`](https://github.com/nvm-sh/nvm). These instructions assume you are using a macOS or Linux system.
To install the command line, do the following:
```bash
npm install -g @stacks/cli
```
### Troubleshooting the CLI installation
If you run into `EACCES` permissions errors, try the following:
- See https://docs.npmjs.com/resolving-eacces-permissions-errors-when-installing-packages-globally.
- Use [`Node Version Manager`](https://github.com/nvm-sh/nvm).
## List of commands
To see the usage and options for the command in general, enter `stx` without any subcommands. To see a list of subcommands, enter `stx help`. To see a subcommand with its usage, enter `stx SUBCOMMAND_NAME help`. The following are the available subcommands:
<CLIReferenceTable mdx={props.mdx} />

5
src/pages/start-mining/mainnet.md

@ -13,10 +13,7 @@ images:
## Introduction
Make sure you've followed the [Running mainnet node](/understand-stacks/running-mainnet-node) tutorial. Once completed it's only a few more steps to run a proof-of-burn miner on the mainnet.
[@page-reference | inline]
| /understand-stacks/running-mainnet-node
Make sure you've followed the [Running mainnet node](/understand-stacks/running-mainnet-node) procedure. Once completed it's only a few more steps to run a proof-of-burn miner on the mainnet.
If you're interested in mining on the testnet, you can find instructions on how to do that here:

5
src/pages/start-mining/testnet.md

@ -13,10 +13,7 @@ images:
## Introduction
Make sure you've followed the [Running testnet node](/understand-stacks/running-testnet-node) tutorial. Once completed it's only a few more steps to run a proof-of-burn miner on the testnet.
[@page-reference | inline]
| /understand-stacks/running-testnet-node
Make sure you've followed the [Running testnet node](/understand-stacks/running-testnet-node) procedure. Once completed it's only a few more steps to run a proof-of-burn miner on the testnet.
If you want to learn more about the technical details of mining, please review the mining guide:

161
src/pages/storage-hubs/amazon-ec2-deploy.md

@ -1,161 +0,0 @@
---
title: Deploying Gaia Hub on Amazon EC2
description: Use a template to deploy a Gaia hub on Amazon EC2
---
## Introduction
The template provided on this page offers an easy way to deploy a Gaia hub directly to Amazon EC2. You can use this
template to deploy your own Gaia hub to your Amazon Web Services (AWS) account. Amazon EC2 is an affordable and
convenient cloud computing provider. The template provides a one-click deploy for Amazon EC2 with either S3 or EBS as a
storage provider.
## Prerequisites
This procedure uses Amazon CloudFormation to configure an EC2 cloud compute provider to run the Gaia hub service with
an S3 or EBS provider for file storage. You should have access to an AWS account either through your personal account or
through a corporate account. This account should have permissions to create resources.
Additionally, you must also own a domain name and be able to update the DNS records associated with that domain name.
The procedure on this page uses a free domain created on [freenom][]; the general procedure is similar on other domain
name providers.
## Launching the template
Use a link in the table to launch the CloudFormation template in the AWS region that you wish to deploy a Gaia hub.
| Template name | Description | Launch |
| ------------------------ | -------------------------------------------------------------------- | -------------------------------------------------------------------- |
| Gaia Hub EC2 (us-east-1) | Deploys a Gaia hub service on EC2 with an S3 or EBS storage provider | [![](/images/cloudformation-launch-stack-button.png)][ec2-us-east-1] |
## Task 1: Configure the CloudFormation template
Before launching your Gaia hub, you must configure the template with the appropriate values for your hub and the domain
it runs on.
1. Launch the template using the **Launch stack** button in the preceding table.
2. Review the **Prepare template** and **Template source** fields to ensure that the fields contain template values.
![CloudFormation specify template](/images/cloudformation-specify-template.png)
3. Click **Next**.
4. Specify configuration details for your Gaia hub:
i. Enter the **Stack name**. This name must be lowercase and unique within your AWS account.
![CloudFormation stack name](/images/cloudformation-stack-name.png)
ii. Enter the domain name on which the hub should run in the **DomainName** field. You must own this domain name and
be able to update the DNS records associated with it.
![CloudFormation domain name](/images/cloudformation-domain-name.png)
iii. Enter an email address associated with the Gaia hub in the **EmailAddress** field. This should be a valid email
that you have access to.
![CloudFormation email](/images/cloudformation-email.png)
iv. Enter the name of the S3 bucket to create for data storage in the **GaiaBucketName** field. The template combines
this name with the stack name to create a unique S3 bucket. The template ignores this field if **GaiaStorageType** is
set to `disk`.
![CloudFormation bucket name](/images/cloudformation-bucket.png)
v. Select the **GaiaStorageType** to use as a backend for the Gaia hub. Selecting `s3` causes the template
to create an S3 bucket based on the name given in the previous step. Selecting `disk` causes the template to attach a
separate EBS volume to the EC2 instance for Hub storage.
![CloudFormation storage type](/images/cloudformation-storage-type.png)
vi. Select an available **InstanceType** from the drop-down. The default value is `t2.micro`.
vii. In the **KeyName** drop-down, select an [EC2 KeyPair](https://console.aws.amazon.com/ec2/v2/home?region=us-east-1#KeyPairs:)
to enable SSH access to the EC2 instance. You should download the `.pem` keyfile for this pair from the EC2 console.
For more information see the [EC2 key pair documentation](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html#prepare-key-pair).
![CloudFormation key name](/images/cloudformation-keyname.png)
viii. Leave the **SSHLocation** field with the default value of `0.0.0.0/0` to enable SSH access from any IP address.
If you wish to restrict SSH access to the EC2 instance to a certain IP, you can update this field.
ix. Select a _public_ subnet and its associated virtual private cloud (VPC) from the **SubnetId** and **VpcId** drop-downs to designate
where to deploy the Gaia hub instance.
![CloudFormation subnet and VPC](/images/cloudformation-subnet.png)
5. Click **Next**.
6. Configure stack options for your Gaia hub:
i. Enter any key-value pairs you wish to include as tags for your Gaia hub. These are optional and display-only; they
have no effect on the behavior of the CloudFormation stack.
ii. Select an IAM role for the Gaia hub. This is optional; if you don't specify an IAM role, the container runs
with the same permissions as your AWS account.
7. Click **Next**.
8. Review the configuration of your Gaia hub, and make any changes necessary.
9. At the bottom of the page, select the checkbox to acknowledge that AWS may create IAM resources with custom names.
![CloudFormation IAM resources](/images/cloudformation-iam-resources.png)
10. Click **Create stack** to launch your Gaia hub. The AWS console displays the summary of the stack.
## Task 2: Retrieve the public IP of your Gaia hub
Your stack can take several minutes to launch. You can monitor the **Events** tab of your hub to review the current
progress of the launch. When the launch is complete, the **Outputs** tab displays information about the hub. Select
the **PublicIP** and copy it to configure your domain name.
![CloudFormation outputs](/images/cloudformation-details.png)
## Task 3: Configure a domain name
Connect your domain to the Gaia hub EC2 instance by creating an `A` record for the domain DNS entry that points at the
public IP from the previous step. The general procedure is similar across hostname providers.
If you are using a free domain from [freenom], use the following procedure:
1. Log in to your freenom account.
2. Under **Services** > **My Domains**, click **Manage Domain** next to the domain corresponding to your Gaia hub.
![My freenom domains](/images/freenom-my-domains.png)
3. Click **Manage Freenom DNS** in the tab bar.
4. In the **Add Records** table, select `A` from the **Type** drop-down, then paste the public IP of your Gaia hub EC2
instance in the **Target** field.
5. Click **Save Changes** to update the DNS record.
-> It can take up to 15 minutes for your DNS record changes to propagate. In a terminal, use the command
`dig A +short <yourdomain.co>` to check whether the changes have propagated. If the output of this command is the container
public IP, the changes were successful.
## Accessing your Gaia hub with SSH
To SSH into your Gaia hub EC2 host directly, you must have the keyfile used in container creation. Access the host with
the following command in your terminal:
```bash
ssh -i <your keyfile.pem> admin@<public_ip_address>
```
## Making changes to your Gaia hub
If you want to make changes to your Gaia hub, there are two options. You can delete the entire CloudFormation stack
using the **Delete** button in the CloudFormation dashboard. Once you have deleted the stack, you can re-create one and
modify your DNS to point at the new public IP.
![CloudFormation delete](/images/cloudformation-delete.png)
To modify the instance in place, navigate to the [EC2 instances console][] and type the instance name into the
**Filter instances** field. Select the instance from the search suggestion, then click the instance to select it.
On the **Tags** tab, you can click **Manage tags** to update the relevant key value pairs for the instance. If you make
changes to these tags, click **Save**, then select **Reboot instance** from the **Instance state** drop-down. Changes to
the tags are only applied to the instance at reboot.
![EC2 instances](/images/ec2-instances.png)
[ec2-us-east-1]: https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=gaia&templateURL=https://cf-templates-18jq0t04gve7c-us-east-1.s3.amazonaws.com/cloudformation.yaml
[freenom]: https://freenom.com
[ec2 instances console]: https://console.aws.amazon.com/ec2/v2/home?region=us-east-1#Instances

583
src/pages/storage-hubs/digital-ocean-deploy.md

@ -1,583 +0,0 @@
---
title: Deploy on DigitalOcean
description: Learn how to run a Gaia hub on DigitalOcean
---
## Introduction
This tutorial teaches you how to run a Gaia storage hub on DigitalOcean (DO). DigitalOcean is an affordable and convenient cloud computing provider. This example uses DigitalOcean Spaces for file storage. A space is equivalent to AWS's S3 file storage solution.
DigitalOcean provides you with compute machines known as **Droplets** and storage called **Spaces**. You need both to run a Gaia hub. The Gaia hub setup you create here requires a DigitalOcean Droplet with Docker pre-installed and a 250 GB Space. The Droplet and the Space each run for \$5/month, for a total of \$10/month.
<div class="uk-card uk-card-default uk-card-body">
<h5>Is this tutorial for you?</h5>
<p>This documentation is appropriate for advanced power users who are familiar with command line tools, editing configuration files, and basic configuration of services such as DNS or Nginx.</p>
<p>If you are planning on running an <em>open-membership hub</em> or an <em>application-specific hub</em>, refer to the <a href="/storage-hubs/overview">Hub Overview</a>.</p>
</div>
## Prerequisites you need
You use DigitalOcean to choose and configure the assets needed to run droplets and spaces. To enable this, you must complete the prerequisites in this section.
You must create an account on <a href="https://digitalocean.com" target="\_blank">DigitalOcean</a>. DigitalOcean requires you to supply a credit card to create an account. You are only charged for the services you use; as of this writing, the Gaia hub should cost \$10 USD a month.
The easiest way to interact with your droplet is the DigitalOcean Console. Users who are comfortable using the secure shell (SSH) and private keys may prefer to open a local terminal on their home machine instead. To enable this, you should ensure you have the following prerequisites completed.
- Locate an existing SSH key pair on your Mac or <a href="https://help.dreamhost.com/hc/en-us/articles/115001736671-Creating-a-new-Key-pair-in-Mac-OS-X-or-Linux" target="\_blank">create a new SSH key pair</a>. Your key should have a passphrase; do not use a key pair without one.
- Add the <a href="https://www.digitalocean.com/docs/droplets/how-to/add-ssh-keys/to-account/" target="\_blank">SSH key from your local machine to DigitalOcean</a>.
- Create a <a href="https://www.digitalocean.com/docs/api/create-personal-access-token/" target="\_blank">personal access token in DigitalOcean</a>.
- Install `doctl`, the DigitalOcean command line utility. Check out the [installation instructions](https://github.com/digitalocean/doctl/blob/master/README.md#installing-doctl) to see how to install it on your computer.
## Task 1: Create a DigitalOcean space
In this task you create a **Space** which is where Gaia stores your files.
1. Choose **Create > Spaces** from the dashboard menu.
![Dashboard](/images/create-space.png)
2. Scroll down to the **Choose a datacenter region** section.
~> Choose a region that is both geographically close to you and that supports spaces. Currently, <strong>San Francisco</strong>, <strong>New York</strong>, <strong>Amsterdam</strong>, and <strong>Singapore</strong> support spaces.
The geographical location of your server impacts latency for storing data.
You select a region close to you so that when you use Stacks apps,
storing data is quicker.
3. Scroll down to **Finalize and Create**.
4. **Choose a unique name**.
This name is used when reading files that you've stored through Gaia. You'll need to remember this name when you set up your Gaia server later on.
5. Click **Create a Space**.
After a moment, your Space is up and running.
## Task 2: Enable File Listing and Set a Bucket Policy
On Digital Ocean, set **Enable File Listing**:
1. Navigate to the **Spaces** tab.
2. Select your newly created space and click **Settings**
3. Set **Enable File Listing** for your space.
4. Press **Save**.
On your local workstation, create a bucket policy to grant read permission on your space.
1. On your local workstation, open a terminal.
2. <a href="https://www.digitalocean.com/docs/spaces/resources/s3cmd/" target="_blank">Install and configure the <strong>s3cmd</strong></a>.
3. In the current directory, use the `touch` command to create a file called `gaiahub-policy`.
```bash
touch gaiahub-policy
```
4. Use your favorite editor to open the file.
5. Add the following policy to the file.
```json
{
"Version": "2012-10-17",
"Id": "read policy",
"Statement": [
{
"Sid": "PublicRead",
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::<SPACE_NAME>/*"
}
]
}
```
6. Edit the `Resource` line and replace the `<SPACE_NAME>` with your space name from Digital Ocean.
For example, if your space is named `meepers`, after editing the line you would have:
```yaml
'Resource': 'arn:aws:s3:::meepers/*'
```
Be sure not to change any of the other fields, especially `Version`.
7. Save and close the file.
8. Use `s3cmd` to enact the policy.
```bash
s3cmd setpolicy gaiahub-policy s3://<SPACE_NAME>
```
Be sure to replace `SPACE_NAME` with the name of your space, for example:
```bash
s3cmd setpolicy gaiahub-policy s3://meepers
```
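Steps 3 through 8 above can also be scripted. The sketch below (the space name `meepers` is a placeholder; substitute your own) generates the policy file from a shell variable and sanity-checks the substitution before you apply it with `s3cmd setpolicy`:

```bash
# Placeholder space name; replace with your own space's name.
SPACE_NAME="meepers"

# Generate the bucket policy with the space name substituted in,
# instead of editing the Resource line by hand.
cat > gaiahub-policy <<EOF
{
  "Version": "2012-10-17",
  "Id": "read policy",
  "Statement": [
    {
      "Sid": "PublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::${SPACE_NAME}/*"
    }
  ]
}
EOF

# Confirm the substitution worked before running s3cmd setpolicy.
grep "arn:aws:s3:::${SPACE_NAME}/\*" gaiahub-policy
```

Scripting the substitution avoids the most common failure here: a typo in the `Resource` line that silently leaves the space unreadable.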
## Task 3: Set a CORS configuration
1. On your local workstation, create a file called `gaiahub-cors.xml` that looks like this:
```xml
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
<AllowedMethod>GET</AllowedMethod>
<AllowedMethod>HEAD</AllowedMethod>
<AllowedOrigin>*</AllowedOrigin>
<ExposeHeader>ETag</ExposeHeader>
<MaxAgeSeconds>0</MaxAgeSeconds>
</CORSRule>
</CORSConfiguration>
```
2. Use `s3cmd` to enact the configuration.
```bash
s3cmd setcors gaiahub-cors.xml s3://<SPACE_NAME>
```
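Before uploading the configuration, you can check that the XML is well-formed locally. A quick sanity check, assuming `python3` is available on your workstation, looks like this:

```bash
# Write the CORS rules file (same content as above), then parse it with
# Python's stdlib XML parser to catch typos before uploading with s3cmd.
cat > gaiahub-cors.xml <<'EOF'
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedOrigin>*</AllowedOrigin>
    <ExposeHeader>ETag</ExposeHeader>
    <MaxAgeSeconds>0</MaxAgeSeconds>
  </CORSRule>
</CORSConfiguration>
EOF

# The parse fails loudly on malformed XML; success prints a confirmation.
python3 -c "import xml.dom.minidom as m; m.parse('gaiahub-cors.xml'); print('gaiahub-cors.xml is well-formed')"
```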
## Task 4: Create a DigitalOcean droplet
In this task, you add a droplet to your account. A droplet is a cloud-based server you can use as a compute resource. This server is where you will run the Gaia Storage System service. The droplet you create will be an Ubuntu server with Docker pre-installed.
1. Log into DigitalOcean.
2. Go to your DigitalOcean dashboard.
![Dashboard](/images/digital-welcome.png)
3. Click the **Create > Droplets** button in the top right.
![Create option](/images/digital-droplet.png)
4. Select the **Marketplace** tab.
![Marketplace](/images/digital-one-click.png)
5. Select the **Docker** app from the options presented.
6. Scroll down to the **Choose a size** section and use the left arrow to display and select the **\$5/mo** image.
This size gives you plenty of memory and disk space to run a personal hub.
7. Scroll down to the **Choose a datacenter region** section.
~> Choose a region that is both geographically close to you and that supports spaces. Currently, <strong>San Francisco</strong>, <strong>New York</strong>, <strong>Amsterdam</strong>, and <strong>Singapore</strong> support spaces.
The geographical location of your server impacts latency for storing data. You select a region close to you so that when you use Stacks apps, storing data is quicker.
8. If you are using SSH, scroll to the **Add your SSH key** section and choose an SSH key to use. Otherwise, continue to the next step.
9. Scroll down to the **Finalize and create** section.
10. **Choose a hostname** for your droplet, such as `moxie-gaiahub`.
11. Review your choices then click **Create** to start your droplet running.
At this point, your new droplet should appear in the list of resources on your DigitalOcean dashboard.
## Task 5: Open a console on your Droplet
A droplet console emulates the access you would have if you were sitting down with a keyboard and monitor attached to the actual server. In this section, you open a console on your droplet.
-> If you are an SSH user and have completed the prerequisites, <strong>skip this section</strong>. Instead, use [the DigitalOcean instructions for connecting with doctl](https://do.co/2S4HMk1).
1. From the DigitalOcean dashboard, select Droplets.
You should see the droplet you just created.
2. Click on the droplet name to open the control panel.
![Droplet control panel](/images/droplet-control.png)
3. Choose **Access** from the control panel.
4. Select **Reset Root Password** to have DigitalOcean send you the root password.
DigitalOcean sends a temporary password to the email attached to your account. It takes a couple of minutes to reset the root password on your droplet.
5. Open your email and copy the password.
6. Switch back to the droplet control panel and choose **Launch Console**.
A new window with the console appears.
7. Enter `root` for the login.
8. Paste the password copied from your email.
The system displays a message telling you to change the `root` password.
![Droplet control panel](/images/droplet-control.png)
And prompts you for the current password.
9. Paste the password copied from your email again.
The system prompts you for a new password and asks you to enter it again.
10. Provide and confirm a new password.
The system logs you in and gives you a welcome message. At the conclusion of the message, you are at the console prompt.
```bash
Welcome to DigitalOcean's One-Click Docker Droplet.
To keep this Droplet secure, the UFW firewall is enabled.
All ports are BLOCKED except 22 (SSH), 2375 (Docker) and 2376 (Docker).
* The Docker One-Click Quickstart guide is available at: https://do.co/docker1804#start
* You can SSH to this Droplet in a terminal as root: ssh root@138.68.28.100
* Docker is installed and configured per Docker's recommendations: https://docs.docker.com/install/linux/docker-ce/ubuntu/
* Docker Compose is installed and configured per Docker's recommendations: https://docs.docker.com/compose/install/#install-compose
For help and more information, visit http://do.co/docker1804
To delete this message of the day: rm -rf /etc/update-motd.d/99-one-click
root@meepers:~#
```
<div class="uk-card uk-card-default uk-card-body">
<h5>Useful tips for the console</h5>
<p>If you run into problems using the console, see <a href="https://www.digitalocean.com/docs/droplets/resources/console/" target="\_blank">the notes on this page in the DigitalOcean documentation</a>.</p>
<p>If you find the output from <code>ls</code> difficult to read, try entering the following to change the console colors from the command line: <code>LS_COLORS="di=1;31"</code>. You can also edit your console <code>.bashrc</code> file permanently, of course.</p>
</div>
## Task 6: Create a space key
1. In the DigitalOcean dashboard, go to the **API** page.
2. Scroll to the **Spaces Access Keys** section.
3. Click **Generate New Key**.
The system prompts you to give the key a name.
4. Enter a name for the key.
It is helpful to choose a descriptive name, like `gaia-hub-key`.
5. Press the check mark.
The system creates your key and displays both the key and its secret.
![Access key](/images/space-access-key.png)
6. Save your secret in a secure password manager.
You should never share your secret.
7. Leave the page up with your key and secret and go to your open console.
## Task 7: Get the Gaia code and configure your server
You should have the console open as `root` on your Droplet. In this section, you get the Gaia code and configure the Gaia service.
1. Copy the Gaia code into your droplet using the `git clone` command.
```bash
root@meepers:~# git clone https://github.com/blockstack/gaia.git
```
Successful output from this command looks like the following.
```bash
Cloning into 'gaia'...
remote: Enumerating objects: 63, done.
remote: Counting objects: 100% (63/63), done.
remote: Compressing objects: 100% (46/46), done.
remote: Total 4206 (delta 27), reused 35 (delta 17), pack-reused 4143
Receiving objects: 100% (4206/4206), 17.40 MiB | 9.89 MiB/s, done.
Resolving deltas: 100% (2700/2700), done.
root@meepers:~#
```
This command creates a `gaia` subdirectory.
2. Change to the `hub` directory in the `gaia` code.
```bash
cd gaia/hub
```
3. Copy the configuration sample to a new `config.json` file.
```bash
cp config.do.sample.json config.json
```
4. Edit your new `config.json` file with `vi` or `vim`.
```bash
vi config.json
```
You now need to edit this JSON file to have it store files on your DigitalOcean space.
```json
{
"serverName": "DROPLET_NAME",
"port": 4000,
"driver": "aws",
"readURL": "SPACE_URL",
"proofsConfig": {
"proofsRequired": 0
},
"pageSize": 20,
"bucket": "SPACE_NAME",
"awsCredentials": {
"endpoint": "SPACE_LOCATION",
"accessKeyId": "YOUR_ACCESS_KEY",
"secretAccessKey": ""
},
"argsTransport": {
"level": "debug",
"handleExceptions": true,
"stringify": true,
"timestamp": true,
"colorize": false,
"json": true
}
}
```
You'll find that the `driver` is set to `aws`. The DigitalOcean space API exactly mimics the S3 API. Since Gaia doesn't have a DigitalOcean driver, you can just use the `aws` driver with some special configuration.
5. Set the `serverName` to the droplet you just created.
6. Set the `readURL` to the URL of the DigitalOcean space you just created.
If your space URL is `https://meepers-hub-space.sfo2.digitaloceanspaces.com`, the `readURL` value is `https://meepers-hub-space.sfo2.digitaloceanspaces.com`.
7. Set the `bucket` to the name of the DigitalOcean space you just created.
If your space is called `meepers-hub-space`, the `bucket` value is `meepers-hub-space`.
8. Go back to your DigitalOcean dashboard open to your space key.
9. Copy the **Key** and paste it into the `accessKeyId` value in the `config.json` file.
10. Copy the **Secret** and paste it into the `secretAccessKey` value in the `config.json` file.
11. In the DigitalOcean dashboard, choose the Spaces page.
12. Copy the section of your space URL that follows the name.
![Space endpoint](/images/space-endpoint.png)
In this example, you would copy the `sfo2.digitaloceanspaces.com` section.
13. Paste the string you copied into the `endpoint` value.
14. Ensure the `proofsRequired` value is set to the number `0` (zero).
This allows any Stacks user to write to your Gaia hub without social proofs. You can change this later on, and do other things to lock down this Gaia hub to just yourself, but that is outside the scope of this document.
At this point, the `config.json` file should be complete and appear similar to the following, but with your values.
```json
{
"serverName": "moxie-gaiahub",
"port": 4000,
"driver": "aws",
"readURL": "https://meepers-hub-space.sfo2.digitaloceanspaces.com",
"proofsConfig": {
"proofsRequired": 0
},
"pageSize": 20,
"bucket": "meepers-hub-space",
"awsCredentials": {
"endpoint": "sfo2.digitaloceanspaces.com",
"accessKeyId": "fb3J7AT/PGMGMPOA86EFLpx8IjGZQib99eXWjVR+QK0",
"secretAccessKey": "9ac685342eaa5bc4b44c13f3ecf43b001a3bdb9e2257114d44394d410dd91f66"
},
"argsTransport": {
"level": "debug",
"handleExceptions": true,
"stringify": true,
"timestamp": true,
"colorize": false,
"json": true
}
}
```
15. Save your config file and close the `vim` editor.
The system returns you back to the prompt.
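A malformed `config.json` is a common reason for a hub that fails to start. Before building the image, you can sanity-check the JSON with Python's standard library. The sketch below uses a throwaway example file with placeholder values; on the droplet, point `json.tool` at your real `gaia/hub/config.json` the same way:

```bash
# Throwaway example file with placeholder values; on the droplet, run
# json.tool against your real gaia/hub/config.json instead.
cat > config.json.example <<'EOF'
{
  "serverName": "moxie-gaiahub",
  "port": 4000,
  "driver": "aws",
  "bucket": "meepers-hub-space"
}
EOF

# json.tool exits non-zero and reports the offending line on invalid JSON.
python3 -m json.tool config.json.example > /dev/null && echo "valid JSON"
```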
## Task 8: Run the Gaia image with Docker
While your console is still in the `gaia/hub` folder, build the `gaia.hub` image.
1. Enter the following `docker` command at the console command line.
```bash
docker build -t gaia.hub .
```
This build uses the `Dockerfile` already in the `gaia/hub` folder. The output of the command is similar to the following:
```bash
....
npm WARN gaia-hub@2.3.4 No license field.
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: fsevents@1.2.4 (node_modules/fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for fsevents@1.2.4: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})
added 877 packages from 540 contributors and audited 3671 packages in 38.122s
found 0 vulnerabilities
Removing intermediate container b0aef024879f
---> 5fd126019708
Step 5/5 : CMD ["npm", "run", "start"]
---> Running in ae459cc0865b
Removing intermediate container ae459cc0865b
---> b1ced6c39784
Successfully built b1ced6c39784
Successfully tagged gaia.hub:latest
```
2. Run your Gaia hub image.
```bash
docker run --restart=always -v ~/gaia/hub/config.json:/src/hub/config.json -p 3000:3000 -e CONFIG_PATH=/src/hub/config.json gaia.hub
```
This runs your Gaia hub on port `3000`. If everything runs successfully, the last line of output from this command should be:
```bash
Successfully compiled 13 files with Babel.
{"level":"warn","message":"Listening on port 3000 in development mode","timestamp":"2019-01-23T16:35:05.216Z"}
```
3. If your command did run successfully, stop the service using the hotkey `ctrl-c`.
4. Run the image again with this new command.
```bash
docker run --restart=always -v ~/gaia/hub/config.json:/src/hub/config.json -p 3000:3000 -e CONFIG_PATH=/src/hub/config.json -d gaia.hub
```
This command includes the `-d` option to `docker run`. This runs Docker in detached mode, so that it runs in the background. You can run `docker ps` to see your running Docker images, and get the `id` of your Gaia server.
```bash
root@meepers:~/gaia/hub# docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
aeca7eea4a86 gaia.hub "npm run start" 11 seconds ago Up 10 seconds 0.0.0.0:3000->3000/tcp musing_payne
```
At this point, your Gaia service is up and running. You can run `docker logs MY_CONTAINER_ID` with your running image's ID to see the logs of this server at any time.
## Task 9: Set up an Nginx reverse proxy
In this task, you set up a simple Nginx reverse proxy to serve your Docker container through a public URL. You do this from the droplet console command line.
1. Install nginx into the droplet.
```bash
sudo apt-get install nginx
```
2. Enter `y` to confirm the installation.
3. Edit the nginx default configuration file.
```bash
vi /etc/nginx/sites-available/default
```
4. Inside the `location /` block (line 48), enter the following configuration:
```nginx
location / {
proxy_pass http://localhost:3000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
if ($request_method = 'OPTIONS') {
more_set_headers 'Access-Control-Allow-Origin: *';
more_set_headers 'Access-Control-Allow-Methods: GET, POST, OPTIONS, DELETE';
more_set_headers 'Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, authorization, If-Match';
more_set_headers 'Access-Control-Max-Age: 21600';
more_set_headers 'Content-Type: text/plain charset=UTF-8';
more_set_headers 'Content-Length: 0';
return 204;
}
more_set_headers 'Access-Control-Allow-Origin: *';
}
```
This simple configuration passes all requests through to your Gaia hub running at port `3000`.
5. Save and close the file.
6. Run `nginx -t` to make sure you have no syntax errors.
```bash
root@meepers:~/gaia/hub# nginx -t
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
```
7. Restart `nginx` with your new configuration.
```bash
systemctl restart nginx
```
8. Allow access to your Gaia hub by exposing port 80 to the public.
```bash
ufw allow 80
```
## Task 10: Test your Gaia server
Now, you are ready to test your Gaia server and make sure it is up and running.
1. Click on **Droplets** in the sidebar.
2. Find your Droplet running Gaia.
![Droplet IP](/images/space-endpoint.png)
3. Copy the IP address for it.
4. In your browser, visit the page `MY_DROPLET_IP/hub_info`.
You should see a response from your Gaia hub.
![Hub test](/images/hub-running.png)
The `read_url_prefix` should be the combination of the bucket and endpoint created
in your `config.json` file, for example,
`https://meepers-hub-space.s3.amazonaws.com/`.
## Task 11: Configure a domain name
At this point, you can point a domain to your Gaia hub. Although it's not required, it is highly recommended. If you use a domain, you can migrate your Droplet to a different server (or even provider such as Azure or AWS) at any time, and still access it through the domain URL. Simply point your domain at the IP address for your Droplet. Use an `A Record` DNS type.
These instructions assume you have already created a free <a href="https://www.freenom.com" target="\_blank">domain through the freenom service</a>. To point this freenom domain to your Gaia hub, do the following:
1. Log into your freenom account.
2. Choose the **Manage Freenom Domain** tab.
3. Add an **A** record and leave the **Name** field blank.
This record points your entire domain to the hub IP.
4. Save your changes.
5. Create a CNAME record.
For example, you can use the prefix `www` with your domain name.
6. Save your changes.
At this point, your DNS management should look similar to the following except that with your domain rather than the `maryhub.ga` domain.
![DNS fields](/images/dns-fields.png)
7. After your changes propagate, visit your new domain at the `hub_info` page.
![Domain test](/images/domain-test.png)
## Task 12: Set up SSL
If you've configured a domain to point to your Gaia hub, then it's highly
recommended that you set up SSL to connect to your hub securely. DigitalOcean
provides a tutorial on <a
href="https://www.digitalocean.com/community/tutorials/how-to-secure-nginx-with-let-s-encrypt-on-ubuntu-18-04"
target="\_blank">how to set up SSL</a>. Follow those instructions to set up SSL.
When completed, you'll be able to visit `https://mygaiadomain.com/hub_info`
securely.

497
src/pages/storage-hubs/gaia-admin.md

@ -1,497 +0,0 @@
---
title: Gaia admin
description: 'Storing user data with Stacks'
---
## Introduction
A Gaia hub can run a simple administrative service co-located with it. This service allows you to
administer the Gaia hub with the help of an API key. Gaia hubs installed using the Gaia Amazon Machine Image (AMI) have
this service integrated automatically.
In this section, you learn how to use the Gaia administrator service with your Gaia hub.
-> The examples in this section assume that you installed the Gaia service using the
[Deploy on Amazon EC2](/storage-hubs/amazon-ec2-deploy) tutorial.
## Understand the configuration files
The administrator service relies on two configuration files, the hub's configuration and the configuration of the
administrator service itself. You can find the hub's configuration in `/tmp/hub-config/config.json` in the
`docker_admin_1` container. Your EC2 instance has the administrator service configuration in the
`/gaia/docker/admin-config/config.json` file.
The administrator service requires the following information:
- The location of the Gaia hub configuration file
- The API keys to use when authenticating administrative requests
- The commands to run to restart the Gaia hub on a config change
The following is the standard administrator service config installed with your EC2 instance.
```json
{
"argsTransport": {
"level": "debug",
"handleExceptions": true,
"timestamp": true,
"stringify": true,
"colorize": true,
"json": true
},
"port": 8009,
"apiKeys": ["hello"],
"gaiaSettings": {
"configPath": "/tmp/hub-config/config.json"
},
"reloadSettings": {
"command": "/bin/sh",
"argv": ["-c", "docker restart gaia_hub_1 &"],
"env": {},
"setuid": 1000,
"setgid": 1000
}
}
```
The `port` is the port the administrator service listens on. The `apiKeys` field lists the keys used for making calls to the hub. The
`gaiaSettings` field specifies the location of the Gaia hub configuration file.
The `argsTransport` section configures the hub logging. The service uses the `winston` logging service. Refer to their
documentation for full details on the [logging configuration options](https://github.com/winstonjs/winston).
| Field | Description |
| --------------- | ------------------------------------------------------------------------ |
| level | Lowest level this transport logs (default: `info`) |
| handleExceptions | Set to `true` to have this transport handle exceptions (default: `false`) |
| timestamp | Adds a timestamp to each message |
| stringify | Converts the output to a JSON string |
| colorize | Colorizes the standard logging level |
| json | Logs in JSON format |
The `reloadSettings` section configures the command used to reload your Gaia hub.
| Field | Description |
| ------- | ------------------------------------------------------------------------------------------------------------------- |
| command | A command which reloads the Gaia hub service. |
| argv | An array containing the command arguments. |
| env | This is a key/value list of any environment variables that need to be set for the command to run. This is optional. |
| setuid | This is the UID under which the command runs. This is optional. |
| setgid | This is the GID under which the command runs. This is optional. |
-> Review the [JSON config schema](#optional-understand-the-configuration-file-schema) for a list of all
available parameters and possible values.
## Using the administrator service APIs
You use the administrator service APIs to manage the hub. Administering a hub requires that you make calls from a
terminal on your hub server. To execute administrator functions on a Gaia hub created with AWS, you `ssh` into
your instance as follows:
```bash
ssh -i <your keyfile.pem> admin@<public_ip_address>
```
You must also set the `API_KEY` in your environment:
```bash
export API_KEY="<API_KEY>"
```
You may find it useful to install a JSON processor such as `jq` to process the
output of the administrator commands.
### Restart the Gaia Hub (`POST /v1/admin/reload`)
The administrator service can make changes to the Gaia hub's config file, but the changes only take effect when the Gaia
hub reboots. You can do this as follows:
```bash
export API_KEY="hello"
curl -H "Authorization: bearer $API_KEY" -X POST http://localhost:8009/v1/admin/reload
```
```json
{ "result": "OK" }
```
When you `POST` to this endpoint, the administrator service runs the command described in the `reloadSettings` section
of the config file. It attempts to spawn a subprocess from the given `reloadSettings.command` binary, and pass it the
arguments given in `reloadSettings.argv`. Note that the subprocess doesn't run in a user-accessible shell.
#### Errors
If you don't supply a valid API key, this method fails with HTTP 403.
This endpoint returns HTTP 500 if the reload command fails. If this happens, the return value contains the command's
exit code and the signal that killed it.
### Get the hub configuration (`GET /v1/admin/config`)
This endpoint reads and writes a Gaia hub's non-driver-related settings. These include the port it listens on and
its proof-checking settings.
To read the Gaia hub settings, you would run the following:
```bash
export API_KEY="hello"
curl -H "Authorization: bearer $API_KEY" http://localhost:8009/v1/admin/config
```
```json
{ "config": { "port": 4000, "proofsConfig": { "proofsRequired": 0 } } }
```
### Set the hub configuration (`POST /v1/admin/config`)
To set Gaia hub settings, `POST` the changed JSON fields to this endpoint.
```bash
export API_KEY="hello"
curl -H "Authorization: bearer $API_KEY" -H 'Content-Type: application/json' -X POST --data-raw '{"port": 3001}' http://localhost:8009/v1/admin/config
```
```json
{ "message": "Config updated -- you should reload your Gaia hub now." }
```
If the settings were successfully applied, the method returns a message to reload your Gaia hub. You can set multiple
settings with a single call. For example, you can set:
- The driver to use (`driver`)
- The Gaia's read URL endpoint (`readURL`)
- The number of items to return when listing files (`pageSize`)
- The driver-specific settings
The data accepted on `POST` must contain a valid Hub configuration, for example:
```js
const GAIA_CONFIG_SCHEMA = {
type: "object",
properties: {
validHubUrls: {
type: "array",
items: { type: "string", pattern: "^http://.+|https://.+$" },
},
requireCorrectHubUrl: { type: "boolean" },
serverName: { type: "string", pattern: ".+" },
port: { type: "integer", minimum: 1024, maximum: 65534 },
proofsConfig: { type: "integer", minimum: 0 },
whitelist: {
type: "array",
items: {
type: "string",
pattern: "^[123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz]+$"
}
},
driver: { type: "string", pattern: ".+" },
readURL: { type: "string", pattern: "^http://.+$|https://.+$" },
pageSize: { type: "integer", minimum: 1 },
bucket: { type: "string", pattern: ".+" },
cacheControl: { type: "string", pattern: ".+" },
azCredentials: {
accountName: { type: "string", pattern: ".+" },
accountKey: { type: "string", pattern: ".+" },
},
diskSettings: {
storageRootDirectory: { type: "string" }
},
gcCredentials: {
email: { type: "string" },
projectId: { type: "string" },
keyFilename: { type: "string" },
credentials: {
type: "object",
properties: {
client_email: { type: "string" },
private_key: { type: "string" }
}
},
},
awsCredentials: {
accessKeyId: { type: "string" },
secretAccessKey: { type: "string" },
sessionToken: { type: "string" }
}
}
}
```
A `GET` request returns a `config` object containing the same fields.
#### Errors
If you don't supply a valid API key, both the `GET` and `POST` methods fail with HTTP 403.
In general, you should only set the relevant Gaia hub config fields. If you `POST` invalid settings values, you get an
HTTP 400 error.
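Since invalid values produce an HTTP 400, a client-side sanity check can catch obvious mistakes before you `POST`. A minimal sketch mirroring the schema's `port` bounds (1024-65534); the function name is illustrative, not part of the admin API:

```shell
# Hypothetical pre-flight check mirroring the schema's port constraints.
valid_port() {
  case "$1" in
    ''|*[!0-9]*) return 1 ;;  # reject empty or non-numeric input
  esac
  [ "$1" -ge 1024 ] && [ "$1" -le 65534 ]
}

valid_port 3001 && echo "ok"         # within range, prints "ok"
valid_port 80 || echo "out of range" # below the schema minimum
```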
## Example: Read and write driver settings
Use the `/v1/admin/config` endpoint to read and write storage driver settings. To get the current driver settings, run
the following commands in a terminal:
```bash
export API_KEY="hello"
curl -H "Authorization: bearer $API_KEY" http://localhost:8009/v1/admin/config
```
```json
{
"config": {
"driver": "disk",
"readURL": "http://localhost:4001/",
"pageSize": 20,
"diskSettings": { "storageRootDirectory": "/tmp/gaia-disk" }
}
}
```
To update the driver settings, run the following commands in a terminal:
```bash
export API_KEY="hello"
export AWS_ACCESS_KEY="<hidden>"
export AWS_SECRET_KEY="<hidden>"
curl -H "Authorization: bearer $API_KEY" -H 'Content-Type: application/json' -X POST --data-raw "{\"driver\": \"aws\", \"awsCredentials\": {\"accessKeyId\": \"$AWS_ACCESS_KEY\", \"secretAccessKey\": \"$AWS_SECRET_KEY\"}}" http://localhost:8009/v1/admin/config
```
```json
{ "message": "Config updated -- you should reload your Gaia hub now." }
```
## Example: Read and write the whitelist
This endpoint lets you read and write the `whitelist` section of a Gaia hub, to control who can write to it and list its files.
To get the current whitelist, run the following commands in a terminal:
```bash
export API_KEY="hello"
curl -H "Authorization: bearer $API_KEY" http://localhost:8009/v1/admin/config
```
```json
{ "config": { "whitelist": ["15hUKXg1URbQsmaEHKFV2vP9kCeCsT8gUu"] } }
```
To set the whitelist, you must set the _entire_ whitelist. To set the list, run the following command in a terminal:
```bash
export API_KEY="hello"
curl -H "Authorization: bearer $API_KEY" -H 'Content-Type: application/json' -X POST --data-raw '{"whitelist": ["1KDcaHsYJqD7pwHtpDn6sujCVQCY2e1ktw", "15hUKXg1URbQsmaEHKFV2vP9kCeCsT8gUu"]}' http://localhost:8009/v1/admin/config
```
```json
{ "message": "Config updated -- you should reload your Gaia hub now." }
```
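Because the whole list must be sent each time, it can help to build the payload from a list of addresses. A minimal sketch (the helper name is hypothetical; the addresses are the examples above):

```shell
# Hypothetical helper: build the full whitelist JSON payload from addresses.
build_whitelist_payload() {
  local items="" addr
  for addr in "$@"; do
    items="${items:+$items, }\"$addr\""
  done
  printf '{"whitelist": [%s]}' "$items"
}

build_whitelist_payload \
  1KDcaHsYJqD7pwHtpDn6sujCVQCY2e1ktw \
  15hUKXg1URbQsmaEHKFV2vP9kCeCsT8gUu
# {"whitelist": ["1KDcaHsYJqD7pwHtpDn6sujCVQCY2e1ktw", "15hUKXg1URbQsmaEHKFV2vP9kCeCsT8gUu"]}
```

The output can then be passed to `curl` as the `--data-raw` argument.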
## View logs for the hub or administrator service
The Docker container for each Gaia service contains the logs for that service. To view the log for a particular service,
use the `docker logs` command. For example, to get the logs for the hub:
```bash
docker logs docker_hub_1
```
```bash
> gaia-hub@2.3.4 start /src/hub
> npm run build && node lib/index.js
> gaia-hub@2.3.4 build /src/hub
> npm run lint && babel src -d lib && chmod +x lib/index.js
> gaia-hub@2.3.4 lint /src/hub
> eslint src
Successfully compiled 13 files with Babel.
{"level":"warn","message":"Listening on port 3000 in development mode","timestamp":"2019-02-14T04:00:06.071Z"}
```
## Optional: Understand the configuration file schema
The following JSON schema details the possible parameters for a hub configuration:
```json
{
"$schema": "http://json-schema.org/draft-07/schema#",
"additionalProperties": false,
"properties": {
"argsTransport": {
"additionalProperties": false,
"properties": {
"colorize": {
"default": true,
"type": "boolean"
},
"handleExceptions": {
"default": true,
"type": "boolean"
},
"json": {
"default": false,
"type": "boolean"
},
"level": {
"default": "warn",
"enum": ["debug", "error", "info", "verbose", "warn"],
"type": "string"
},
"timestamp": {
"default": true,
"type": "boolean"
}
},
"type": "object"
},
"authTimestampCacheSize": {
"default": 50000,
"type": "integer"
},
"awsCredentials": {
"additionalProperties": false,
"description": "Required if `driver` is `aws`",
"properties": {
"accessKeyId": {
"type": "string"
},
"endpoint": {
"type": "string"
},
"secretAccessKey": {
"type": "string"
},
"sessionToken": {
"type": "string"
}
},
"type": "object"
},
"azCredentials": {
"additionalProperties": false,
"description": "Required if `driver` is `azure`",
"properties": {
"accountKey": {
"type": "string"
},
"accountName": {
"type": "string"
}
},
"type": "object"
},
"bucket": {
"default": "hub",
"type": "string"
},
"cacheControl": {
"default": "public, max-age=1",
"type": "string"
},
"diskSettings": {
"additionalProperties": false,
"description": "Required if `driver` is `disk`",
"properties": {
"storageRootDirectory": {
"type": "string"
}
},
"type": "object"
},
"driver": {
"enum": ["aws", "azure", "disk", "google-cloud"],
"type": "string"
},
"gcCredentials": {
"additionalProperties": false,
"description": "Required if `driver` is `google-cloud`",
"properties": {
"credentials": {
"additionalProperties": false,
"properties": {
"client_email": {
"type": "string"
},
"private_key": {
"type": "string"
}
},
"type": "object"
},
"email": {
"type": "string"
},
"keyFilename": {
"type": "string"
},
"projectId": {
"type": "string"
}
},
"type": "object"
},
"maxFileUploadSize": {
"default": 20,
"description": "The maximum allowed POST body size in megabytes. \nThe content-size header is checked, and the POST body stream \nis monitored while streaming from the client. \n[Recommended] Minimum 100KB (or approximately 0.1 MB)",
"minimum": 0.1,
"type": "number"
},
"pageSize": {
"default": 100,
"maximum": 4096,
"minimum": 1,
"type": "integer"
},
"port": {
"default": 3000,
"maximum": 65535,
"minimum": 0,
"type": "integer"
},
"proofsConfig": {
"additionalProperties": false,
"properties": {
"proofsRequired": {
"default": 0,
"type": "integer"
}
},
"type": "object"
},
"readURL": {
"type": "string"
},
"requireCorrectHubUrl": {
"default": false,
"type": "boolean"
},
"serverName": {
"default": "gaia-0",
"description": "Domain name used for auth/signing challenges. \nIf `requireCorrectHubUrl` is true then this must match the hub url in an auth payload.",
"type": "string"
},
"validHubUrls": {
"description": "If `requireCorrectHubUrl` is true then the hub specified in an auth payload can also be\ncontained within this array.",
"items": {
"type": "string"
},
"type": "array"
},
"whitelist": {
"description": "List of ID addresses allowed to use this hub. Specifying this makes the hub private \nand only accessible to the specified addresses. Leaving this unspecified makes the hub \npublicly usable by any ID.",
"items": {
"type": "string"
},
"type": "array"
}
},
"required": ["driver", "port"],
"type": "object"
}
```
-> A full list of examples is available in [the Gaia repository on GitHub](https://github.com/blockstack/gaia/tree/master/hub)

16
src/pages/storage-hubs/overview.md

@ -1,16 +0,0 @@
---
title: Storage hubs overview
description: Securely store application and user data off-chain
---
## Introduction
The Gaia storage system allows you to store private application data off the blockchain and still access it securely
with Stacks applications. Where possible, applications should only store critical transactional metadata directly to
the Stacks blockchain, while keeping application and user data in the Gaia storage system. For more information about
the Gaia storage system, see the [Gaia protocol reference](/build-apps/references/gaia).
A [Gaia hub](/build-apps/references/gaia#user-control-or-how-is-gaia-decentralized) consists of a service and a storage
resource, generally hosted on the same cloud compute provider. The hub service requires an authentication token from a
storage requestor, and writes key-value pairs to the associated storage resource. Storage requestors can choose a Gaia
hub provider. This documentation provides an overview of how to set up and operate a Gaia hub.

11
src/pages/understand-stacks/accounts.md

@ -13,14 +13,11 @@ Stacks 2.0 accounts are entities that own assets, like Stacks (STX) tokens. An a
If you want to jump right in to generate and query a new account, try this tutorial:
[@page-reference | inline]
| /understand-stacks/managing-accounts
-> The public-key signature system used for Stacks 2.0 accounts is [Ed25519](https://ed25519.cr.yp.to/).
Assets cannot leave an account without an action from the account owner. All changes to assets (and the balances of the account) require a corresponding transaction.
-> The [transaction type](/understand-stacks/transactions#types) doesn't need to be a token transfer - contract deploy and contract call transactions can change the balances of an account
-> The transaction type doesn't need to be a token transfer - contract deploy and contract call transactions can change the balances of an account
## Creation
@ -53,13 +50,13 @@ stx make_keychain -t > cli_keychain.json
}
```
-> Check out the [Stacks CLI reference](/references/stacks-cli) for more details
-> Check out the [Stacks CLI reference](https://docs.hiro.so/references/stacks-cli) for more details
| Field | Description |
| -------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `mnemonic` | A 24-word seed phrase used to access the account, generated using [BIP39](https://github.com/bitcoin/bips/blob/master/bip-0039.mediawiki) with 256 bits of entropy |
| `keyInfo.address` | Stacks address for the account |
| `keyInfo.privateKey` | Private key for the account. Required for [token transfers](/understand-stacks/transactions#stacks-token-transfer) and often referred to as `senderKey` |
| `keyInfo.privateKey` | Private key for the account. Required for token transfers and often referred to as `senderKey` |
| `keyInfo.index` | Nonce for the account, starting at 0 |
| `keyInfo.btcAddress` | Corresponding BTC address for the account. A construct from the previous blockchain (Stacks 1.0) and currently unused. |
@ -212,5 +209,3 @@ Sample response:
]
}
```
-> Read more about [pagination](/understand-stacks/transactions#pagination) to iterate through the entire result set of the asset events

131
src/pages/understand-stacks/command-line-interface.md

@ -1,131 +0,0 @@
---
title: Stacks CLI
description: Learn more about the Stacks CLI capabilities
---
The Stacks CLI enables interactions with the Stacks 2.0 blockchain through a set of commands. At the current stage, the CLI is intended for developer experimentation on the testnet only.
## Installation
First, ensure you have `npm` installed. Next, run the following command in your terminal:
`npm install -g @stacks/cli`
-> The `-g` flag makes the CLI commands available globally
## Network selection
By default, the CLI will attempt to interact with the mainnet of the Stacks 2.0 blockchain. However, it is possible to override the network and set it to the testnet:
```bash
stx <command> -t
```
-> For account usage, that means addresses generated will _only_ be available for the specific network. An account generated for the testnet cannot be used on the mainnet.
By default, using the `-t` flag causes the CLI to connect to the testnet node at `http://stacks-node-api.blockstack.org:20443`. To specify a node to connect to, add the `-I` flag followed by the URL of the node:
```bash
stx <command> -I "http://localhost:20443"
```
## Account
This section describes how to use the CLI to manage an account.
~> It is not recommended to use the CLI to handle accounts with real STX tokens on the mainnet. Use an appropriate wallet built to support secure token holding instead.
### Creating an account
You can generate a new account for testnet by using the `make_keychain` command with the `-t` option:
```bash
stx make_keychain -t
```
Your response should look like this:
```json
{
"mnemonic": "private unhappy random runway boil scissors remove harvest fatigue inherit inquiry still before mountain pet tail mad accuse second milk client rebuild salt chase",
"keyInfo": {
"privateKey": "381314da39a45f43f45ffd33b5d8767d1a38db0da71fea50ed9508e048765cf301",
"address": "ST1BG7MHW2R524WMF7X8PGG3V45ZN040EB9EW0GQJ",
"btcAddress": "n4X37UmRZYk9HawtS1w4xRtqJWhByxiz3c",
"index": 0
}
}
```
The mnemonic is your 24-word seed phrase, which you should back up securely if you want access to this account in the future. Once lost, it cannot be recovered.
The Stacks address associated with the newly generated account is:
`ST1BG7MHW2R524WMF7X8PGG3V45ZN040EB9EW0GQJ`
-> This is a testnet address for use with the testnet only
It is best to store the response of the CLI somewhere. You will need the private key, for instance, to send tokens to others.
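If you saved the response to a file (for example, `stx make_keychain -t > keychain.json`), the fields can be pulled back out with standard tools. A sketch using `sed` with the sample values from above (`jq` works too, if installed):

```shell
# Sketch: write a sample keychain (values from the example above) and
# extract the address with sed.
cat > keychain.json <<'EOF'
{
  "keyInfo": {
    "privateKey": "381314da39a45f43f45ffd33b5d8767d1a38db0da71fea50ed9508e048765cf301",
    "address": "ST1BG7MHW2R524WMF7X8PGG3V45ZN040EB9EW0GQJ"
  }
}
EOF

ADDRESS=$(sed -n 's/.*"address": "\([^"]*\)".*/\1/p' keychain.json)
echo "$ADDRESS"   # ST1BG7MHW2R524WMF7X8PGG3V45ZN040EB9EW0GQJ
```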
### Checking balance
You can check the balance of your account using the following command:
```bash
stx balance ST1BG7MHW2R524WMF7X8PGG3V45ZN040EB9EW0GQJ -t
```
The response will look like this:
```json
{
"balance": "10000",
"nonce": 0
}
```
-> To receive testnet STX tokens, please use the [faucet](https://explorer.stacks.co/sandbox/faucet)
Take note that the nonce for the account is `0`. This number is important for transaction broadcasting.
## Transactions
This section describes how to use the CLI to generate and broadcast transactions.
### Sending Tokens
In order to send tokens, the CLI command requires 5 parameters:
- **Recipient Address**: The Stacks address of the recipient
- **Amount**: The number of Stacks to send denoted in microstacks (1 STX = 1000000 microstacks)
- **Fee Rate**: The transaction fee rate for this transaction. You can safely set a fee rate of 200 for Testnet
- **Nonce**: The nonce is a number that needs to be incremented monotonically for each transaction from the account. This ensures transactions are not duplicated
- **Private Key**: The private key corresponding to your account, generated when you created the keychain
The CLI command to use with these parameters is `send_tokens`:
```bash
stx send_tokens ST2KMMVJAB00W5Z6XWTFPH6B13JE9RJ2DCSHYX0S7 1000 200 0 381314da39a45f43f45ffd33b5d8767d1a38db0da71fea50ed9508e048765cf301 -t
```
```json
{
"txid": "0xd32de0d66b4a07e0d7eeca320c37a10111c8c703315e79e17df76de6950c622c",
"transaction": "https://explorer.stacks.co/txid/0xd32de0d66b4a07e0d7eeca320c37a10111c8c703315e79e17df76de6950c622c"
}
```
With this command we’re sending 1000 microstacks to the Stacks address `ST2KMMVJAB00W5Z6XWTFPH6B13JE9RJ2DCSHYX0S7`.
We set the fee rate to `200` microstacks. If you're not sure how much your transaction will cost, you can estimate it first.
-> You can add the `-e` flag to estimate the transaction fee needed to get processed by the network, without broadcasting your transaction.
The nonce is set to `0` for this transaction, since it will be the first transaction we send from this account. For subsequent transactions, you will need to increment this number by `1` each time. You can check the current nonce for the account using the `balance` command.
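The microstack math and nonce bookkeeping above can be sketched with shell arithmetic (the function names are illustrative, not CLI commands):

```shell
# Illustrative helpers for the amount and nonce rules described above.
stx_to_ustx() {
  echo $(( $1 * 1000000 ))   # 1 STX = 1,000,000 microstacks
}
next_nonce() {
  echo $(( $1 + 1 ))         # each new transaction increments the nonce by 1
}

stx_to_ustx 1    # 1000000
next_nonce 0     # 1
```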
Finally, the last parameter is the private key for the account: `381314da39a45f43f45ffd33b5d8767d1a38db0da71fea50ed9508e048765cf301`.
Once again, we’re using the `-t` option to indicate that this is a Testnet transaction, so it should be broadcast to Testnet.
If valid, the transaction will be broadcast to the network and the command will respond with a transaction ID.
-> To obtain the raw, serialized transaction payload without broadcasting it, you can add the `-x` flag

231
src/pages/understand-stacks/local-development.md

@ -1,231 +0,0 @@
---
title: Local development
description: Set up and run a mocknet with docker
icon: TestnetIcon
images:
large: /images/pages/testnet.svg
sm: /images/pages/testnet-sm.svg
---
## Introduction
This guide helps you understand how to set up and run a mocknet for local development with Docker.
## Requirements
- [Docker](https://docs.docker.com/get-docker/)
- [docker-compose](https://github.com/docker/compose/releases/) >= `1.27.4`
- [git](https://git-scm.com/downloads)
- [`jq` binary](https://stedolan.github.io/jq/download/)
## Quickstart
1. Clone the repo locally:
```bash
git clone https://github.com/blockstack/stacks-local-dev ./stacks-local-dev && cd ./stacks-local-dev
```
2. Copy sample.env to .env:
```bash
cp sample.env .env
```
3. Start the Mocknet:
```bash
./manage.sh mocknet up
```
4. Stop the Mocknet:
```bash
./manage.sh mocknet down
```
## Env Vars
All variables used in the [`.env`](https://github.com/blockstack/stacks-local-dev/blob/master/.env) file can be modified, but generally most of them should be left as-is.
### Locally opened ports
In this section of the [`.env`](https://github.com/blockstack/stacks-local-dev/blob/master/.env) file, the values can be modified to change the ports opened locally by docker.
Currently, default port values are used - but if you have a running service on any of the defined ports, they can be adjusted to any locally available port.
ex:
```bash
POSTGRES_PORT_LOCAL=5432
EXPLORER_PORT_LOCAL=3000
```
Can be adjusted to:
```bash
POSTGRES_PORT_LOCAL=5433
EXPLORER_PORT_LOCAL=3001
```
Docker will still use the default ports _internally_ - this modification will only affect how the **host** OS accesses the services.
For example, to access postgres (using the **new** port `5433`) after running `./manage.sh mocknet up`:
```bash
export PGPASSWORD='postgres' && psql --host localhost -p 5433 -U postgres -d stacks_node_api
```
### System Resources
All sections in the [`.env`](https://github.com/blockstack/stacks-local-dev/blob/master/.env) file have specific CPU/MEM values, and can be adjusted to work in your local environment.
The variables take the form of `xxxx_CPU` and `xxxx_MEM`.
ex:
```bash
STACKS_MINER_CPU=0.3
STACKS_MINER_MEM=128M
STACKS_FOLLOWER_CPU=0.3
STACKS_FOLLOWER_MEM=128M
```
### Bitcoin
Mocknet does not require any Bitcoin settings.
### Postgres
The default password is easy to guess, and a port to Postgres is opened locally.
This password is defined in the file [./postgres/stacks-node-api.sql](https://github.com/blockstack/stacks-local-dev/blob/master/postgres/stacks-node-api.sql#L1)
If you update this value to something other than `postgres`, you'll also have to adjust the value in the [`.env`](https://github.com/blockstack/stacks-local-dev/blob/master/.env) file, as the mocknet API uses it:
```bash
POSTGRES_PASSWORD=postgres
```
## Running a local mocknet
### Install/Update docker-compose
First, check if you have `docker-compose` installed locally:
```bash
docker-compose --version
docker-compose version 1.27.4, build 40524192
```
If the command is not found, or the version is < `1.27.4`, run the following to install the latest to `/usr/local/bin/docker-compose`:
```bash
VERSION=$(curl --silent https://api.github.com/repos/docker/compose/releases/latest | jq .name -r)
DESTINATION=/usr/local/bin/docker-compose
sudo curl -L https://github.com/docker/compose/releases/download/${VERSION}/docker-compose-$(uname -s)-$(uname -m) -o $DESTINATION
sudo chmod 755 $DESTINATION
```
### Ensure all images are up to date
You can run the following at anytime to ensure the local images are up to date:
```bash
./manage.sh mocknet pull
```
### Services Running in Mocknet
**Mocknet service names**:
- follower
- api
- postgres
**Docker container names**:
- mocknet_stacks-node-follower
- mocknet_stacks-node-api
- mocknet_postgres
#### Starting Mocknet Services
Start all services:
```bash
./manage.sh mocknet up
```
#### Stopping Mocknet Services
Stop all services:
```bash
./manage.sh mocknet down
```
Or restart:
```bash
./manage.sh mocknet restart
```
#### Retrieving Mocknet logs
Tail logs with docker-compose:
```bash
./manage.sh mocknet logs
```
## Accessing the services
**stacks-node-follower**:
- Ports `20443-20444` are **only** exposed to the `mocknet` docker network.
**stacks-node-api**:
- Ports `3700, 3999` are exposed to `localhost`
```bash
curl localhost:3999/v2/info | jq
```
**postgres**:
- Port `5432` is exposed to `localhost` (PGPASSWORD is defined in [`.env`](https://github.com/blockstack/stacks-local-dev/blob/master/.env))
```bash
export PGPASSWORD='postgres' && psql --host localhost -p 5432 -U postgres -d stacks_node_api
```
## Potential issues
### Port already in use
If you have a port conflict, typically this means you already have a process using that same port.
To resolve, find the port you have in use (for example, `5432`) and edit the [`.env`](https://github.com/blockstack/stacks-local-dev/blob/master/.env) file to use a new port:
```bash
netstat -anl | grep 5432
tcp46 0 0 *.5432 *.* LISTEN
```
### Containers not starting
Occasionally, docker can get **stuck** and not allow new containers to start. If this happens, simply restart your docker daemon and try again.
### BNS username not found
The mocknet is launched without importing Stacks 1.0 names; only the test genesis chain state is imported. To change that, comment out the corresponding line in `stacks-node-follower/Config.toml.template` like this:
```
# use_test_genesis_chainstate = true
```
### panic on launch
Verify that the path of the config file is correct in the `.env` file, in particular on Windows OS the slash (`/`) in path names can cause errors.

250
src/pages/understand-stacks/managing-accounts.md

@ -1,250 +0,0 @@
---
title: Managing accounts
description: Learn how to generate and review accounts
icon: TestnetIcon
duration: 15 minutes
experience: beginners
tags:
- tutorial
images:
large: /images/pages/testnet.svg
sm: /images/pages/testnet-sm.svg
---
## Introduction
This tutorial will walk you through the following steps:
- Generating an account
- Reviewing account info
- Reviewing account history
- Getting account balances
-> This tutorial is NodeJS-specific. If you would like to understand how to manage Stacks 2.0 accounts using a different language/framework, please [review the accounts guide](/understand-stacks/accounts).
## Requirements
You will need [NodeJS](https://nodejs.org/en/download/) `8.12.0` or higher to complete this tutorial. You can verify your installation by opening your terminal and running the following command:
```bash
node --version
```
## Step 1: Installing libraries
First, install all the required libraries:
```bash
npm install --save @stacks/transactions @stacks/blockchain-api-client cross-fetch
```
-> The API client is generated from the [OpenAPI specification](https://github.com/blockstack/stacks-blockchain-api/blob/master/docs/openapi.yaml) ([openapi-generator](https://github.com/OpenAPITools/openapi-generator)). Many other languages and frameworks are supported by the generator.
## Step 2: Generating an account
To get started, let's generate a new, random Stacks 2.0 private key:
```js
const { fetch } = require('cross-fetch');
const {
makeRandomPrivKey,
privateKeyToString,
getAddressFromPrivateKey,
TransactionVersion,
} = require('@stacks/transactions');
const { AccountsApi, FaucetsApi, Configuration } = require('@stacks/blockchain-api-client');
const apiConfig = new Configuration({
fetchApi: fetch,
// for mainnet, replace `testnet` with `mainnet`
basePath: 'https://stacks-node-api.testnet.stacks.co',
});
const privateKey = makeRandomPrivKey();
```
-> Note: The code above also imports methods required for the next steps, including API configuration for the client library usage.
## Step 3: Reviewing account info
With the private key, you can review account details. First, we need to derive the Stacks address from the private key. Then, we can use the `AccountsApi` class to get the account details:
```js
const stacksAddress = getAddressFromPrivateKey(
privateKeyToString(privateKey),
TransactionVersion.Testnet // remove for Mainnet addresses
);
const accounts = new AccountsApi(apiConfig);
async function getAccountInfo() {
const accountInfo = await accounts.getAccountInfo({
principal: stacksAddress,
});
return accountInfo;
}
```
-> Note: A "principal" is any entity that can have a token balance. Find more details in the [Principals guide](/write-smart-contracts/principals).
The API will respond with a balance, nonce (starting at zero), and respective proofs:
```js
{
balance: '0x00000000000000000000000000000000',
nonce: 0,
balance_proof: '',
nonce_proof: ''
}
```
The `balance` property represents the Stacks token balance as a hex-encoded string of an unsigned 128-bit integer (big-endian). This format is not easy to consume directly. To simplify that, and to obtain all balances for all tokens (Stacks/STX, fungible, and non-fungible), check out [step 5](#step-5-getting-account-balances).
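For quick inspection, a hex-encoded balance that fits in 64 bits can be decoded with shell arithmetic (this is only a sketch: values above 2^63-1 overflow, so use the balances endpoint for real work):

```shell
# Sketch: decode a small hex-encoded balance with shell arithmetic.
# 128-bit values that exceed 2^63-1 will overflow -- prefer the
# balances endpoint from step 5 for anything real.
BALANCE_HEX="0x00000000000000000000000000002710"
echo $(( BALANCE_HEX ))   # 10000 (microstacks)
```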
### Disabling proofs
Proofs, provided as hex-encoded strings, can be removed from the responses by setting the `proof` parameter:
```js
async function getAccountInfoWithoutProof() {
const accountInfo = await accounts.getAccountInfo({
principal: stacksAddress,
proof: 0,
});
return accountInfo;
}
```
## Step 4: Reviewing account history
The following step requires the account to have transactions associated with it. For simplicity, let's run the faucet for the new account:
```js
async function runFaucetStx() {
const faucets = new FaucetsApi(apiConfig);
const faucetTx = await faucets.runFaucetStx({
address: stacksAddress,
});
return faucetTx;
}
```
The API will respond with a new transaction ID and confirmation that the faucet run was successful:
```js
{
success: true,
txId: '0x5b3d9b47c8f0a3c161868c37d94977b3b0a507558a542fd9499b597bfc799d11',
txRaw: '80800000000400164247d6f2b425ac5771423ae6c80c754f717...'
}
```
-> Note: Wait a few minutes for the transaction to complete. You can review the status using the Explorer, by navigating to the following URL: `https://explorer.stacks.co/txid/<txid>`.
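Rather than refreshing the Explorer manually, you could poll until the transaction is confirmed. A hedged sketch using a generic helper — the `getStatus` callback is an assumption here and would wrap a call such as `TransactionsApi.getTransactionById` from the client library, returning the transaction's `tx_status` field:

```javascript
// Poll a status-returning function until it reports 'success',
// waiting `intervalMs` between attempts, up to `maxAttempts` tries.
async function waitForConfirmation(getStatus, intervalMs = 10000, maxAttempts = 30) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await getStatus();
    if (status === 'success') return status;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Transaction was not confirmed in time');
}
```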
Assuming the faucet transaction was successfully processed, you can review the account history. We are expecting at least one transaction to show up in the account history.
```js
async function getAccountTransactions() {
const history = await accounts.getAccountTransactions({
principal: stacksAddress,
});
return history;
}
```
The API will respond with a paginated list of transactions associated with the account:
```js
{
limit: 20,
offset: 0,
total: 1,
results: [
{
tx_id: '0x89ee63c0',
tx_type: 'token_transfer',
fee_rate: '180',
sender_address: 'STB44HYPYAT2BB2QE513NSP81HTMYWBJP02HPGK6',
sponsored: false,
post_condition_mode: 'deny',
tx_status: 'success',
block_hash: '0x167662a4e',
block_height: 2951,
burn_block_time: 1598910912,
burn_block_time_iso: '2020-08-31T21:55:12.000Z',
canonical: true,
tx_index: 1,
tx_result: {"hex":"0x03","repr":"true"},
token_transfer: {
recipient_address:"STW617CAFYNFQG6G470DNWW4V56XAY7125S3Z6RK",
amount:"500000",
memo:"0x466175636574000000"
},
events: [{ ... }]
}
]
}
```
Please review the [API reference](https://blockstack.github.io/stacks-blockchain-api/#operation/get_account_transactions) for property definitions and details.
### Handling pagination
To make API responses more compact, lists returned by the API are paginated. For lists, the response body includes:
| Parameter | Description | Default |
| --------- | ---------------------------------------------------------- | ------- |
| `limit` | The number of list items returned | 20 |
| `offset` | The number of elements skipped | 0 |
| `total` | The number of all available list items | 0 |
| `results` | Array of list items (length of array equals the set limit) | [] |
In order to paginate through the full result set, we can use the `limit` and `offset` request properties. Here is an example where we request transactions 50-100 for an account:
```js
async function getAccountTransactions() {
const history = await accounts.getAccountTransactions({
principal: stacksAddress,
limit: 50,
offset: 50,
});
return history;
}
```
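The pattern above can be generalized: advance `offset` by `limit` until all `total` items have been collected. A sketch with an injected page fetcher (names are illustrative; with the client library the fetcher would wrap `accounts.getAccountTransactions`):

```javascript
// fetchPage({ limit, offset }) must resolve to { total, results },
// mirroring the paginated response shape described above.
async function getAllResults(fetchPage, limit = 50) {
  const all = [];
  let offset = 0;
  let total = Infinity;
  while (offset < total) {
    const page = await fetchPage({ limit, offset });
    total = page.total;
    all.push(...page.results);
    offset += limit;
  }
  return all;
}
```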
## Step 5: Getting account balances
As mentioned above, any Stacks address can have a variety of tokens and associated balances. In order to get balances for all Stacks, fungible, and non-fungible tokens, we can use the `getAccountBalance` method:
```js
async function getAccountBalance() {
const balances = await accounts.getAccountBalance({
principal: stacksAddress,
});
return balances;
}
```
The API will respond with the following breakdown of token balances:
```js
{
stx: {
balance: '500000',
total_sent: '0',
total_received: '500000'
},
fungible_tokens: {},
non_fungible_tokens: {}
}
```
-> Note: The `balance` field does **not** represent full Stacks (STX) token, but micro-STX. 1,000,000 micro-STX are worth 1 Stacks (STX) token.
We can see that the current Stacks (STX) balance is `500000` micro-STX, or `0.5` Stacks (STX) token.
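A small conversion helper makes that relationship explicit (the function name is illustrative):

```javascript
const MICRO_STX_PER_STX = 1000000;

// Convert a micro-STX amount string (as returned by the API) to whole STX.
function microStxToStx(microStx) {
  return Number(microStx) / MICRO_STX_PER_STX;
}

console.log(microStxToStx('500000')); // 0.5
```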

5
src/pages/understand-stacks/microblocks.md

@@ -67,7 +67,7 @@ requires the transaction to be in a microblock, an anchor block, or in either.
### Transactions
Transactions include an option that controls if a miner should include them in microblocks or in anchor blocks. The
[anchor mode][] transaction option is an optional argument that controls whether a transaction must be included in
anchor mode transaction option is an optional argument that controls whether a transaction must be included in
an anchor block or a microblock, or is eligible for either.
### Mining
@@ -141,8 +141,7 @@ state as confirmed.
[proof-of-transfer consensus mechanism]: /understand-stacks/proof-of-transfer
[stacks block production model]: https://github.com/stacksgov/sips/blob/main/sips/sip-001/sip-001-burn-election.md#operation-as-a-leader
[mining microblocks]: /understand-stacks/mining#microblocks
[anchor mode]: /understand-stacks/transactions#anchor-mode
[anchormode]: https://stacks-js-git-master-blockstack.vercel.app/enums/transactions.anchormode.html
[stacks blockchain api guide]: /understand-stacks/stacks-blockchain-api#microblocks-support
[stacks blockchain api guide]: https://docs.hiro.so/getting-started/stacks-blockchain-api#microblocks-support
[provides an endpoint]: /stacks-blockchain-api#nonce-handling
[microblocks_api]: https://stacks-blockchain-api-git-feat-microblocks-blockstack.vercel.app/#tag/Microblocks

12
src/pages/understand-stacks/network.md

@@ -15,7 +15,7 @@ STX amounts should be stored as integers (8 bytes long), and represent the amoun
## Fees
Fees are used to incentivize miners to confirm transactions on the Stacks 2.0 blockchain. The fee is calculated based on the estimated fee rate and the size of the [raw transaction](/understand-stacks/transactions#serialization) in bytes. The fee rate is a market-determined variable. For the [testnet](/understand-stacks/testnet), it is set to 1 micro-STX.
Fees are used to incentivize miners to confirm transactions on the Stacks 2.0 blockchain. The fee is calculated based on the estimated fee rate and the size of the raw transaction in bytes. The fee rate is a market-determined variable. For the [testnet](/understand-stacks/testnet), it is set to 1 micro-STX.
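That calculation can be sketched directly — the fee is simply the rate multiplied by the serialized transaction size (the function name and sample size are illustrative):

```javascript
// fee (micro-STX) = fee rate (micro-STX per byte) * raw transaction size (bytes)
function estimateFee(feeRatePerByte, rawTxSizeBytes) {
  return feeRatePerByte * rawTxSizeBytes;
}

// On testnet the rate is 1 micro-STX, so a 180-byte transaction costs 180 micro-STX.
console.log(estimateFee(1, 180)); // 180
```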
Fee estimates can be obtained through the [`GET /v2/fees/transfer`](https://blockstack.github.io/stacks-blockchain-api/#operation/get_fee_transfer) endpoint:
@@ -46,9 +46,9 @@ Nonces are added to all transactions and help identify them in order to ensure t
-> The consensus mechanism also ensures that transactions aren't "replayed" in two ways. First, nodes query their unspent transaction outputs (UTXOs) in order to satisfy their spending conditions in a new transaction. Second, messages sent between nodes review sequence numbers.
When a new [token transfer transaction](/understand-stacks/transactions#stacks-token-transfer) is constructed, the most recent nonce of the account needs to be fetched and set.
When a new token transfer transaction is constructed, the most recent nonce of the account needs to be fetched and set.
-> The API provides an endpoint to [simplify nonce handling](/understand-stacks/stacks-blockchain-api#nonce-handling).
-> The API provides an endpoint to [simplify nonce handling](https://docs.hiro.so/get-started/stacks-blockchain-api#nonce-handling).
## Confirmations
@@ -82,7 +82,7 @@ The API will respond with the block time (in seconds):
Smart contracts can expose public function calls. For functions that make state modifications to the blockchain, transactions need to be generated and broadcasted.
However, for read-only function calls, transactions are **not** required. Instead, these calls can be done using the [Stacks Blockchain API](/understand-stacks/stacks-blockchain-api).
However, for read-only function calls, transactions are **not** required. Instead, these calls can be done using the [Stacks Blockchain API](https://docs.hiro.so/get-started/stacks-blockchain-api).
-> Read-only function calls do not require transaction fees
@@ -107,11 +107,11 @@ Sample response for a successful call:
}
```
-> To set the function call arguments and read the result, [Clarity values](/understand-stacks/transactions#clarity-value-types) need to be serialized into a hexadecimal string. The [Stacks Transactions JS](https://github.com/blockstack/stacks.js/tree/master/packages/transactions) library supports these operations
-> To set the function call arguments and read the result, [Clarity values](/write-smart-contracts/values) need to be serialized into a hexadecimal string. The [Stacks Transactions JS](https://github.com/blockstack/stacks.js/tree/master/packages/transactions) library supports these operations
## Querying
Stacks 2.0 network details can be queried using the [Stacks Blockchain API](/understand-stacks/stacks-blockchain-api).
Stacks 2.0 network details can be queried using the [Stacks Blockchain API](https://docs.hiro.so/get-started/stacks-blockchain-api).
### Health check

24
src/pages/understand-stacks/overview.md

@@ -73,13 +73,6 @@ Clarity is distinct from other languages designed for writing smart contracts in
[@page-reference | inline]
| /write-smart-contracts/overview
### Decentralized apps
Stacks 2.0 enables building decentralized apps that are user-owned and avoid centralized servers.
[@page-reference | grid]
| /build-apps/overview, /build-apps/guides/authentication, /build-apps/guides/transaction-signing, /build-apps/guides/data-storage
## Guides
Read one of our guides to understand the ins and outs of the Stacks 2.0 blockchain.
@@ -87,21 +80,4 @@ Read one of our guides to understand the ins and outs of the Stacks 2.0 blockcha
[@page-reference | grid-small]
| /understand-stacks/accounts, /understand-stacks/transactions, /understand-stacks/network, /understand-stacks/microblocks
## Try it out
Ready to get started with Stacks? Try one of our existing tutorials:
[@page-reference | grid-small]
| /understand-stacks/managing-accounts, /understand-stacks/sending-tokens, /understand-stacks/running-testnet-node, /understand-stacks/running-regtest-node, /understand-stacks/integrate-stacking
## Developer tooling
Developing on the Stacks blockchain is much simpler with our tooling.
- **Network health checker**: Check the status of the network with the [status checker](/understand-stacks/network#health-check)
- **Explorer**: View accounts, blocks, transactions, and smart contracts broadcasted to the Stacks blockchain using the [Stacks Explorer](https://explorer.stacks.co/)
- **API**: Read and interact with the blockchain and with smart contract using the [Stacks 2.0 Blockchain API](/understand-stacks/stacks-blockchain-api)
- **CLI**: Use the [Stacks CLI](/understand-stacks/command-line-interface) inside your terminal
- **Libraries**: Use the [Stacks Javascript libraries](https://blockstack.github.io/stacks.js/) to integrate with the Stacks blockchain
[comparison of the stacks blockchain to other blockchain technologies]: https://stacks.org/stacks-blockchain

36
src/pages/understand-stacks/regtest.md

@@ -1,36 +0,0 @@
---
title: Regtest
description: Test your smart contracts and apps
images:
large: /images/pages/regtest.svg
sm: /images/pages/regtest-sm.svg
---
## About regtest
The regtest is a separate blockchain from the Stacks mainnet analogous to a development environment. Similar to the testnet, it's a network used by developers to test their apps, smart contracts, or changes to the protocol in a production-like environment. However, it differs by producing a new BTC and STX block every 2 minutes, making it much more suitable for rapid development. The regtest is reset more regularly than testnet.
## Regtest nodes
If you would like to run your own regtest node, please follow these steps:
[@page-reference | inline]
| /understand-stacks/running-regtest-node
## Regtest API
The hosted [Stacks Blockchain API](/understand-stacks/stacks-blockchain-api) for the regtest is available at this base URL:
```shell
https://stacks-node-api.regtest.stacks.co/
```
### Faucet
The regtest faucet provides you with free Stacks Token (STX) to test with. These are not the same as STX on mainnet and have no value. You can get STX from the faucet on the [Stacks Explorer Sandbox](https://explorer.stacks.co/sandbox/faucet?chain=testnet), or using the [API](https://blockstack.github.io/stacks-blockchain-api/#tag/Faucets).
The Explorer does not yet list `regtest` as an available network by default. Before requesting STX tokens from the Explorer, you'll have to first add the `regtest` network to the Explorer by selecting Network, then "Add a network" in the top right. The URL of the regtest API is `https://stacks-node-api.regtest.stacks.co`.
Once completed, navigate to the "Faucet" tab from the link above and click the "Request STX" button. If you would like to get enough STX tokens to try out [Stacking](/understand-stacks/stacking), you should click "I want to stack."
> The Explorer Sandbox requires you to login with a Secret Key

303
src/pages/understand-stacks/running-api-node.md

@@ -1,303 +0,0 @@
---
title: Running an API node
description: Set up and run a local API node with Docker
icon: MainnetIcon
images:
large: /images/pages/mainnet.svg
sm: /images/pages/mainnet-sm.svg
---
## Introduction
This procedure demonstrates how to run a local API node using Docker images. There are several components that must be
configured and run in a specific order for the local API node to work.
For this procedure, the order in which the services are brought up is very important. In order to start the API node
successfully, you need to bring up the services in the following order:
1. `postgres`
2. `stacks-blockchain-api`
3. `stacks-blockchain`
When bringing down the API node, you should bring the services down in the exact reverse order in which they were
brought up, to avoid losing data.
-> This procedure focuses on Unix-like operating systems (Linux and MacOS). This procedure has not been tested on
Windows.
## Prerequisites
Running a node has no specialized hardware requirements. Users have been successful in running nodes on Raspberry Pi
boards and other system-on-chip architectures. In order to complete this procedure, you must have the following software
installed on the node host machine:
- [Docker](https://docs.docker.com/get-docker/)
- [curl](https://curl.se/download.html)
- [psql](http://postgresguide.com/utilities/psql.html) (_installed locally_)
- [jq](https://stedolan.github.io/jq/download/)
### Firewall configuration
In order for the API node services to work correctly, you must configure any network firewall rules to allow traffic on
the ports discussed in this section. The details of network and firewall configuration are highly specific to your
machine and network, so a detailed example isn't provided.
The following ports must be open on the host machine:
Ingress:
- postgres (open to `localhost` only):
- `5432 TCP`
- stacks-blockchain-api
- `3999 TCP`
- stacks-blockchain (open to `0.0.0.0/0`):
- `20443 TCP`
- `20444 TCP`
Egress:
- `8332`
- `8333`
- `20443-20444`
These egress ports are for syncing [`stacks-blockchain`][] and Bitcoin headers. If they're not open, the sync will fail.
## Step 1: initial setup
In order to run the API node, you must download the Docker images and create a directory structure to hold the
persistent data from the services. Download and configure the Docker images with the following commands:
```sh
docker pull blockstack/stacks-blockchain-api && docker pull blockstack/stacks-blockchain && docker pull postgres:alpine
docker network create stacks-blockchain > /dev/null 2>&1
```
Create a directory structure for the service data with the following command:
```sh
mkdir -p ./stacks-node/{persistent-data/postgres,persistent-data/stacks-blockchain,bns,config} && cd stacks-node
```
## Step 2: running Postgres
The `postgres:alpine` Docker container can be run with default settings. You must set the password for the user to
`postgres` with the `POSTGRES_PASSWORD` environment variable. The following command starts the image:
```sh
docker run -d --rm \
--name postgres \
--net=stacks-blockchain \
-e POSTGRES_PASSWORD=postgres \
-v $(pwd)/persistent-data/postgres:/var/lib/postgresql/data \
-p 5432:5432 \
postgres:alpine
```
You can verify the running Postgres instance on port `5432` with the command
```sh
docker ps --filter name=postgres
```
## Step 3: running Stacks blockchain API
The [`stacks-blockchain-api`][] image requires several environment variables to be set. To reduce the complexity of the
run command, you should create a new `.env` file and add the following to it using a text editor:
```
NODE_ENV=production
GIT_TAG=master
PG_HOST=postgres
PG_PORT=5432
PG_USER=postgres
PG_PASSWORD=postgres
PG_DATABASE=postgres
STACKS_CHAIN_ID=0x00000001
V2_POX_MIN_AMOUNT_USTX=90000000260
STACKS_CORE_EVENT_PORT=3700
STACKS_CORE_EVENT_HOST=0.0.0.0
STACKS_BLOCKCHAIN_API_PORT=3999
STACKS_BLOCKCHAIN_API_HOST=0.0.0.0
STACKS_BLOCKCHAIN_API_DB=pg
STACKS_CORE_RPC_HOST=stacks-blockchain
STACKS_CORE_RPC_PORT=20443
BNS_IMPORT_DIR=/bns-data
```
-> This guide configures the API to import BNS data with the `BNS_IMPORT_DIR` variable. To disable this import, comment
the line out by placing a `#` at the beginning of the line. If you leave the BNS import enabled, it may take several
minutes for the container to start while it imports the data.
The `PG_HOST` and `STACKS_CORE_RPC_HOST` variables define the container names for `postgres` and `stacks-blockchain`.
You may wish to alter those values if you have named those containers differently than this guide.
Start the [`stacks-blockchain-api`][] image with the following command:
```sh
docker run -d --rm \
--name stacks-blockchain-api \
--net=stacks-blockchain \
--env-file $(pwd)/.env \
-v $(pwd)/bns:/bns-data \
-p 3700:3700 \
-p 3999:3999 \
blockstack/stacks-blockchain-api
```
You can verify the running `stacks-blockchain-api` container with the command:
```sh
docker ps --filter name=stacks-blockchain-api
```
## Step 4: running Stacks blockchain
In order for the API to be functional, the [`stacks-blockchain-api`][] container must have data from a running
[`stacks-blockchain`][] instance. First create the `./config/Config.toml` file and add the following content to the
file using a text editor:
```toml
[node]
working_dir = "/root/stacks-node/data"
rpc_bind = "0.0.0.0:20443"
p2p_bind = "0.0.0.0:20444"
bootstrap_node = "02da7a464ac770ae8337a343670778b93410f2f3fef6bea98dd1c3e9224459d36b@seed-0.mainnet.stacks.co:20444,02afeae522aab5f8c99a00ddf75fbcb4a641e052dd48836408d9cf437344b63516@seed-1.mainnet.stacks.co:20444,03652212ea76be0ed4cd83a25c06e57819993029a7b9999f7d63c36340b34a4e62@seed-2.mainnet.stacks.co:20444"
wait_time_for_microblocks = 10000
[[events_observer]]
endpoint = "stacks-blockchain-api:3700"
retry_count = 255
events_keys = ["*"]
[burnchain]
chain = "bitcoin"
mode = "mainnet"
peer_host = "bitcoin.blockstack.com"
username = "blockstack"
password = "blockstacksystem"
rpc_port = 8332
peer_port = 8333
[connection_options]
read_only_call_limit_write_length = 0
read_only_call_limit_read_length = 100000
read_only_call_limit_write_count = 0
read_only_call_limit_read_count = 30
read_only_call_limit_runtime = 1000000000
```
The `[[events_observer]]` block configures the instance to send blockchain events to the API container that you
started previously.
Start the [`stacks-blockchain`][] container with the following command:
```sh
docker run -d --rm \
--name stacks-blockchain \
--net=stacks-blockchain \
-v $(pwd)/persistent-data/stacks-blockchain:/root/stacks-node/data \
-v $(pwd)/config:/src/stacks-node \
-p 20443:20443 \
-p 20444:20444 \
blockstack/stacks-blockchain \
/bin/stacks-node start --config /src/stacks-node/Config.toml
```
You can verify the running [`stacks-blockchain`][] container with the command:
```sh
docker ps --filter name=stacks-blockchain
```
## Step 5: verifying the services
You can now verify that each of the services is running and talking to the others.
To verify the database is ready:
1. Connect to the Postgres instance with the command `psql -h localhost -U postgres`. Use the password from the
`POSTGRES_PASSWORD` environment variable you set when running the container.
2. List current databases with the command `\l`
3. Disconnect from the database with the command `\q`
To verify the [`stacks-blockchain`][] tip height is progressing use the following command:
```sh
curl -sL localhost:20443/v2/info | jq
```
If the instance is running, you should receive terminal output similar to the following:
```json
{
"peer_version": 402653184,
"pox_consensus": "89d752034e73ed10d3b97e6bcf3cff53367b4166",
"burn_block_height": 666143,
"stable_pox_consensus": "707f26d9d0d1b4c62881a093c99f9232bc74e744",
"stable_burn_block_height": 666136,
"server_version": "stacks-node 2.0.11.1.0-rc1 (master:67dccdf, release build, linux [x86_64])",
"network_id": 1,
"parent_network_id": 3652501241,
"stacks_tip_height": 61,
"stacks_tip": "e08b2fe3dce36fd6d015c2a839c8eb0885cbe29119c1e2a581f75bc5814bce6f",
"stacks_tip_consensus_hash": "ad9f4cb6155a5b4f5dcb719d0f6bee043038bc63",
"genesis_chainstate_hash": "74237aa39aa50a83de11a4f53e9d3bb7d43461d1de9873f402e5453ae60bc59b",
"unanchored_tip": "74d172df8f8934b468c5b0af2efdefe938e9848772d69bcaeffcfe1d6c6ef041",
"unanchored_seq": 0,
"exit_at_block_height": null
}
```
Verify the [`stacks-blockchain-api`][] is receiving data from the [`stacks-blockchain`][] with the following command:
```sh
curl -sL localhost:3999/v2/info | jq
```
If the instance is configured correctly, you should receive terminal output similar to the following:
```json
{
"peer_version": 402653184,
"pox_consensus": "e472cadc17dcf3bc1afafc6aa595899e55f25b72",
"burn_block_height": 666144,
"stable_pox_consensus": "6a6fb0aa75a8acd4919f56c9c4c81ce5bc42cac1",
"stable_burn_block_height": 666137,
"server_version": "stacks-node 2.0.11.1.0-rc1 (master:67dccdf, release build, linux [x86_64])",
"network_id": 1,
"parent_network_id": 3652501241,
"stacks_tip_height": 61,
"stacks_tip": "e08b2fe3dce36fd6d015c2a839c8eb0885cbe29119c1e2a581f75bc5814bce6f",
"stacks_tip_consensus_hash": "ad9f4cb6155a5b4f5dcb719d0f6bee043038bc63",
"genesis_chainstate_hash": "74237aa39aa50a83de11a4f53e9d3bb7d43461d1de9873f402e5453ae60bc59b",
"unanchored_tip": "74d172df8f8934b468c5b0af2efdefe938e9848772d69bcaeffcfe1d6c6ef041",
"unanchored_seq": 0,
"exit_at_block_height": null
}
```
Once the API is running, you can use it to [interact with other API endpoints][`stacks-blockchain-api`].
## Stopping the API node
As discussed previously, if you want to bring down your API node, you must stop the services in the reverse order that
you started them. Performing the shutdown in this order ensures that you don't lose any data while shutting down
the node.
Use the following commands to stop the local API node:
```sh
docker stop stacks-blockchain
docker stop stacks-blockchain-api
docker stop postgres
```
## Additional reading
- [Running an API instance with Docker][] in the `stacks-blockchain-api` repo
- [Running an API instance from source][] in the `stacks-blockchain-api` repo
[running an api instance with docker]: https://github.com/blockstack/stacks-blockchain-api/blob/master/running_an_api.md
[running an api instance from source]: https://github.com/blockstack/stacks-blockchain-api/blob/master/running_api_from_source.md
[`stacks-blockchain`]: https://github.com/blockstack/stacks-blockchain
[`stacks-blockchain-api`]: https://github.com/blockstack/stacks-blockchain-api

2
src/pages/understand-stacks/running-mainnet-node.md

@@ -172,5 +172,5 @@ docker stop stacks-blockchain
- [Running an API instance with Docker][]
[running a testnet node with docker]: /understand-stacks/running-testnet-node
[running an api instance with docker]: /understand-stacks/running-api-node
[running an api instance with docker]: https://docs.hiro.so/get-started/running-api-node
[`stacks-blockchain`]: https://github.com/blockstack/stacks-blockchain

251
src/pages/understand-stacks/running-regtest-node.md

@@ -1,251 +0,0 @@
---
title: Running a regtest node
description: Learn how to set up and run a regtest node
icon: RegtestIcon
duration: 15 minutes
experience: beginners
tags:
- tutorial
images:
large: /images/cli.svg
sm: /images/cli.svg
---
## Introduction
-> Note: The Stacks 2.0 regtest is similar to the testnet; however, BTC and STX blocks are produced at a much faster rate of 1 block every 2 minutes, making it ideal for rapid development.
-> **Warning:** There is an [open issue](https://github.com/blockstack/stacks-blockchain/issues/2596) to address degraded performance when syncing a stacks-node with the regtest network. Until this issue is closed, syncing a node on the regtest network may take longer than syncing a testnet or mainnet node. If only using the [regtest API](https://stacks-node-api.regtest.stacks.co/extended/v1/status), this issue should not affect you.
This tutorial will walk you through the following steps:
- Download and install the node software
- Run the node against regtest
## Requirements
In order to run a node, some software and hardware requirements need to be considered.
### Hardware
Running a node has no specialized hardware requirements. People were successful at running a node on Raspberry Pis, for instance.
Minimum requirements are moving targets due to the nature of the project and some factors should be considered:
- compiling node sources locally requires computing and storage resources
- as the chain grows, the on-disk state will grow over time
With these considerations in mind, we suggest hardware based on a general-purpose specification, similar to [GCP E2 machine standard 2](https://cloud.google.com/compute/docs/machine-types#general_purpose) or [AWS EC2 t3.large standard](https://aws.amazon.com/ec2/instance-types/):
- 2 vCPUs
- 8 GB memory
- ~50-GB disk (preferably SSDs)
It is also recommended to run the node with a publicly routable IP, so that other peers in the network are able to connect to it.
### Software
If you use Linux, you may need to manually install [`libssl-dev`](https://wiki.openssl.org/index.php/Libssl_API) and other packages. In your command line, run the following to get all packages:
```bash
sudo apt-get install build-essential cmake libssl-dev pkg-config
```
Ensure that you have Rust installed. If you are using macOS, Linux, or another Unix-like OS, run the following. If you are on a different OS, follow the [official Rust installation guide](https://www.rust-lang.org/tools/install).
```bash
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
```
In case you just installed Rust, you will be prompted to run the following command to make the `cargo` command available:
```bash
source $HOME/.cargo/env
```
## Installing the node from pre-built binary
### Step 1: Get the distributable
Download and unzip the distributable which corresponds to your environment [from the latest release](https://github.com/blockstack/stacks-blockchain/releases/latest).
If you're running on Windows, [please follow our instructions for installing a node on Windows.](#running-the-regtest-node-on-windows)
### Step 2: Run the binary
To run the `stacks-node` binary, execute the following:
```bash
./stacks-node krypton
```
**Awesome. Your node is now connected to the regtest network.**
Your node will receive new blocks when they are produced, and you can use the [Stacks Node RPC API](/understand-stacks/stacks-blockchain-api#proxied-stacks-node-rpc-api-endpoints) to send transactions, fetch information for contracts and accounts, and more.
## Installing the node from source
You might want to build and install from source if there are some updates in the [main branch](https://github.com/blockstack/stacks-blockchain) which aren't yet released, or if there is no pre-built binary for your environment.
### Step 1: Install the node
Clone this repository:
```bash
git clone https://github.com/blockstack/stacks-blockchain.git; cd stacks-blockchain
```
Change the below values to reflect the version, branch, and git commit of the source code being built for accuracy:
```bash
# The following values are just an example
export STACKS_NODE_VERSION=2.0.9
export GIT_BRANCH=master
export GIT_COMMIT=e7f178b
```
Install the Stacks node by running:
```bash
cargo build --workspace --release --bin stacks-node
# binary will be in target/release/stacks-node
```
To install Stacks node with extra debugging symbols, run:
```bash
cargo build --workspace --bin stacks-node
# binary will be in target/debug/stacks-node
```
-> This process will take a few minutes to complete
### Step 2: Run the node
You're all set to run a node that connects to the regtest network.
If installed without debugging symbols, run:
```bash
target/release/stacks-node krypton
```
If installed with debugging symbols, run:
```bash
target/debug/stacks-node krypton
```
The first time you run this, you'll see some logs indicating that the Rust code is being compiled. Once that's done, you should see some logs that look something like this:
```bash
INFO [1588108047.585] [src/chainstate/stacks/index/marf.rs:732] First-ever block 0f9188f13cb7b2c71f2a335e3a4fc328bf5beb436012afca590b1a11466e2206
```
## Running the regtest node on Windows
### Prerequisites
Before you begin, check that you have the following software installed on your PC:
- [Microsoft C++ Build Tools](https://visualstudio.microsoft.com/visual-cpp-build-tools/).
-> **Tip**: While installing the Microsoft Visual Studio Build tools using the above link, select the C++ Build tools option when prompted.
![C++ Build Tools](/images/C++BuildTools.png)
- [NodeJs](https://nodejs.org/en/download/).
- [Git](https://git-scm.com/downloads).
#### Optional Dependencies
- [Python](https://www.python.org/downloads/).
- [Rust](https://www.rust-lang.org/tools/install).
### Download the Binary and run the follower node
-> **Note**: Please make sure to download the new Binary and follow the below steps as and when a [new release build](https://github.com/blockstack/stacks-blockchain/releases/latest) is available.
First, visit the [Stacks Github releases repo](https://github.com/blockstack/stacks-blockchain/releases/latest). From the binary list, click to download the Windows binary. Refer to the image below.
![BinaryList](/images/mining-windows.png)
Next, click on Save File and press **OK** in the popup window.
![Windowspopup](/images/mining-windows-popup.png)
Once saved, extract the binary. Open the command prompt **from the folder where the binary is extracted** and execute the below command:
```bash
stacks-node krypton
# This command will start the regtest follower node.
```
-> **Note**: While starting the node for the first time, Windows Defender will pop up with a message to allow access. If so, allow access to run the node.
![Windows Defender](/images/windows-defender.png)
To execute Stacks node with extra debugging enabled, run:
```bash
set RUST_BACKTRACE=full
set STACKS_LOG_DEBUG=1
stacks-node krypton
# This command will execute the binary and start the follower node with debug enabled.
```
The first time you run this, you'll see some logs indicating that the Rust code is being compiled. Once that's done, you should see some logs that look something like this:
```bash
INFO [1588108047.585] [src/chainstate/stacks/index/marf.rs:732] First-ever block 0f9188f13cb7b2c71f2a335e3a4fc328bf5beb436012afca590b1a11466e2206
```
**Awesome. Your node is now connected to the regtest network.**
## Optional: Running with Docker
Alternatively, you can run the regtest node with Docker.
-> Ensure you have [Docker](https://docs.docker.com/get-docker/) installed on your machine.
```bash
docker run -d \
--name stacks_follower \
-p 20443:20443 \
-p 20444:20444 \
blockstack/stacks-blockchain \
stacks-node krypton
```
-> To enable debug logging, add the ENV VARS `RUST_BACKTRACE="full"` and `STACKS_LOG_DEBUG="1"`.
You can review the node logs with this command:
```bash
docker logs -f stacks_follower
```
## Optional: Running in Kubernetes with Helm
You can also run a regtest node in a Kubernetes cluster using the [stacks-blockchain Helm chart](https://github.com/blockstack/stacks-blockchain/tree/master/deployment/helm/stacks-blockchain).
Ensure you have the following prerequisites installed on your machine:
- [minikube](https://minikube.sigs.k8s.io/docs/start/) (Only needed if standing up a local Kubernetes cluster)
- [kubectl](https://kubernetes.io/docs/tasks/tools/install-kubectl/)
- [helm](https://helm.sh/docs/intro/install/)
To install the chart with the release name `my-release` and run the node as a follower:
```bash
minikube start # Only run this if standing up a local Kubernetes cluster
helm repo add blockstack https://charts.blockstack.xyz
helm install my-release blockstack/stacks-blockchain
```
You can review the node logs with this command:
```bash
kubectl logs -l app.kubernetes.io/name=stacks-blockchain
```
For more information on the Helm chart and configuration options, please refer to the [chart's homepage](https://github.com/blockstack/stacks-blockchain/tree/master/deployment/helm/stacks-blockchain).
## Optional: Mining Stacks token
Mining is not currently available on the Stacks regtest.

src/pages/understand-stacks/running-testnet-node.md

- [Running an API instance with Docker][]
[running a mainnet node with docker]: /understand-stacks/running-mainnet-node
[running an api instance with docker]: https://docs.hiro.so/get-started/running-api-node
[`stacks-blockchain`]: https://github.com/blockstack/stacks-blockchain

src/pages/understand-stacks/sending-tokens.md

---
title: Sending tokens
description: Learn how to transfer tokens
icon: TestnetIcon
duration: 15 minutes
experience: beginners
tags:
- tutorial
images:
large: /images/pages/testnet.svg
sm: /images/pages/testnet-sm.svg
---
## Introduction
This tutorial walks you through the following steps:
- Specifying a sender key
- Generating a token transfer transaction
- Broadcasting the transaction to the network
- Checking transaction completion
- Confirming updated account balances (optional)
-> This tutorial is NodeJS-specific. If you would like to understand how to initiate a token transfer by constructing and broadcasting transactions using a different language/framework, please [review the transactions guide](/understand-stacks/transactions).
## Requirements
You will need [NodeJS](https://nodejs.org/en/download/) `8.12.0` or higher to complete this tutorial. You can verify your installation by opening your terminal and running the following command:
```bash
node --version
```
You should also complete the [Managing accounts tutorial](/understand-stacks/managing-accounts). The following steps assume we have access to an existing Stacks 2.0 account.
## Step 1: Installing libraries
First, install all the required libraries:
```bash
npm install --save @stacks/transactions bn.js @stacks/blockchain-api-client cross-fetch
```
-> The API client is generated from the [OpenAPI specification](https://github.com/blockstack/stacks-blockchain-api/blob/master/docs/openapi.yaml) ([openapi-generator](https://github.com/OpenAPITools/openapi-generator)). Many other languages and frameworks are supported by the generator.
## Step 2: Specifying a sender
In order to build and sign transactions, you will need a Stacks private key. You can easily generate a new, random Stacks 2.0 sender key (see ["Generating an account" from the previous tutorial](/understand-stacks/managing-accounts#step-2-generating-an-account)).
For this tutorial, we will use an existing Stacks account and instantiate the key object from a private key string:
```js
import fetch from 'cross-fetch';
const BN = require('bn.js');
const {
makeSTXTokenTransfer,
createStacksPrivateKey,
broadcastTransaction,
estimateTransfer,
getNonce,
privateKeyToString,
} = require('@stacks/transactions');
const { StacksTestnet, StacksMainnet } = require('@stacks/network');
const { TransactionsApi, Configuration } = require('@stacks/blockchain-api-client');
const apiConfig = new Configuration({
fetchApi: fetch,
// for mainnet, replace `testnet` with `mainnet`
basePath: 'https://stacks-node-api.testnet.stacks.co',
});
const key = 'edf9aee84d9b7abc145504dde6726c64f369d37ee34ded868fabd876c26570bc01';
const senderKey = createStacksPrivateKey(key);
```
-> Note: The code above also imports methods required for the next steps, including API configuration for the client library usage.
## Step 3: Generating transaction
To generate a token transfer transaction, we will be using the `makeSTXTokenTransfer()` transaction builder function:
```js
const recipient = 'SP3FGQ8Z7JY9BWYZ5WM53E0M9NK7WHJF0691NZ159';
// amount of Stacks (STX) tokens to send (in micro-STX). 1,000,000 micro-STX are worth 1 Stacks (STX) token
const amount = new BN(1000000);
// skip automatic fee estimation
const fee = new BN(2000);
// skip automatic nonce lookup
const nonce = new BN(0);
// override default setting to broadcast to the Testnet network
// for mainnet, use `StacksMainnet()`
const network = new StacksTestnet();
const memo = 'hello world';
const txOptions = {
recipient,
amount,
fee,
nonce,
senderKey: privateKeyToString(senderKey),
network,
memo,
};
...
const transaction = await makeSTXTokenTransfer(txOptions);
```
The generation method will need a few more pieces of information, as specified in the `txOptions` object:
| Parameter | Description | Optional |
| ------------------ | -------------------------------------------------------------------------------------------------------------------------------- | -------- |
| `recipient`        | The recipient Stacks address in c32check format                                                                                    | **No**   |
| `amount` | The amount of Stacks tokens to send denominated in microstacks | **No** |
| `fee` | The fee that the sender is willing to pay for miners to process the transaction. Denominated in microstacks | Yes |
| `nonce` | A nonce is an integer that needs to be incremented by 1 for each sequential transaction from the same account. Nonces start at 0 | Yes |
| `senderKey` | A private key object | Yes |
| `network` | Specifies whether the transaction is meant for Stacks Mainnet or Testnet | Yes |
| `memo` | A memo string to attach additional information to the transaction. This data is limited to 33 bytes | Yes |
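Since `amount` and `fee` are denominated in micro-STX, a small conversion helper can prevent unit mistakes. The helper below is purely illustrative and not part of `@stacks/transactions`:

```javascript
// Illustrative helper: 1 STX = 1,000,000 micro-STX
const MICROSTX_IN_STX = 1000000;

function stxToMicroStx(stx) {
  // round to avoid floating-point artifacts for fractional STX amounts
  return Math.round(stx * MICROSTX_IN_STX);
}
```

For example, `new BN(stxToMicroStx(1))` yields the 1,000,000 micro-STX `amount` used above.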
### Estimating fees
If not specified, the transaction builder will automatically estimate the fee. The estimated fee rate is supplied by a Stacks node, so network access is required.
-> Learn more about fees in the [network guide](/understand-stacks/network#fees)
Another way to estimate the fee is to use the `estimateTransfer()` function after you have constructed a transaction:
```js
// get fee
const feeEstimate = await estimateTransfer(transaction);
// set fee manually
transaction.setFee(feeEstimate);
```
-> Note: By setting a fee in the transaction builder function, the automatic fee estimation step will be skipped.
### Handling nonces
If not specified, the transaction builder will automatically look up the latest nonce for the sender account. Automatic nonce handling also requires network access. The nonce should be tracked locally when creating multiple sequential transactions from the same account. A Stacks node only updates the nonce once a transaction has been mined.
The updated nonce for each account can be retrieved manually using the `getNonce()` function:
```js
const senderAddress = 'SJ2FYQ8Z7JY9BWYZ5WM53SKR6CK7WHJF0691NZ942';
const senderNonce = await getNonce(senderAddress);
```
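When issuing several transactions back-to-back, the network-reported nonce lags behind, so a local counter can be seeded once from `getNonce()` and incremented per transaction. The tracker below is an illustrative sketch, not a `@stacks/transactions` API:

```javascript
// Illustrative local nonce tracking: seed from the network once,
// then increment for each transaction built afterwards.
function createNonceTracker(initialNonce) {
  let next = initialNonce;
  return () => next++; // returns the nonce to use, then advances
}

// const nextNonce = createNonceTracker(Number(await getNonce(senderAddress)));
// build tx 1 with nonce nextNonce(), tx 2 with nextNonce(), ...
```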
## Step 4: Broadcasting transaction
Next, we will broadcast the transaction to the Testnet using the `network` object we created earlier:
```js
const broadcastResponse = await broadcastTransaction(transaction, network);
const txId = broadcastResponse.txid;
```
As soon as `broadcastTransaction` completes, a JSON object with the transaction ID (`txid`) is returned.
~> Keep in mind that the existence of a transaction ID does not mean the transaction has been successfully processed. Please review the [transaction lifecycle](/understand-stacks/transactions#lifecycle) for more details.
### Serializing transactions
In case you would like to inspect the raw serialized transaction, you can call the `serialize()` method:
```js
const serializedTx = transaction.serialize().toString('hex');
```
## Step 5: Checking completion
With the transaction ID, we can check the status of the transaction. Every transaction needs to be confirmed by the network and will be `pending` as soon as it is broadcasted.
-> Note: A transaction is complete once it is confirmed and its status changes to `success`. Most transactions will be pending for several minutes before being confirmed. You should implement polling in your app to refresh the status display.
```js
const transactions = new TransactionsApi(apiConfig);
const txInfo = await transactions.getTransactionById({
txId,
});
console.log(txInfo);
```
The API will respond with transaction details, including the `tx_status` property:
```js
{
tx_id: '0x5f5318',
tx_type: 'token_transfer',
fee_rate: '180',
sender_address: 'STB44HYPYAT2BB2QE513NSP81HTMYWBJP02HPGK6',
sponsored: false,
post_condition_mode: 'deny',
tx_status: 'success',
block_hash: '0xe9b93259',
block_height: 2977,
burn_block_time: 1598915954,
burn_block_time_iso: '2020-08-31T23:19:14.000Z',
canonical: true,
tx_index: 1,
tx_result: { hex: '0x03', repr: 'true' },
token_transfer: {
recipient_address: 'ST9SW39M98MZXBGWSDVN228NW1NWENWCF321GWMK',
amount: '500000',
memo: '0x4661756'
},
events: [ { event_index: 0, event_type: 'stx_asset', asset: [ ... ] } ]
}
```
For all property formats and details, please review the [API reference](https://blockstack.github.io/stacks-blockchain-api/#operation/get_transaction_by_id).
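The polling suggested above can be sketched as a small helper that re-fetches the transaction until its status leaves `pending`. The final status values assumed here (`success`, `abort_by_response`, `abort_by_post_condition`) and the helper names are illustrative, not part of the client library:

```javascript
// Statuses after which a transaction will no longer change (assumed set).
const FINAL_STATUSES = ['success', 'abort_by_response', 'abort_by_post_condition'];

function isFinalStatus(txStatus) {
  return FINAL_STATUSES.includes(txStatus);
}

// Polls the API every `intervalMs` until the transaction is no longer pending.
async function waitForConfirmation(transactionsApi, txId, intervalMs = 10000) {
  for (;;) {
    const txInfo = await transactionsApi.getTransactionById({ txId });
    if (isFinalStatus(txInfo.tx_status)) return txInfo;
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
}
```

For example, `await waitForConfirmation(transactions, txId)` reuses the `TransactionsApi` instance created earlier.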
## Step 6: Confirming balance (optional)
Now that the token transfer is confirmed, we can verify the new account balance on the sender address by [following the "Getting account balances" steps from the previous tutorial](/understand-stacks/managing-accounts#step-5-getting-account-balances).
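When comparing balances, keep in mind that the sender's balance drops by the transfer amount plus the fee. A minimal sketch of that arithmetic, with micro-STX amounts as strings to match the API's balance format (the helper name is illustrative):

```javascript
// Sender balance after a transfer = starting balance - amount - fee,
// all denominated in micro-STX (string-encoded, as the API returns them).
function expectedBalanceAfterTransfer(startingBalance, amount, fee) {
  return (BigInt(startingBalance) - BigInt(amount) - BigInt(fee)).toString();
}
```

For the values used in this tutorial (1,000,000 micro-STX sent with a 2,000 micro-STX fee), a starting balance of `'90000000000000'` becomes `'89999998998000'`.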

src/pages/understand-stacks/stacking-using-CLI.md

---
title: Stacking using the Stacks CLI
description: Learn how to use the Stacks CLI to participate in Stacking
experience: beginner
duration: 10 minutes
tags:
- tutorial
images:
sm: /images/pages/stacking-rounded.svg
---
## Introduction
!> The Stacking implementation is still in development and could change in the coming weeks
In this tutorial, you'll learn how to use the Stacks CLI to participate in Stacking. The CLI is a great way to quickly try out Stacking on testnet. To integrate Stacking into your application, see the [Stacking integration guide](/stacks-blockchain/integrate-stacking).
## Requirements
First, you'll need to understand the [Stacking mechanism](/stacks-blockchain/stacking).
You'll also need [NodeJS](https://nodejs.org/en/download/) `12.10.0` or higher to complete this tutorial. You can verify your installation by opening your terminal and running the following command:
```bash
node --version
```
You will also need to install the Stacks CLI from NPM:
```bash
npm install @stacks/cli -g
```
## Generate An Account
```bash
stx make_keychain -t
```
```json
{
"mnemonic": "turn food juice small swing junior trip crouch slot wood nephew own tourist hazard tomato follow trust just project traffic spirit oil diary blue",
"keyInfo": {
"privateKey": "dca82f838f6e5a893cffc8efe861196252373a5b8b62c0b55ba3a0a7a28795d301",
"address": "ST1P3HXR80TKT48TKM2VTKCDBS4ET9396W0W2S3K8",
"btcAddress": "mqPBWmSGJhrA9x5XJC6qJtFsHnudqZ2XJU",
"index": 0
}
}
```
We'll be using this testnet key pair to perform Stacking. But first we'll need to get some testnet tokens.
-> If you're Stacking on mainnet, make sure you have an account with a sufficient number of Stacks tokens to participate.
## Get Testnet Tokens Using The Faucet
Use the following `curl` command to request tokens from the testnet node's faucet endpoint.
We use the address generated above as a parameter.
```bash
curl -X POST "https://stacks-node-api.xenon.blockstack.org/extended/v1/faucets/stx?address=ST1P3HXR80TKT48TKM2VTKCDBS4ET9396W0W2S3K8&stacking=true"
```
## Check Balance
Confirm that the faucet transaction has completed by checking the balance of your address. The `-t` flag is used to indicate testnet. See the [CLI reference](/references/stacks-cli) for usage of flags.
```bash
stx balance ST1P3HXR80TKT48TKM2VTKCDBS4ET9396W0W2S3K8 -t
{
"balance": "90000000000000",
"locked": "0",
"unlock_height": 0,
"nonce": 0
}
```
## Check Stacking Eligibility
Before we send the Stacking transaction, we will need to check if we're eligible for Stacking.
This check ensures that we meet the minimum threshold amount of STX required in order to participate in Stacking.
The arguments required for the `can_stack` command are:
| Parameter | Description | Value |
| --------------- | ------------------------------------------------------------------------------------- | ------------------------------------------- |
| `Amount` | Amount to stack in microstacks, we'll use the entire balance in our account | `90000000000000` |
| `Reward cycles` | Number of reward cycles to lock up your tokens for Stacking | `10` |
| `BTC Address` | BTC address to receive Stacking rewards. We can use any valid BTC address | `mqkccNX5h7Xy1YUku3X2fCFCC54x6HEiHk` |
| `STX Address` | The address that we will be Stacking with. We'll use the address generated previously | `ST1P3HXR80TKT48TKM2VTKCDBS4ET9396W0W2S3K8` |
```bash
stx can_stack 90000000000000 10 mqkccNX5h7Xy1YUku3X2fCFCC54x6HEiHk ST1P3HXR80TKT48TKM2VTKCDBS4ET9396W0W2S3K8 -t
{ eligible: true }
```
If we meet the conditions to participate in Stacking, the command will return true.
## Perform Stacking action
Next, we will perform the Stacking transaction using the `stack` command.
We need the following four arguments:
| Parameter | Description | Value |
| --------------- | ------------------------------------------------------------------------------------- | -------------------------------------------------------------------- |
| `Amount` | Amount to stack in microstacks | `90000000000000` |
| `Reward cycles` | Number of reward cycles to lock up your tokens for Stacking | `10` |
| `BTC Address` | BTC address to receive Stacking rewards. This is also referred to as the PoX address. | `mqkccNX5h7Xy1YUku3X2fCFCC54x6HEiHk` |
| `Private Key` | The private key to the address we're Stacking with, which we generated previously | `dca82f838f6e5a893cffc8efe861196252373a5b8b62c0b55ba3a0a7a28795d301` |
```bash
stx stack 90000000000000 10 mqkccNX5h7Xy1YUku3X2fCFCC54x6HEiHk dca82f838f6e5a893cffc8efe861196252373a5b8b62c0b55ba3a0a7a28795d301 -t
{
txid: '0x2e33ad647a9cedacb718ce247967dc705bc0c878db899fdba5eae2437c6fa1e1',
transaction: 'https://explorer.stacks.co/txid/0x2e33ad647a9cedacb718ce247967dc705bc0c878db899fdba5eae2437c6fa1e1'
}
```
If the command completes successfully, we'll get back a TXID and a URL to the block explorer showing the transaction status.
Now we should wait until the transaction is confirmed. This usually takes a few minutes.
## Checking Stacking Status
Once the transaction has been confirmed, we can check the Stacking status using the `stacking_status` command:
```bash
stx stacking_status ST1P3HXR80TKT48TKM2VTKCDBS4ET9396W0W2S3K8 -t
{
amount_microstx: '90000000000000',
first_reward_cycle: 25,
lock_period: 10,
unlock_height: 3960,
pox_address: {
version: '00',
hashbytes: '05cf52a44bf3e6829b4f8c221cc675355bf83b7d'
}
}
```
Here we can see how many microstacks are locked and when they will unlock.
Congratulations, you have learned how to Stack using the CLI. To integrate Stacking into your app, check out the [Stacking integration guide](/stacks-blockchain/integrate-stacking).

src/pages/understand-stacks/stacking.md

The Stacking flow is different for delegation use cases:
- Certain delegation relationships may allow the STX holder to receive the payout directly from the miner (step 5/6)
- The termination of the delegation relationship can either happen automatically based on set expiration rules or by actively revoking delegation rights
If you would like to implement this flow in your own wallet, exchange, or any other application, please have a look at this tutorial:
[@page-reference | inline]
| /understand-stacks/integrate-stacking-delegation
## PoX mining
PoX mining is a modification of Proof-of-Burn (PoB) mining, where instead of sending the committed Bitcoin to a burn address, it's transferred to eligible STX holders that participate in the stacking protocol.

src/pages/understand-stacks/stacks-blockchain-api.md

The Stacks 2.0 Blockchain API allows you to query the Stacks 2.0 blockchain and interact with smart contracts. It was built to maintain pageable materialized views of the Stacks 2.0 Blockchain.
~> The RESTful API is developed by Hiro. Hiro also hosts a public API node for easy onboarding. Using it requires you to trust the hosted server, but provides a faster onboarding experience. You can [run your own API server](https://docs.hiro.so/get-started/running-api-node)
The RESTful JSON API can be used without any authorization. The basepath for the API is:
```bash
https://stacks-node-api.testnet.stacks.co/
```
-> Check out the [API references](https://blockstack.github.io/stacks-blockchain-api/) for more details
The API comprises two parts: the Stacks Blockchain API and the Stacks Node RPC API. The Node RPC API is exposed by every running node. The Stacks Blockchain API, however, introduces additional functionality (e.g. get all transactions). It also proxies calls directly to the Stacks Node RPC API.
-> This documentation only covers endpoints that are exposed on a Stacks node, referred to as the RPC API. For full documentation on the RESTful API, check out the [Hiro's API reference](https://docs.hiro.so/api).
### Stacks Node RPC API
The [stacks-node implementation](https://github.com/blockstack/stacks-blockchain) exposes JSON RPC endpoints.
All `/v2/` routes are proxied to a Blockstack PBC-hosted Stacks Node. For a trustless architecture, you should make these requests to a self-hosted node.
### Stacks Blockchain API
All `/extended/` routes are provided by the Stacks 2.0 Blockchain API directly. They extend the Stacks Node API capabilities to make it easier to integrate with.
## Using the API
Depending on your programming environment, you might need to access the API differently.
The easiest way to start interacting with the API might be through the [Postman Collection](https://app.getpostman.com/run-collection/614feab5c108d292bffa#?env%5BStacks%20Blockchain%20API%5D=W3sia2V5Ijoic3R4X2FkZHJlc3MiLCJ2YWx1ZSI6IlNUMlRKUkhESE1ZQlE0MTdIRkIwQkRYNDMwVFFBNVBYUlg2NDk1RzFWIiwiZW5hYmxlZCI6dHJ1ZX0seyJrZXkiOiJibG9ja19pZCIsInZhbHVlIjoiMHgiLCJlbmFibGVkIjp0cnVlfSx7ImtleSI6Im9mZnNldCIsInZhbHVlIjoiMCIsImVuYWJsZWQiOnRydWV9LHsia2V5IjoibGltaXRfdHgiLCJ2YWx1ZSI6IjIwMCIsImVuYWJsZWQiOnRydWV9LHsia2V5IjoibGltaXRfYmxvY2siLCJ2YWx1ZSI6IjMwIiwiZW5hYmxlZCI6dHJ1ZX0seyJrZXkiOiJ0eF9pZCIsInZhbHVlIjoiMHg1NDA5MGMxNmE3MDJiNzUzYjQzMTE0ZTg4NGJjMTlhODBhNzk2MzhmZDQ0OWE0MGY4MDY4Y2RmMDAzY2RlNmUwIiwiZW5hYmxlZCI6dHJ1ZX0seyJrZXkiOiJjb250cmFjdF9pZCIsInZhbHVlIjoiU1RKVFhFSlBKUFBWRE5BOUIwNTJOU1JSQkdRQ0ZOS1ZTMTc4VkdIMS5oZWxsb193b3JsZFxuIiwiZW5hYmxlZCI6dHJ1ZX0seyJrZXkiOiJidGNfYWRkcmVzcyIsInZhbHVlIjoiYWJjIiwiZW5hYmxlZCI6dHJ1ZX0seyJrZXkiOiJjb250cmFjdF9hZGRyZXNzIiwidmFsdWUiOiJTVEpUWEVKUEpQUFZETkE5QjA1Mk5TUlJCR1FDRk5LVlMxNzhWR0gxIiwiZW5hYmxlZCI6dHJ1ZX0seyJrZXkiOiJjb250cmFjdF9uYW1lIiwidmFsdWUiOiJoZWxsb193b3JsZCIsImVuYWJsZWQiOnRydWV9LHsia2V5IjoiY29udHJhY3RfbWFwIiwidmFsdWUiOiJzdG9yZSIsImVuYWJsZWQiOnRydWV9LHsia2V5IjoiY29udHJhY3RfbWV0aG9kIiwidmFsdWUiOiJnZXQtdmFsdWUiLCJlbmFibGVkIjp0cnVlfV0=) or [cURL](https://curl.haxx.se/).
-> Postman allows you to [generate sample code](https://learning.postman.com/docs/sending-requests/generate-code-snippets/) for API requests for various languages and libraries
## OpenAPI spec
The API was designed using the [OpenAPI specification](https://swagger.io/specification/), making it compatible with a variety of developer tools.
Thanks to this design choice, we were able to generate parts of our Javascript client library purely from the specification file. The client generation is done using the [openapi-generator](https://github.com/OpenAPITools/openapi-generator).
-> The client generator supports a variety of languages and might be helpful if you are looking to integrate the API using a language other than Javascript
## Javascript client library
A generated JS client is available for consuming this API. The client library enables typesafe REST and WebSocket communication. [Please review the client documentation for more details](https://blockstack.github.io/stacks-blockchain-api/client/index.html).
The client is made up of three components:
1. Generated HTTP API client
2. Typescript definitions for [Clarity values](https://docs.blockstack.org/write-smart-contracts/values)
3. WebSocket client
The following chapters will demonstrate the usage of each component.
### HTTP API client sample
It is important to note that the JS client requires setting the underlying HTTP request library that will handle HTTP communication. The example below uses the universal fetch API [`cross-fetch`](https://github.com/lquixada/cross-fetch):
```js
import fetch from 'cross-fetch';
import { Configuration, AccountsApi } from '@stacks/blockchain-api-client';
(async () => {
const apiConfig = new Configuration({
fetchApi: fetch,
// for mainnet, replace `testnet` with `mainnet`
basePath: 'https://stacks-node-api.testnet.stacks.co', // defaults to http://localhost:3999
});
// initiate the /accounts API with the basepath and fetch library
const accountsApi = new AccountsApi(apiConfig);
// get transactions for a specific account
const txs = await accountsApi.getAccountTransactions({
principal: 'ST000000000000000000002AMW42H',
});
console.log(txs);
})().catch(console.error);
```
### TypeScript sample
The following sample demonstrates how generated [Typescript models](https://github.com/blockstack/stacks-blockchain-api/tree/master/client/src/generated/models) can be used to ensure typesafety:
```ts
import fetch from 'cross-fetch';
import {
Configuration,
AccountsApi,
AccountsApiInterface,
AddressBalanceResponse,
AddressBalanceResponseStx,
} from '@stacks/blockchain-api-client';
(async () => {
const apiConfig: Configuration = new Configuration({
fetchApi: fetch,
// for mainnet, replace `testnet` with `mainnet`
basePath: 'https://stacks-node-api.testnet.stacks.co', // defaults to http://localhost:3999
});
const principal: string = 'ST000000000000000000002AMW42H';
// initiate the /accounts API with the basepath and fetch library
const accountsApi: AccountsApiInterface = new AccountsApi(apiConfig);
// get balance for a specific account
const balance: AddressBalanceResponse = await accountsApi.getAccountBalance({
principal,
});
// get STX balance details
const stxAmount: AddressBalanceResponseStx = balance.stx;
console.log(stxAmount);
})().catch(console.error);
```
### WebSocket sample
The WebSocket components enable you to subscribe to specific updates, allowing a near real-time display of updates on transactions and accounts.
```js
import { connectWebSocketClient } from '@stacks/blockchain-api-client';
const client = await connectWebSocketClient('ws://stacks-node-api.blockstack.org/');
const sub = await client.subscribeAddressTransactions(contractCall.txId, event => {
console.log(event);
});
await sub.unsubscribe();
```
## Rate limiting
Rate limiting is only applied to [faucet requests](https://blockstack.github.io/stacks-blockchain-api/#tag/Faucets) and is based on the address that tokens are requested for.
### BTC Faucet
The bitcoin faucet is limited to **5 requests per 5 minutes**.
### STX Faucet
The Stacks faucet rate limits depend on the type of request. For stacking requests, the limit is **1 request per 2 days**. For regular Stacks faucet requests, the limit is **5 requests per 5 minutes**.
## Pagination
To make API responses more compact, lists returned by the API are paginated. For lists, the response body includes:
- `limit`: the number of list items returned per response
- `offset`: the number of elements to skip (starting from 0)
- `total`: the number of all available list items
- `results`: the array of list items (length of array equals the set limit)
Here is a sample response:
```json
{
"limit": 10,
"offset": 0,
"total": 101922,
"results": [{
"tx_id": "0x924e0a688664851f5f96b437fabaec19b7542cfcaaf92a97eae43384cacd83d0",
"nonce": 308,
"fee_rate": "0",
"sender_address": "ST39F7SA0AKH7RB363W3NE2DTHD3P32ZHNX2KE7J9",
"sponsored": false,
"post_condition_mode": "deny",
"post_conditions": [],
"anchor_mode": "on_chain_only",
"block_hash": "0x17ceb3da5f36aab351d6b14f5aa77f85bb6b800b954b2f24c564579f80116d99",
"parent_block_hash": "0xe0d1e8d216a77526ae2ce40294fc77038798a179a6532bb8980d3c2183f58de6",
"block_height": 14461,
"burn_block_time": 1622875042,
"burn_block_time_iso": "2021-06-05T06:37:22.000Z",
"canonical": true,
"tx_index": 0,
"tx_status": "success",
"tx_result": { ... },
"microblock_hash": "",
"microblock_sequence": 2147483647,
"microblock_canonical": true,
"event_count": 0,
"events": [],
"tx_type": "coinbase",
"coinbase_payload": { ... }
},
{ ... }
]
}
```
Using the `limit` and `offset` properties, you can paginate through the entire list by increasing the offset by the limit until you reach the total.
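The paging loop can be wrapped in a helper. This sketch assumes the `accountsApi` instance from the client example above and passes `limit`/`offset` straight through to the endpoint:

```javascript
// Fetch every page of account transactions by advancing `offset`
// until it reaches `total`.
async function fetchAllTransactions(accountsApi, principal, limit = 50) {
  const all = [];
  let offset = 0;
  for (;;) {
    const page = await accountsApi.getAccountTransactions({ principal, limit, offset });
    all.push(...page.results);
    offset += page.results.length;
    if (page.results.length === 0 || offset >= page.total) break;
  }
  return all;
}
```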
## Requesting proofs
Several endpoints will by default request the [MARF Merkle Proof](https://github.com/stacksgov/sips/blob/main/sips/sip-004/sip-004-materialized-view.md#marf-merkle-proofs).
Provided with the proof, a client can verify the value, cumulative energy spent, and the number of confirmations for the response value provided by the API.
Requesting the proof requires more resources (computation time, response time, and response body size). To avoid these additional resources when verification is not required, API endpoints allow setting the request parameter `proof=0`. The returned response object will not have any proof fields.
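For example, requesting account data from the Node RPC API with or without a proof differs only in the query string. The URL builder below is an illustrative sketch around the `GET /v2/accounts/<principal>` endpoint:

```javascript
// Build the account-info URL; `proof=0` skips the MARF Merkle proof.
function accountInfoUrl(basePath, principal, withProof = false) {
  return `${basePath}/v2/accounts/${principal}?proof=${withProof ? 1 : 0}`;
}

// e.g. accountInfoUrl('https://stacks-node-api.testnet.stacks.co', 'ST000000000000000000002AMW42H')
```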
## Searching
The API provides a search endpoint ([`/extended/v1/search/{id}`](https://blockstack.github.io/stacks-blockchain-api/#operation/search_by_id)) that takes an identifier and responds with matching blocks, transactions, contracts, or accounts.
The search operation used by the endpoint (e.g. `FROM txs WHERE tx_id = $1 LIMIT 1`) matches hashes **equal** to the provided identifier. Fuzzy search, incomplete identifiers, or wildcards will not return any matches.
## Using Clarity values
Some endpoints, like the [read-only function contract call](https://blockstack.github.io/stacks-blockchain-api/#operation/call_read_only_function), require input to be a serialized [Clarity value](https://docs.blockstack.org/write-smart-contracts/values). Other endpoints return serialized values that need to be deserialized.
Below is an example for Clarity value usage in combination with the API.
-> The example below is for illustration only. The `@stacks/transactions` library supports typed contract calls and makes [response value utilization much simpler](/write-smart-contracts/values#utilizing-clarity-values-from-transaction-responses)
```ts
import {
Configuration,
SmartContractsApiInterface,
SmartContractsApi,
ReadOnlyFunctionSuccessResponse,
} from '@stacks/blockchain-api-client';
import { uintCV, UIntCV, cvToHex, hexToCV, ClarityType } from '@stacks/transactions';
(async () => {
const apiConfig: Configuration = new Configuration({
fetchApi: fetch,
// for mainnet, replace `testnet` with `mainnet`
basePath: 'https://stacks-node-api.testnet.stacks.co', // defaults to http://localhost:3999
});
const contractsApi: SmartContractsApiInterface = new SmartContractsApi(apiConfig);
const principal: string = 'ST000000000000000000002AMW42H';
// use most recent from: https://stacks-node-api.<mainnet/testnet>.stacks.co/v2/pox
const rewardCycle: UIntCV = uintCV(22);
// call a read-only function
const fnCall: ReadOnlyFunctionSuccessResponse = await contractsApi.callReadOnlyFunction({
contractAddress: principal,
contractName: 'pox',
functionName: 'is-pox-active',
readOnlyFunctionArgs: {
sender: principal,
arguments: [cvToHex(rewardCycle)],
},
});
console.log({
status: fnCall.okay,
result: fnCall.result,
representation: hexToCV(fnCall.result).type === ClarityType.BoolTrue,
});
})().catch(console.error);
```
## Error handling
The API can respond with two different error types:
- For URLs that don't match any defined endpoint, an HTTP 404 is returned. The body contains the requested URL (as a string)
- For invalid input values (URL/body parameters), an HTTP 500 is returned. The body is a JSON object with an `error` property. The object also includes a stack trace (`stack`) and an error UUID (`errorTag`)
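A client can branch on these two documented shapes. The helper below is a hypothetical sketch (the names are not part of the API client):

```javascript
// Map the two documented API error responses to a uniform result.
function classifyApiError(status, body) {
  if (status === 404) return { kind: 'not_found', url: body };
  if (status === 500) return { kind: 'invalid_input', message: body.error, tag: body.errorTag };
  return { kind: 'unexpected', status };
}
```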
## Proxied Stacks Node RPC API endpoints
The Stacks 2.0 Blockchain API is centrally hosted. However, every running Stacks node exposes an RPC API, which allows you to interact with the underlying blockchain. Instead of using a centrally hosted API, you can directly access the RPC API of a locally hosted Node.
-> The Stacks Blockchain API proxies to Node RPC endpoints
While the Node RPC API doesn't give the same functionality as the hosted Stacks Blockchain API, the following endpoints are available:
- [GET /v2/info](https://blockstack.github.io/stacks-blockchain-api/#operation/get_core_api_info)
~> If you run a local node, it exposes an HTTP server on port `20443`. The info endpoint would be `localhost:20443/v2/info`.
## Rosetta support
This API supports [v1.4.6 of the Rosetta specification](https://www.rosetta-api.org/). This open industry standard makes it simple to integrate blockchain deployment and interaction.
-> Find all Data and Construction Rosetta endpoints [here](https://blockstack.github.io/stacks-blockchain-api/#tag/Rosetta)
## Microblocks support
!> API support for microblocks is a work-in-progress. Review the [API documentation][microblocks_api] carefully to
ensure that you are up-to-date on the latest implementation details for microblocks.
The API allows querying the most recently streamed microblocks:
```bash
# for mainnet, remove `.testnet`
curl 'https://stacks-node-api-microblocks.testnet.stacks.co/extended/v1/microblock'
```
```json
{
"limit": 20,
"offset": 0,
"total": 8766,
"results": [
{
"canonical": true,
"microblock_canonical": true,
"microblock_hash": "0xe6897aab881208185e3fb6ba58d9d9e35c43c68f13fbb892b20cebd39ac69567",
"microblock_sequence": 0,
"microblock_parent_hash": "0xe0d1e8d216a77526ae2ce40294fc77038798a179a6532bb8980d3c2183f58de6",
"parent_index_block_hash": "0x178cd9a37bf38f6b85d9f18e65588e60782753b1463ae080fb9865938b0898ea",
"block_height": 14461,
"parent_block_height": 14460,
"parent_block_hash": "0xe0d1e8d216a77526ae2ce40294fc77038798a179a6532bb8980d3c2183f58de6",
"block_hash": "0x17ceb3da5f36aab351d6b14f5aa77f85bb6b800b954b2f24c564579f80116d99",
"txs": ["0x0622e096dec7e2f6e8f7d95f732e04d238b7381aea8d0aecffae026c53e73e05"]
}
]
}
```
## Nonce handling
In order to prevent stuck transactions, you must track the next available nonce for principals issuing transactions. The
API provides an endpoint to make nonce handling simpler:
```bash
# for mainnet, remove `.testnet`
# replace <principal> with your STX address
curl 'https://stacks-node-api-microblocks.testnet.stacks.co/extended/v1/address/<principal>/nonces'
```
```json
{
  "last_executed_tx_nonce": 5893,
  "last_mempool_tx_nonce": null,
  "possible_next_nonce": 5894,
  "detected_missing_nonces": []
}
```
You can use the `possible_next_nonce` property as the nonce for your next transaction.
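As an illustration, a small helper (hypothetical, not part of any Stacks library) could pick the next nonce from this response, preferring any missing nonces the API detected:

```javascript
// Hypothetical helper: choose the nonce for the next transaction from the
// `/extended/v1/address/<principal>/nonces` response shown above.
function nextNonce(nonces) {
  // If the API detected gaps, fill the lowest missing nonce first so that
  // earlier transactions are not stuck behind this one.
  if (nonces.detected_missing_nonces.length > 0) {
    return Math.min(...nonces.detected_missing_nonces);
  }
  return nonces.possible_next_nonce;
}
```

With the sample response above, this returns `5894`.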
## Running an API server
While Hiro provides a hosted API server of the Stacks Blockchain API, anyone can spin up their own version. Please follow the instructions in this guide to start a Docker container with the API service running:
[@page-reference | inline]
| /understand-stacks/local-development
-> Once started, the API will be available on `localhost:3999`
[microblocks_api]: https://stacks-blockchain-api-git-feat-microblocks-blockstack.vercel.app/#tag/Microblocks

9 src/pages/understand-stacks/technical-specs.md

@ -27,7 +27,7 @@ description: Summary of technical specifications of Stacks 2.0
- Initial mining bonus: This is a special case of the above to incentivize early miners. Coinbase for all burnchain blocks between the first burn block height (to be chosen by independent miners as part of the Stacks 2.0 launch) and the first sortition winner accumulates and is distributed to miners over a fixed window (to be determined). For instance, say the first burn block height is 10,000, the first sortition is at block 10,500, and the distribution window is 100 blocks; then the coinbase for the first 500 blocks (10,500 - 10,000) will be distributed evenly to miners who win sortition over the subsequent 100 blocks.
- Reward maturity window: 100 blocks, meaning leaders will earn the coinbase reward 100 blocks after the block they successfully mine.
- Block interval: Stacks blockchain produces blocks at the same rate as the underlying burnchain. For Bitcoin, this is approximately every 10 minutes.
- BTC commitment: Miners must commit atleast 11,000 satoshis (5,500 sats / [UTXO output](https://learnmeabitcoin.com/technical/utxo)); 2 outputs / block) to avoid "dust".
- BTC commitment: Miners must commit at least 11,000 satoshis (5,500 sats / [UTXO output](https://learnmeabitcoin.com/technical/utxo)); 2 outputs / block) to avoid "dust."
- For more details, see [Mining](/understand-stacks/mining).
## Stacking
@ -55,7 +55,7 @@ description: Summary of technical specifications of Stacks 2.0
## Transactions
- [Transaction types](/understand-stacks/transactions#types): coinbase, token-transfer, contract-deploy, contract-call, poison-microblock.
- Transaction types: coinbase, token-transfer, contract-deploy, contract-call, poison-microblock.
- Only standard accounts (not contracts) can pay transaction fees.
- Transaction execution is governed by 3 accounts (may or may not be distinct)
1. _originating account_ is the account that creates, _authorizes_ and sends the transaction
@ -66,8 +66,7 @@ description: Summary of technical specifications of Stacks 2.0
- For sponsored authorization, first a user signs with the originating account and then a sponsor signs with the paying account.
- Mempool limit for concurrent pending transactions is 25 per account
- Pending mempool transactions will be garbage-collected [256 blocks after receipt](https://github.com/blockstack/stacks-blockchain/blob/master/src/core/mempool.rs#L62). With a 10-minute target block time, this equals ~42 hours
- [Learn more about transaction encoding in SIP-005](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md#transaction-encoding) and [and in our encoding documentation](/understand-stacks/transactions#encoding)
- [Transaction signing and verification are described in SIP-005](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md#transaction-signing-and-verifying) and [in our documentation](/understand-stacks/transactions#signature-and-verification)
- [Learn more about transaction encoding in SIP-005](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md#transaction-encoding)
- [Transaction signing and verification are described in SIP-005](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md#transaction-signing-and-verifying)
- All transactions impacting account balance are atomic: a transfer operation cannot increment one account’s balance without decrementing another’s. However, transactions that perform multiple account actions (for example, transferring from multiple accounts) may partially complete.
- Transactions can include a memo string (max 34 bytes)
- Further reading: [Transactions](/understand-stacks/transactions)

9 src/pages/understand-stacks/testnet.md

@ -11,13 +11,6 @@ images:
The testnet is a separate blockchain from the Stacks mainnet, analogous to a staging environment. It's a network used by developers to test their apps, smart contracts, or changes to the protocol in a production-like environment.
It produces blocks at roughly the same rate as mainnet; about 1 block every 10 minutes on average. The Stacks testnet is rarely reset.
## Testnet nodes
If you would like to run your own testnet node, please follow these steps:
[@page-reference | inline]
| /understand-stacks/running-testnet-node
## Testnet API
The hosted [Stacks Blockchain API](/understand-stacks/stacks-blockchain-api) for the testnet is available at this base URL:
@ -30,6 +23,6 @@ https://stacks-node-api.testnet.stacks.co/
The testnet faucet provides you with free Stacks Token (STX) to test with. These are not the same as STX on mainnet and have no value. You can get STX from the faucet on the [Stacks Explorer Sandbox](https://explorer.stacks.co/sandbox/faucet?chain=testnet), or using the [API](https://blockstack.github.io/stacks-blockchain-api/#tag/Faucets).
To get STX tokens from within the Explorer Sandbox, navigate to the "Faucet" tab and click on "Request STX" button. If you would like to get enough STX tokens to try out [Stacking](/understand-stacks/stacking), you should click on "I want to stack".
To get STX tokens from within the Explorer Sandbox, navigate to the "Faucet" tab and click the "Request STX" button. If you would like to get enough STX tokens to try out [Stacking](/understand-stacks/stacking), click "I want to stack."
> The Explorer Sandbox requires you to log in with a Secret Key

564 src/pages/understand-stacks/transactions.md

@ -11,13 +11,6 @@ images:
Transactions are the fundamental unit of execution in the Stacks blockchain. Each transaction originates from a [Stacks 2.0 account](/understand-stacks/accounts) and is retained in the Stacks blockchain history for eternity. This guide helps you understand Stacks 2.0 transactions.
If you want to jump right in and broadcast your first transaction, try this tutorial:
[@page-reference | inline]
| /understand-stacks/sending-tokens
-> The information on this page is based on a design proposal. You can find more conceptual details in this document: [SIP 005: Blocks, Transaction, Accounts](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md).
## Lifecycle
Transactions go through several phases before they are finally confirmed, and available for all, on the Stacks 2.0 network.
@ -45,563 +38,6 @@ The Stacks 2.0 supports a set of different transaction types:
| Contract call | `contract_call` | Contract call for a public, non read-only function |
| Poison Microblock | `poison_microblock` | Punish leaders who intentionally equivocate about the microblocks they package |
-> The current [Naming service](/naming-services/overview) is unrelated to Stacks 2.0 and there is no naming-specific transaction type. A replacement for the functionality will be implemented as a smart contract.
A sample of each transaction type can be found in the [Stacks Blockchain API response definition for transactions](https://blockstack.github.io/stacks-blockchain-api/#operation/get_transaction_by_id).
~> Read-only contract calls do **not** require transactions. Read more about them in the [network guide](/understand-stacks/network#read-only-function-calls).
## Anchor mode
Transactions can be mined either in an anchor block or in a [microblock](/understand-stacks/microblocks). If microblocks
are selected, the transaction can be confirmed with a lower latency than the anchor block time.
The anchor mode enum has three options:
- `OnChainOnly`: The transaction MUST be included in an anchored block
- `OffChainOnly`: The transaction MUST be included in a microblock
- `Any`: The leader can choose where to include the transaction
Here is an example where the transaction must be included in a microblock:
```js
import { AnchorMode, makeSTXTokenTransfer } from '@stacks/transactions';
import { StacksTestnet, StacksMainnet } from '@stacks/network';
const BigNum = require('bn.js');
const txOptions = {
  recipient: 'SP3FGQ8Z7JY9BWYZ5WM53E0M9NK7WHJF0691NZ159',
  amount: new BigNum(12345),
  senderKey: 'b244296d5907de9864c0b0d51f98a13c52890be0404e83f273144cd5b9960eed01',
  network: new StacksTestnet(), // for mainnet, use `StacksMainnet()`
  anchorMode: AnchorMode.OffChainOnly, // must be included in a microblock
};
const transaction = await makeSTXTokenTransfer(txOptions);
```
## Post-conditions
Transaction post-conditions are a feature meant to limit the damage that malicious smart contract developers and smart contract bugs can do to a user's assets. Post-conditions are executed whenever a contract is instantiated or a public method of an existing contract is executed. Whenever a post-condition fails, the transaction is forced to abort.
Post-conditions are meant to be added by the user (or by the user's wallet software) at the moment they sign a transaction. For example, a user may append a post-condition saying that upon successful execution, their account's Stacks (STX) balance should have decreased by no more than 1 STX. If this is not the case, then the transaction would abort and the account would only pay the transaction fee of processing it.
### Attributes
Each transaction includes a field that describes zero or more post-conditions that must all be true when the transaction finishes running. A post-condition describes only properties of the owner of the asset before the transaction happened. For a transfer transaction, the post-condition is about the sender; for a burn transaction, it is about the previous owner. A post-condition includes the following information:
| **Attribute** | **Sample** | **Description** |
| ---------------------------------------------- | ------------------------------------------- | ------------------------------------------------------------------------------------------------ |
| [Principal](/write-smart-contracts/principals) | `SP2ZD731ANQZT6J4K3F5N8A40ZXWXC1XFXHVVQFKE` | original owner of the asset, can be a Stacks address or a contract |
| Asset id | `STX` | Asset to apply conditions to (could be Stacks, fungible, or non-fungible tokens) |
| Comparator | `>=` | Compare operation to be applied (could define "how much" or "whether or not the asset is owned") |
| Literal | `1000000` | Integer or boolean value used to compare instances of the asset against via the condition |
### Evaluation modes
The Stacks blockchain supports an `allow` or `deny` mode for evaluating post-conditions:
- Allow: other asset transfers not covered by the post-conditions are permitted
- Deny: no other asset transfers are permitted besides those named in the post-conditions
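As a simplified illustration (a toy model, not the stacks.js API), the two modes differ only in how they treat transfers of assets not named by any post-condition:

```javascript
// Toy model of post-condition evaluation modes (illustration only):
// `covered` lists the assets named by post-conditions; `transfers` lists the
// assets the transaction actually moved.
function transfersPermitted(mode, covered, transfers) {
  // In `allow` mode, transfers of uncovered assets are permitted;
  // in `deny` mode, any uncovered transfer causes the transaction to abort.
  return transfers.every(asset => covered.includes(asset) || mode === 'allow');
}
```

For example, a transaction that also moves an NFT no post-condition mentions is permitted in `allow` mode but aborts in `deny` mode.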
## Authorization
Transactions can be authorized in two ways: _standard_ and _sponsored_. The authorization determines whether or not the originating account is also the paying account. In a transaction with a standard authorization, the origin and paying accounts are the same. In a transaction with a sponsored authorization, the origin and paying accounts are distinct, and both accounts must sign the transaction for it to be valid (first the origin, then the sponsor).
**Sponsored transactions** enable developers and/or infrastructure operators to pay for users to call into their smart contracts, even if users do not have the Stacks (STX) to do so.
The signing flow for sponsored transactions would be to have the user first sign the transaction with their origin account with the intent of it being sponsored (that is, the user must explicitly allow a sponsor to sign), and then have the sponsor sign with their paying account to pay for the user's transaction fee.
## Encoding
A transaction includes the following information. Multiple-byte fields are encoded as big-endian.
| **Type** | **Description** |
| --------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| Version number | Network version. `0x80` for testnet, `0x0` for mainnet |
| Chain ID | Chain instance ID. `0x80000000` for testnet, `0x00000001` for mainnet |
| Authorization | Type of authorization (`0x04` for standard, `0x05` for sponsored) and [spending conditions](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md#transaction-authorization) |
| Post-conditions | List of post-conditions, each including a [type ID and variable-length condition body](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md#transaction-post-conditions-1) |
| Payload | Transaction type and variable-length [payload](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md#transaction-payloads-1) |
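As a sketch, the leading fields from the table can be read directly off a raw (hex) transaction with Node's built-in `Buffer` (the `parseTxHeader` helper is hypothetical; the full layout is specified in SIP-005):

```javascript
// Hypothetical helper: read the leading header fields of a raw transaction.
// Byte 0 is the version, bytes 1-4 the chain ID (big-endian), byte 5 the
// authorization type.
function parseTxHeader(rawHex) {
  const buf = Buffer.from(rawHex, 'hex');
  return {
    version: buf[0], // 0x80 for testnet, 0x00 for mainnet
    chainId: buf.readUInt32BE(1), // 0x80000000 testnet, 0x00000001 mainnet
    authType: buf[5], // 0x04 standard, 0x05 sponsored
  };
}
```

Applied to the raw testnet transaction shown later in this guide (starting `8080000000040...`), this yields version `0x80`, chain ID `0x80000000`, and a standard (`0x04`) authorization.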
## Construction
The easiest way to construct well-formed transactions is by [using the Stacks Transactions JS library](https://github.com/blockstack/stacks.js/tree/master/packages/transactions#post-conditions). You can construct the following transaction types:
- Stacks token transfer
- Smart contract deploy
- Smart contract function call
When constructing transactions, it is required to set the network the transaction is intended for: mainnet or testnet (see the [testnet guide](/understand-stacks/testnet)).
-> Transactions can be constructed and serialized offline. However, it is required to know the nonce and estimated fees ahead of time. Once internet access is available, the transaction can be broadcasted to the network. Keep in mind that the nonce and fee might change during offline activity, making the transaction invalid.
### Stacks Token transfer
```js
import { makeSTXTokenTransfer } from '@stacks/transactions';
import { StacksTestnet, StacksMainnet } from '@stacks/network';
const BigNum = require('bn.js');
const txOptions = {
  recipient: 'SP3FGQ8Z7JY9BWYZ5WM53E0M9NK7WHJF0691NZ159',
  amount: new BigNum(12345),
  senderKey: 'b244296d5907de9864c0b0d51f98a13c52890be0404e83f273144cd5b9960eed01',
  network: new StacksTestnet(), // for mainnet, use `StacksMainnet()`
  memo: 'test memo',
  nonce: new BigNum(0), // set a nonce manually if you don't want builder to fetch from a Stacks node
  fee: new BigNum(200), // set a tx fee if you don't want the builder to estimate
};
const transaction = await makeSTXTokenTransfer(txOptions);
```
-> Read more about [nonces](/understand-stacks/network#nonces) in the network guide
### Smart contract deployment
```js
import { makeContractDeploy } from '@stacks/transactions';
import { StacksTestnet, StacksMainnet } from '@stacks/network';
const fs = require('fs');
const BigNum = require('bn.js');
const txOptions = {
  contractName: 'contract_name',
  codeBody: fs.readFileSync('/path/to/contract.clar').toString(),
  senderKey: 'b244296d5907de9864c0b0d51f98a13c52890be0404e83f273144cd5b9960eed01',
  network: new StacksTestnet(), // for mainnet, use `StacksMainnet()`
};
const transaction = await makeContractDeploy(txOptions);
```
### Smart contract function call
```js
import { makeContractCall, bufferCVFromString } from '@stacks/transactions';
import { StacksTestnet, StacksMainnet } from '@stacks/network';
const BigNum = require('bn.js');
const txOptions = {
  contractAddress: 'SPBMRFRPPGCDE3F384WCJPK8PQJGZ8K9QKK7F59X',
  contractName: 'contract_name',
  functionName: 'contract_function',
  functionArgs: [bufferCVFromString('foo')],
  senderKey: 'b244296d5907de9864c0b0d51f98a13c52890be0404e83f273144cd5b9960eed01',
  // attempt to fetch this contract's interface and validate the provided functionArgs
  validateWithAbi: true,
  network: new StacksTestnet(), // for mainnet, use `StacksMainnet()`
};
const transaction = await makeContractCall(txOptions);
```
### Clarity value types
Building transactions that call functions in deployed clarity contracts requires you to construct valid Clarity Values to pass to the function as arguments. The [Clarity type system](https://github.com/stacksgov/sips/blob/main/sips/sip-002/sip-002-smart-contract-language.md#clarity-type-system) contains the following types:
| Type | Declaration | Description |
| ---------------- | ------------------------------------------------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Tuple | `(tuple (key-name-0 key-type-0) ...)` | Typed tuple with named fields |
| List | `(list max-len entry-type)` | List of maximum length max-len, with entries of type entry-type |
| Response | `(response ok-type err-type)` | Object used by public functions to commit their changes or abort. May be returned or used by other functions as well, however, only public functions have the commit/abort behavior |
| Optional | `(optional some-type)` | Option type for objects that can either be (some value) or none |
| Buffer | `(buff max-len)` | Byte buffer with maximum length `max-len` |
| Principal | `principal` | Object representing a principal (whether a contract principal or standard principal) |
| Boolean | `bool` | Boolean value ('true or 'false) |
| Signed Integer | `int` | Signed 128-bit integer |
| Unsigned Integer | `uint` | Unsigned 128-bit integer |
| ASCII String | `(define-data-var my-str (string-ascii 11) "hello world")` | String value encoded in ASCII |
| UTF-8 String | `(define-data-var my-str (string-utf8 7) u"hello \u{1234}")` | String value encoded in UTF-8 |
The Stacks Transactions JS library contains TypeScript types and classes that map to the Clarity types, in order to make it easy to construct well-typed Clarity values in JavaScript. These types all extend the abstract class `ClarityValue`.
Here are samples for Clarity value constructions using this library:
```js
import {
  bufferCV,
  contractPrincipalCV,
  falseCV,
  intCV,
  listCV,
  noneCV,
  responseErrorCV,
  responseOkCV,
  someCV,
  standardPrincipalCV,
  trueCV,
  tupleCV,
  uintCV,
} from '@stacks/transactions';

// construct boolean clarity values
const t = trueCV();
const f = falseCV();

// construct optional clarity values
const nothing = noneCV();
const something = someCV(t);

// construct a buffer clarity value from an existing Buffer
const buffer = Buffer.from('foo');
const bufCV = bufferCV(buffer);

// construct signed and unsigned integer clarity values
const i = intCV(-10);
const u = uintCV(10);

// construct principal clarity values
const address = 'SP2JXKMSH007NPYAQHKJPQMAQYAD90NQGTVJVQ02B';
const contractName = 'contract-name';
const spCV = standardPrincipalCV(address);
const cpCV = contractPrincipalCV(address, contractName);

// construct response clarity values
const errCV = responseErrorCV(trueCV());
const okCV = responseOkCV(falseCV());

// construct tuple clarity values
const tupCV = tupleCV({
  a: intCV(1),
  b: trueCV(),
  c: falseCV(),
});

// construct list clarity values
const l = listCV([trueCV(), falseCV()]);
```
If you develop in TypeScript, the type checker can help prevent you from creating wrongly typed Clarity values. For example, the following code won't compile, since Clarity lists are homogeneous, meaning they can only contain values of a single type. It is important to include the type variable `BooleanCV` in this example; otherwise the TypeScript type checker won't know which type the list holds and won't enforce homogeneity.
```js
const l = listCV<BooleanCV>([trueCV(), intCV(1)]);
```
### Setting post-conditions
The Stacks Transactions JS library supports the construction of post conditions.
Here is an example of a post-condition that ensures the account's balance will decrease by no more than 1 STX:
```js
import {
  FungibleConditionCode,
  makeContractCall,
  makeStandardSTXPostCondition,
} from '@stacks/transactions';
const BigNum = require('bn.js');

const account = 'SP2ZD731ANQZT6J4K3F5N8A40ZXWXC1XFXHVVQFKE';
const comparator = FungibleConditionCode.GreaterEqual;
// assuming the Stacks (STX) balance before the transaction is 12346
const amount = new BigNum(12345);

const standardSTXPostCondition = makeStandardSTXPostCondition(
  account,
  comparator,
  amount
);

const txOptions = {
  ..., // other transaction options
  postConditions: [standardSTXPostCondition]
}

const transaction = await makeContractCall(txOptions);
```
## Serialization
A well-formed transaction construct is encoded following the wire format [specified in SIP-005](https://github.com/stacksgov/sips/blob/main/sips/sip-005/sip-005-blocks-and-transactions.md#transaction-encoding). Encoding results in a variable-sized byte array.
In order to broadcast transactions to and between nodes on the network, the encoded data is represented as a hexadecimal string (also called the **raw format**).
To support an API-friendly and human-readable representation, the [Stacks Blockchain API](/understand-stacks/stacks-blockchain-api) converts transactions into a JSON format.
=> [The Stacks Transactions JS library](https://github.com/blockstack/stacks.js) supports serialization of transactions.
### Raw format
Broadcasting transactions directly to the Stacks Blockchain API or Node RPC API requires the transaction to be serialized and in hexadecimal representation.
```js
// to see the raw serialized tx
const serializedTx = transaction.serialize().toString('hex');
console.log(serializedTx);
```
The method above will return the following string:
```bash
8080000000040015c31b8c1c11c515e244b75806bac48d1399c77500000000000000000000000000000000000127e88a68dce8689fc94ff4c186bf8966f8d544c5129ff84d95a2459b5e8e7c39430388f6c8f85cce8c9ce5e6ec1e157116ca4a67d65ab53768b25d5fb5831939030200000000000516df0ba3e79792be7be5e50a370289accfc8c9e03200000000000f424068656c6c6f20776f726c640000000000000000000000000000000000000000000000
```
-> Transaction IDs are generated by hashing the raw transaction with [sha512/256](https://eprint.iacr.org/2010/548.pdf)
### JSON format
When calling the Stacks Blockchain API or Node RPC API, returned transactions are serialized in a JSON format. Here is a token transfer transaction:
```js
{
  "tx_id": "0x77cb1bf0804f09ad24b4c494a6c00d5b10bb0afbb94a0d646fa9640eff338e37",
  "nonce": 5893,
  "fee_rate": "180",
  "sender_address": "STB44HYPYAT2BB2QE513NSP81HTMYWBJP02HPGK6",
  "sponsored": false,
  "post_condition_mode": "deny",
  "post_conditions": [],
  "anchor_mode": "any",
  "block_hash": "0xf1e54a3acd04232f1362c09d5096b095363158348303396ea5fc5092e1d8788f",
  "parent_block_hash": "0x3de356eb5afa5d7b781f6a925d31d69d218b772ec995930b4e15d92bd15443f9",
  "block_height": 13984,
  "burn_block_time": 1622678407,
  "burn_block_time_iso": "2021-06-03T00:00:07.000Z",
  "canonical": true,
  "tx_index": 2,
  "tx_status": "success",
  "tx_result": {
    "hex": "0x0703",
    "repr": "(ok true)"
  },
  "microblock_hash": "",
  "microblock_sequence": 2147483647,
  "microblock_canonical": true,
  "event_count": 1,
  "events": [],
  "tx_type": "token_transfer",
  "token_transfer": {
    "recipient_address": "STZ4C5RT4WH4JGRQA5E0ZF5PPSQCVY1WRB6E2CGW",
    "amount": "500000000",
    "memo": "0x46617563657400000000000000000000000000000000000000000000000000000000"
  }
}
```
### Deserializing
Serialized, raw transactions can be deserialized without access to the internet using [the Stacks Transactions JS library](https://github.com/blockstack/stacks.js/tree/master/packages/transactions):
```js
import { BufferReader, deserializeTransaction } from '@stacks/transactions';
// receive raw transaction
const serializedTx = '808000000...';
const bufferReader = new BufferReader(Buffer.from(serializedTx, 'hex'));
const deserializedTx = deserializeTransaction(bufferReader);
// print memo
console.log(deserializedTx.payload.memo.content);
```
## Signature and Verification
Every transaction contains verifiable signatures that certify its authenticity. These signatures are generated by signing the transaction hash with the origin's private key. The Elliptic Curve Digital Signature Algorithm (ECDSA) is used for signing, with the curve set to secp256k1. The internal structure that encapsulates the signature is the spending condition. Spending conditions include several parameters including the public key hash, nonce, fee rate and the recoverable ECDSA signature.
When constructing a transaction using the JS library, you can supply the private key and signing will be completed automatically. If you would like to sign the transaction manually, use the `TransactionSigner` class.
Below are the steps taken to generate the signature internal to the transaction library.
### Signing steps
Step 1: Generate a transaction hash for signing. This is the SHA512/256 digest of the serialized transaction before a signature is added.
Step 2: Append the authorization type, fee amount and nonce to the transaction hash to create the signature hash.
Step 3: Generate the SHA512/256 hash of the resulting string from the previous step.
Step 4: Sign the hash using ECDSA and the origin private key.
Step 5: Add the resulting recoverable ECDSA signature to the transaction spending condition.
### Single signature transaction
As the name implies, a single-signature transaction contains one signature from the origin account that authorizes a token spend or smart contract deploy/execution.
### Multi-signature transaction
For multi-sig accounts, multiple keys must sign the transaction for it to be valid.
### Sponsored transaction
A sponsored transaction is one where a second signer sets and pays the transaction fees. The origin must sign the transaction first before the sponsor signs.
## Broadcast
With a serialized transaction in the [raw format](#raw-format), it can be broadcasted to the network using the [`POST /v2/transactions`](https://blockstack.github.io/stacks-blockchain-api/#operation/post_core_node_transactions) endpoint:
```bash
# for mainnet, replace `testnet` with `mainnet`
curl --location --request POST 'https://stacks-node-api.testnet.stacks.co/v2/transactions' \
--header 'Content-Type: application/octet-stream' \
--data-raw '<tx_raw_format>'
```
The API will respond with a `HTTP 200 - OK` if the transaction was successfully added to the mempool.
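The same request can be issued from JavaScript. Here is a minimal sketch (the `broadcastRequest` helper is hypothetical; it assumes a runtime with the built-in `fetch` API, such as Node 18+):

```javascript
// Hypothetical helper: build the broadcast request for the endpoint above.
// `serializedTx` is the transaction in raw (hex) format.
function broadcastRequest(serializedTx, network = 'testnet') {
  return {
    url: `https://stacks-node-api.${network}.stacks.co/v2/transactions`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/octet-stream' },
      body: Buffer.from(serializedTx, 'hex'), // send raw bytes, not hex text
    },
  };
}

// Usage:
// const { url, options } = broadcastRequest(serializedTx);
// const txid = await fetch(url, options).then(res => res.json());
```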
There is no explicit time constraint between constructing a valid signed transaction and broadcasting it. There are, however, some constraints to be aware of. The following reasons can render a transaction invalid after some period:
- Token transfer: Nonce changed in-between construction and broadcast
- Contract call or deploy: Block height is evaluated (with [`at-block`](/references/language-functions#at-block)) and changed in-between construction and broadcast
## Mempool
Once a transaction has been successfully broadcast to the network, the transaction is added to the mempool of the node
that received the broadcast. From the [Bitcoin wiki][]: "a node's memory pool contains all 0-confirmation transactions
across the entire network that that particular node knows about." So, the set of transactions in the mempool might be
different for each node in the network. For example, when you query the mempool endpoints on
`stacks-node-api.mainnet.stacks.co`, the response reflects the set of unconfirmed transactions known to the nodes that
service that API.
Miners can employ different heuristics and strategies for deciding which transactions to admit into the mempool and
which transactions to include from the mempool when mining a block. Some transactions may be rejected outright (for
example, if there are insufficient funds at an address) while others might be accepted into the mempool, but not mined
into a block indefinitely (for example if fees are too low). Transactions that are admitted in the mempool but not yet
mined are said to be "pending." The current implementation of [stacks-blockchain][] discards pending mempool
transactions after [256 blocks][].
### Best practices
- **Nonce:** it's crucial that transactions use the correct nonce. Using an incorrect nonce makes it less likely that
the transaction is mined in a timely manner. To determine the correct nonce, query the [`accounts`][] endpoint of
the node you intend to broadcast your transaction to. The value of the `nonce` field of the response is the next nonce
that the node expects to consume for that account. Nonce starts at `0`, so the first transaction from an account should
be set to `nonce=0`.
- **Transaction chaining:** even when using the correct nonce, transactions might arrive at a node out-of-order. For
instance, a transaction with `nonce=1` may arrive in the mempool before the `nonce=0` transaction. Stacks nodes admit
such out-of-order transactions in the mempool, but only up to a limit ([25 in the current implementation][]). So, you
should limit any chain of unconfirmed transactions from a single account to fewer than 25. Making this limit higher has
downsides, discussed in [this issue](https://github.com/blockstack/stacks-blockchain/issues/2384). If you need to send
more than 25 transactions per block, consider using multiple accounts or a smart-contract based approach. See
[this tool](https://www.npmjs.com/package/@stacks/send-many-stx-cli), for example, that allows up to 200 token
transfers in a single transaction.
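To stay under this limit while sending many transfers from one account, one simple approach (a hypothetical sketch, not part of any Stacks library) is to batch the work and broadcast one batch at a time:

```javascript
// Hypothetical helper: split pending transfers into batches no larger than
// the mempool's per-account limit of 25 unconfirmed transactions.
function chunkByMempoolLimit(transfers, limit = 25) {
  const batches = [];
  for (let i = 0; i < transfers.length; i += limit) {
    batches.push(transfers.slice(i, i + limit));
  }
  return batches;
}
```

Broadcast one batch, wait for its transactions to confirm, then send the next batch with the updated nonce.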
## Querying
Transactions on the Stacks 2.0 network can be queried using the [Stacks Blockchain API](/understand-stacks/stacks-blockchain-api). The API exposes two interfaces, a RESTful JSON API and a WebSockets API.
For convenience, a Postman Collection was created and published: [![Run in Postman](https://run.pstmn.io/button.svg)](https://app.getpostman.com/run-collection/614feab5c108d292bffa)
-> Note: The API can be easily consumed using a generated [JS client library](https://blockstack.github.io/stacks-blockchain-api/client/index.html). The generator uses an OpenAPI specification and supports other languages and frameworks.
@include "stacks-api-pagination.md"
### Get recent transactions
Recent transactions can be obtained through the [`GET /extended/v1/tx`](https://blockstack.github.io/stacks-blockchain-api/#operation/get_transaction_list) endpoint:
```bash
# for mainnet, replace `testnet` with `mainnet`
curl 'https://stacks-node-api.testnet.stacks.co/extended/v1/tx'
```
Sample response:
```js
{
  "limit": 10,
  "offset": 0,
  "total": 101922,
  "results": [
    {
      "tx_id": "0x5e9f3933e358df6a73fec0d47ce3e1062c20812c129f5294e6f37a8d27c051d9",
      "tx_status": "success",
      "tx_type": "coinbase",
      "fee_rate": "0",
      "sender_address": "ST3WCQ6S0DFT7YHF53M8JPKGDS1N1GSSR91677XF1",
      "sponsored": false,
      "post_condition_mode": "deny",
      "block_hash": "0x58412b50266debd0c35b1a20348ad9c0f17e5525fb155a97033256c83c9e2491",
      "block_height": 3231,
      "burn_block_time": 1594230455,
      "canonical": true,
      "tx_index": 0,
      "coinbase_payload": {
        "data": "0x0000000000000000000000000000000000000000000000000000000000000000"
      }
    }
  ]
}
```
### Get mempool transactions
Mempool (registered, but not processed) transactions can be obtained using the [`GET /extended/v1/tx/mempool`](https://blockstack.github.io/stacks-blockchain-api/#operation/get_mempool_transaction_list) endpoint:
```bash
# for mainnet, replace `testnet` with `mainnet`
curl 'https://stacks-node-api.testnet.stacks.co/extended/v1/tx/mempool'
```
Sample response:
```js
{
  "limit": 96,
  "offset": 0,
  "total": 5,
  "results": [
    {
      "tx_id": "0xb31df5a363dad31723324cb5e0eefa04d491519fd30827a521cbc830114aa50c",
      "tx_status": "pending",
      "tx_type": "token_transfer",
      "receipt_time": 1598288370,
      "receipt_time_iso": "2020-08-24T16:59:30.000Z",
      "fee_rate": "180",
      "sender_address": "STB44HYPYAT2BB2QE513NSP81HTMYWBJP02HPGK6",
      "sponsored": false,
      "post_condition_mode": "deny",
      "token_transfer": {
        "recipient_address": "ST1GY25DM8RZV4X15X07THRZ2C5NMWPGQWKFGV87F",
        "amount": "500000",
        "memo": "0x46617563657400000000000000000000000000000000000000000000000000000000"
      }
    }
  ]
}
```
-> The `memo` field is represented as a hexadecimal string of a byte buffer
#### Filter by type
Recent transactions can be filtered by [transaction type](/understand-stacks/transactions#types) using the `type` query parameter:
```bash
# for mainnet, replace `testnet` with `mainnet`
curl 'https://stacks-node-api.testnet.stacks.co/extended/v1/tx?type=contract_call'
```
### Get transaction by ID
A specific transaction can be obtained using the [`GET /extended/v1/tx/<tx_id>`](https://blockstack.github.io/stacks-blockchain-api/#operation/get_transaction_by_id) endpoint:
```bash
# for mainnet, replace `testnet` with `mainnet`
curl 'https://stacks-node-api.testnet.stacks.co/extended/v1/tx/<tx_id>'
```
Sample response:
```js
{
  "tx_id": "0x5e9f3933e358df6a73fec0d47ce3e1062c20812c129f5294e6f37a8d27c051d9",
  "tx_status": "success",
  "tx_type": "coinbase",
  "fee_rate": "0",
  "sender_address": "ST3WCQ6S0DFT7YHF53M8JPKGDS1N1GSSR91677XF1",
  "sponsored": false,
  "post_condition_mode": "deny",
  "block_hash": "0x58412b50266debd0c35b1a20348ad9c0f17e5525fb155a97033256c83c9e2491",
  "block_height": 3231,
  "burn_block_time": 1594230455,
  "canonical": true,
  "tx_index": 0,
  "coinbase_payload": {
    "data": "0x0000000000000000000000000000000000000000000000000000000000000000"
  }
}
```
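A common pattern is to poll this endpoint until the transaction leaves the `pending` state. A minimal sketch (the fetch function is injected so the polling logic stays testable; the function names are illustrative, not part of any SDK):

```python
import time

def wait_for_confirmation(fetch_tx, tx_id: str, interval: float = 10.0,
                          max_polls: int = 30) -> dict:
    """Poll `fetch_tx(tx_id)` -- a callable returning the parsed JSON of
    GET /extended/v1/tx/<tx_id> -- until tx_status is no longer 'pending'."""
    for _ in range(max_polls):
        tx = fetch_tx(tx_id)
        if tx["tx_status"] != "pending":
            return tx
        time.sleep(interval)
    raise TimeoutError(f"{tx_id} still pending after {max_polls} polls")

# Example with a stubbed fetcher that reports success on the third poll:
responses = iter([{"tx_status": "pending"}, {"tx_status": "pending"},
                  {"tx_status": "success"}])
result = wait_for_confirmation(lambda _tx_id: next(responses), "0xb31d...",
                               interval=0)
print(result["tx_status"])  # success
</imports>
</imports>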
## Garbage Collection
Broadcast transactions stay in the mempool for 256 blocks (~42 hours). If a transaction is not confirmed within that time, it is removed from the mempool.
!> Most transactions stay in the mempool due to nonce issues. If you see a transaction pending for an unusually long time, review the account's nonce and the transaction's nonce.
If a transaction is removed from the mempool, the transaction was not processed and no changes were made to the blockchain state.
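The ~42 hour figure follows from a target block time of roughly 10 minutes. A back-of-the-envelope sketch (actual block times vary):

```python
# Mempool time-to-live in blocks, converted to hours assuming the
# approximate 10-minute Stacks/Bitcoin block time.
MEMPOOL_TTL_BLOCKS = 256
TARGET_BLOCK_MINUTES = 10

hours = MEMPOOL_TTL_BLOCKS * TARGET_BLOCK_MINUTES / 60
print(f"{hours:.1f}")  # 42.7
```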

src/pages/write-smart-contracts/billboard-tutorial.md

---
title: Billboard
description: Learn how to store data on-chain and transfer STX tokens with Clarity
duration: 30 minutes
experience: intermediate
tags:
- tutorial
images:
large: /images/pages/billboard.svg
---
## Introduction
This tutorial demonstrates how to transfer STX tokens and handle errors in Clarity by building a simple on-chain message
store. Additionally, this tutorial provides a simple overview of testing a smart contract. This tutorial builds on
concepts introduced in the [counter tutorial][], and uses [Clarinet][] to develop and test the smart contract.
In this tutorial you will:
- Set up a development environment with Clarinet
- Define codes for error handling
- Add a data storage variable with functions to get and set the variable
- Add a STX transfer function within the variable setter
- Develop a unit test to verify the contract works as expected
The [final code for this tutorial][] is available in the Clarinet repository.
## Prerequisites
For this tutorial, you should have a local installation of [Clarinet][]. Refer to [Installing Clarinet][] for
instructions on how to set up your local environment. You should also have a text editor or IDE to edit the Clarity
smart contract.
For developing the unit test, it's recommended that you have an IDE with Typescript support, such as
[Visual Studio Code][].
If you are using Visual Studio Code, you may want to install the [Clarity Visual Studio Code plugin][].
## Step 1: set up the project
With Clarinet installed locally, open a new terminal window and create a new Clarinet project. Add a smart contract and
an empty test file to the project:
```sh
clarinet new billboard-clarity && cd billboard-clarity
clarinet contract new billboard
```
These commands create the necessary project structure and contracts for completing this tutorial. Remember that at
any point during this tutorial you can use `clarinet check` to check the validity of your Clarity syntax.
## Step 2: create message storage
Open the `contracts/billboard.clar` file in a text editor or IDE. For this tutorial, you'll use the boilerplate comments
to structure your contract for easy readability.
In this step, you'll add a variable to the contract that stores the billboard message, and define a getter function to
read the value of the variable.
Under the `data maps and vars` comment, define the `billboard-message` variable. Remember that you must define the
type of the variable, in this case `string-utf8` to support emojis and extended characters, along with its maximum
length; for this tutorial, use `500` to allow for a longer message. You must also define the initial value of the
variable.
```clarity
;; data vars
(define-data-var billboard-message (string-utf8 500) u"Hello world!")
```
You should also define a read-only getter function that returns the value of the `billboard-message` variable.
```clarity
;; public functions
(define-read-only (get-message)
(var-get billboard-message))
```
These are the required methods for storing and accessing the message on the billboard.
## Step 3: define set message function
Define a method to set the billboard message. Under the public functions, define a `set-message` function. This public
function takes a `string-utf8` with a max length of `500` as the only argument. Note that the type of the argument
matches the type of the `billboard-message` variable. Clarity's type checking ensures that an invalid input to the
function doesn't execute.
```clarity
;; public functions
(define-public (set-message (message (string-utf8 500)))
(ok (var-set billboard-message message))
)
```
The contract is now capable of updating the `billboard-message`.
## Step 4: transfer STX to set message
In this step, you'll modify the `set-message` function to add a cost in STX tokens, that increments by a set amount each
time the message updates.
First, you should define a variable to track the price of updating the billboard. This value is in micro-STX. Under the
`data maps and vars` heading, add a new variable `price` with type `uint` and an initial value of `u100`. The initial
cost to update the billboard is 100 micro-STX or 0.0001 STX.
```clarity
;; data vars
(define-data-var price uint u100)
```
You should also define a read-only getter function that returns the value of the `price` variable. Read-only functions in
Clarity are public, and should be grouped with the other public functions in the contract.
```clarity
;; public functions
(define-read-only (get-price)
(var-get price)
)
```
It's a best practice to assign error codes to descriptive constants in Clarity smart contracts. This makes the code
easier for readers to understand and makes errors reusable across contract methods. Under the `constants` comment,
define a STX transfer error constant and assign it the value `u0`. There is no standard for error constants in Clarity;
this value is used because it's the first error the contract defines. Error constants should be defined at the top of
the contract, usually preceding data variables.
```clarity
;; error consts
(define-constant ERR_STX_TRANSFER u0)
```
Modify the `set-message` function to transfer the amount of STX represented by the current price of the billboard from
the function caller to the contract wallet address, and then increment the price. The function executes in four steps:
transferring STX from the function caller to the contract, updating the `billboard-message` variable, incrementing the
`price` variable, and returning the new price.
The new `set-message` function uses [`let`][] to define local variables for the function. Two variables are declared,
the `cur-price`, which represents the current price of updating the billboard, and the `new-price`, which represents the
incremented price for updating the billboard.
The function then calls the [`stx-transfer?`][] function to transfer the current price of the contract in STX from the
transaction sender to the contract wallet. This syntax can be confusing: the function call uses the `tx-sender`
variable, which is the principal address of the caller of the function. The second argument to [`stx-transfer?`][] uses
the [`as-contract`][] function to change the context's `tx-sender` value to the principal address that deployed the
contract.
The entire [`stx-transfer?`][] function call is wrapped in the [`unwrap!`][] function, to provide protection from
the transfer failing. The [`unwrap!`][] function executes the first argument, in this case the [`stx-transfer?`][]
function. If the execution returns `(ok ...)`, the [`unwrap!`][] function returns the inner value of the `ok`, otherwise
the function returns the second argument and exits the current control-flow, in this case the `ERR_STX_TRANSFER` error
code.
If the token transfer is successful, the function sets the new `billboard-message` and updates the `price` variable to
`new-price`. Finally, the function returns `(ok new-price)`. It's generally a good practice to have public functions
return `ok` when successfully executed.
-> This function should replace the existing `set-message` function defined previously.
```clarity
(define-public (set-message (message (string-utf8 500)))
(let ((cur-price (var-get price))
(new-price (+ cur-price u10)))
;; pay the contract
(unwrap! (stx-transfer? cur-price tx-sender (as-contract tx-sender)) (err ERR_STX_TRANSFER))
;; update the billboard's message
(var-set billboard-message message)
;; update the price
(var-set price new-price)
;; return the updated price
(ok new-price)
)
)
```
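To see how the cost evolves, here is a sketch of the fee schedule in Python (mirroring the Clarity logic above, not executing it): the price starts at 100 micro-STX and each successful update raises it by 10, so the first update costs 100, the second 110, and two updates cost 210 micro-STX in total.

```python
def billboard_costs(n_updates: int, start: int = 100, step: int = 10) -> list:
    """Micro-STX charged for each successive set-message call, mirroring
    the contract's price variable (starts at u100, +u10 per update)."""
    return [start + step * i for i in range(n_updates)]

costs = billboard_costs(2)
print(costs)       # [100, 110]
print(sum(costs))  # 210
```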
At this point, the final contract should look like this:
```clarity
;; error consts
(define-constant ERR_STX_TRANSFER u0)
;; data vars
(define-data-var billboard-message (string-utf8 500) u"Hello world!")
(define-data-var price uint u100)
;; public functions
(define-read-only (get-price)
(var-get price)
)
(define-read-only (get-message)
(var-get billboard-message)
)
(define-public (set-message (message (string-utf8 500)))
(let ((cur-price (var-get price))
(new-price (+ cur-price u10)))
;; pay the contract
(unwrap! (stx-transfer? cur-price tx-sender (as-contract tx-sender)) (err ERR_STX_TRANSFER))
;; update the billboard's message
(var-set billboard-message message)
;; update the price
(var-set price new-price)
;; return the updated price
(ok new-price)
)
)
```
Use `clarinet check` to ensure that your Clarity code is well-formed and error-free.
## Step 5: write a contract test
At this point, the contract functions as intended, and can be deployed to the blockchain. However, it's good practice
to write automated testing to ensure that the contract functions perform in the expected way. Testing can be valuable
when adding complexity or new functions, as working tests can verify that any changes you make didn't fundamentally
alter the way the functions behave.
Open the `tests/billboard_test.ts` file in your IDE. In this step, you will add a single automated test to exercise the
`set-message` and `get-message` functions of the contract.
Using the Clarinet library, define variables to get a wallet address principal from the Clarinet configuration, and the
balance of that address on the chain.
The functional part of the test is defined using the `chain.mineBlock()` function, which simulates the mining of a
block. Within that function, the test makes four contract calls (`Tx.contractCall()`), two calls to `set-message` and
two calls to `get-message`.
Once the simulated block is mined, the test can make assertions about the chain state. This is accomplished using the
`assertEquals()` function and the `expect` helper functions. In this case, the test asserts that once the simulated
block is mined, the block height equals `2`, and that the block contains exactly `4` receipts (contract calls).
The test can then make assertions about the return values of the contract. The test checks that the result of the
transaction calls to `get-message` match the string values that the calls to `set-message` contain. This covers the
capability of both contract functions.
Finally, the test asserts that STX are transferred from the transaction caller wallet, covering the price updating and
token transfer. The test verifies that the addresses of the wallets match the expected addresses, and that the amount
transferred is the expected amount.
```ts
import { Clarinet, Tx, Chain, Account, types } from 'https://deno.land/x/clarinet@v0.12.0/index.ts';
import { assertEquals } from 'https://deno.land/std@0.90.0/testing/asserts.ts';
Clarinet.test({
name: 'A quick demo on how to assert expectations',
async fn(chain: Chain, accounts: Map<string, Account>) {
let wallet_1 = accounts.get('wallet_1')!;
let assetMaps = chain.getAssetsMaps();
const balance = assetMaps.assets['STX'][wallet_1.address];
let block = chain.mineBlock([
Tx.contractCall('billboard', 'set-message', [types.utf8('testing')], wallet_1.address),
Tx.contractCall('billboard', 'get-message', [], wallet_1.address),
Tx.contractCall('billboard', 'set-message', [types.utf8('testing...')], wallet_1.address),
Tx.contractCall('billboard', 'get-message', [], wallet_1.address),
]);
assertEquals(block.receipts.length, 4);
assertEquals(block.height, 2);
block.receipts[1].result.expectUtf8('testing');
block.receipts[3].result.expectUtf8('testing...');
let [event] = block.receipts[0].events;
let { sender, recipient, amount } = event.stx_transfer_event;
sender.expectPrincipal('ST1J4G6RR643BCG8G8SR6M2D9Z9KXT2NJDRK3FBTK');
recipient.expectPrincipal('ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE.billboard');
amount.expectInt(100);
assetMaps = chain.getAssetsMaps();
assertEquals(assetMaps.assets['STX'][wallet_1.address], balance - 210);
},
});
```
Try running `clarinet test` to see the output of the unit test.
=> You have now learned how to store and update data on chain with a variable, and how to transfer STX tokens from
a contract caller to a new principal address. Additionally, you have learned how to write a unit test for a simple
Clarity contract using Clarinet.
[counter tutorial]: /write-smart-contracts/counter-tutorial
[clarinet]: /write-smart-contracts/clarinet
[installing clarinet]: /write-smart-contracts/clarinet#installing-clarinet
[visual studio code]: https://code.visualstudio.com/
[final code for this tutorial]: https://github.com/hirosystems/clarinet/tree/master/examples/billboard
[`let`]: /references/language-functions#let
[`stx-transfer?`]: /references/language-functions#stx-transfer
[`as-contract`]: /references/language-functions#as-contract
[`unwrap!`]: /references/language-functions#unwrap
[clarity visual studio code plugin]: https://marketplace.visualstudio.com/items?itemName=HiroSystems.clarity-lsp

src/pages/write-smart-contracts/clarinet.md

---
title: Developing with Clarinet
description: Develop smart contracts locally with the Clarinet REPL and testing harness
---
## Introduction
[Clarinet][] is a local Clarity runtime packaged as a command-line application. It's designed to facilitate rapid smart
contract development, testing, and deployment. Clarinet consists of a Clarity REPL and a testing harness, which, when
used together, allow you to rapidly develop and test a Clarity smart contract, without the need to deploy the contract
to a local mocknet or testnet.
The local Clarity REPL is advantageous, because when learning Clarity or when developing a new smart contract, it's
useful to be able to exercise a contract without needing to wait for block times in a live blockchain. Clarinet allows
you to instantly initialize wallets and populate them with tokens, so that you can interactively or programmatically
test the behavior of the smart contract. Blocks are mined instantly, and you can control the number of blocks that
are mined between testing transactions.
Clarinet is a useful tool for developing smart contracts, and should be used as part of a larger development strategy
that involves building and testing the contract locally, deploying the final draft contract to a testnet environment
and testing on a live blockchain, and deploying the final contract to the mainnet.
When developing smart contracts, you may also want to use the [Clarity Visual Studio Code plugin][].
## Installing Clarinet
Clarinet is available in the Homebrew and Winget package managers. Installing from a package manager is the recommended
installation method.
### Installing from Homebrew (MacOS and Linux)
Install Clarinet from Homebrew with the command:
```sh
brew install clarinet
```
### Installing from winget (Windows)
With the winget package manager installed, use the following command:
```sh
winget install clarinet
```
### Installing from a binary release
You can download a release from the [Clarinet repository](https://github.com/hirosystems/clarinet/releases/latest).
Unzip the binary, then copy it to a location that is already in your path, such as `/usr/local/bin`.
```sh
unzip clarinet-linux-x64.zip -d .
chmod +x ./clarinet
mv ./clarinet /usr/local/bin
```
If you are using MacOS, you may get security warnings when trying to run the precompiled binary. You can resolve the
security warning with the command:
```sh
xattr -d com.apple.quarantine /path/to/downloaded/clarinet
```
### Installing from source
Follow the [procedure](https://github.com/hirosystems/clarinet#install-from-source-using-cargo) outlined in the Clarinet
repository to install from source.
## Developing a Clarity smart contract
Once you have installed Clarinet, you can begin a new Clarinet project with the command:
```sh
clarinet new my-project && cd my-project
```
This command creates a new directory and populates it with boilerplate configuration and testing files. The `toml` files
located in the `settings` directory control the Clarinet environment. For example, the `Devnet.toml` file contains
definitions for wallets in the local REPL environment, and their starting balances (in STX).
```toml
...
[accounts.deployer]
mnemonic = "fetch outside black test wash cover just actual execute nice door want airport betray quantum stamp fish act pen trust portion fatigue scissors vague"
balance = 1_000_000
[accounts.wallet_1]
mnemonic = "spoil sock coyote include verify comic jacket gain beauty tank flush victory illness edge reveal shallow plug hobby usual juice harsh pact wreck eight"
balance = 1_000_000
[accounts.wallet_2]
mnemonic = "arrange scale orient half ugly kid bike twin magnet joke hurt fiber ethics super receive version wreck media fluid much abstract reward street alter"
balance = 1_000_000
...
```
You can create a new contract in the project with the command:
```sh
clarinet contract new my-contract
```
This command creates a new `my-contract.clar` file in the `contracts` directory, and a `my-contract_test.ts` in the
`test` directory. Additionally, it adds the contract to the `Clarinet.toml` configuration file.
```toml
[contracts.my-contract]
path = "contracts/my-contract.clar"
depends_on = []
```
At this point, you can begin editing your smart contract in the `contracts` directory. At any point while you are
developing, you can use the command `clarinet check` to check the syntax of your smart contract.
For a more in-depth overview of developing with Clarinet, review this comprehensive walkthrough video.
<br /><iframe width="560" height="315" src="https://www.youtube.com/embed/zERDftjl6k8" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
## Testing with Clarinet
Clarinet provides several powerful methods to test and interact with your smart contracts. As mentioned in the previous
section, you can always check your Clarity syntax using the `clarinet check` command. This validates any smart contracts
you are currently developing in the active project.
There are two tools in Clarinet you can use to test smart contracts: the [console][], an interactive Clarity REPL, and
the [test harness][], a testing framework written in Typescript.
### Testing with the console
The Clarinet console is an interactive Clarity REPL that runs in-memory. Any contracts configured in the current project
are automatically loaded into memory. Additionally, wallets defined in the `settings/Devnet.toml` file are
initialized with STX tokens for testing purposes. When the console runs, it provides a summary of the deployed
contracts, their public functions, as well as wallet addresses and balances.
```
clarity-repl v0.11.0
Enter "::help" for usage hints.
Connected to a transient in-memory database.
Initialized contracts
+-------------------------------------------------------+-------------------------+
| Contract identifier | Public functions |
+-------------------------------------------------------+-------------------------+
| ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE.my-contract | (echo-number (val int)) |
| | (say-hi) |
+-------------------------------------------------------+-------------------------+
Initialized balances
+------------------------------------------------------+---------+
| Address | STX |
+------------------------------------------------------+---------+
| ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE (deployer) | 1000000 |
+------------------------------------------------------+---------+
| ST1J4G6RR643BCG8G8SR6M2D9Z9KXT2NJDRK3FBTK (wallet_1) | 1000000 |
+------------------------------------------------------+---------+
...
```
You can use the `::help` command for valid console commands.
```
>> ::help
::help Display help
::list_functions Display all the native functions available in Clarity
::describe_function <function> Display documentation for a given native function fn-name
::mint_stx <principal> <amount> Mint STX balance for a given principal
::set_tx_sender <principal> Set tx-sender variable to principal
::get_assets_maps Get assets maps for active accounts
::get_costs <expr> Display the cost analysis
::get_contracts Get contracts
::get_block_height Get current block height
::advance_chain_tip <count> Simulate mining of <count> blocks
```
The console commands control the state of the REPL chain, and let you get information about it and advance the chain
tip. Additionally, you can enter Clarity commands into the console and observe the result of the command. The
`::list_functions` console command prints a cheat sheet of Clarity commands. For example, in the example contract,
you could use the REPL to call the `echo-number` function in the contract with the following command:
```
>> (contract-call? .my-contract echo-number 42)
(ok 42)
```
Note that by default commands are always executed as the `deployer` address, which means you can use the shorthand
`.my-contract` without specifying a full address to the contract. If you changed the transaction address with the
`::set_tx_sender` command, you would need to provide the full address to the contract in the contract call
(`ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE.my-contract`).
You can refer to the [Clarity language reference][] for a complete overview of all Clarity functions.
### Testing with the test harness
The test harness is a Deno testing library that can simulate the blockchain, exercise functions of the contract, and
make testing assertions about the state of the contract or chain.
You can run any tests configured in the `tests` directory with the command:
```sh
clarinet test
```
When you create a new contract, a test suite is automatically created for it. You can populate the test suite with
unit tests as you develop the contract.
An example unit test for the `echo-number` function is provided below:
```ts
...
Clarinet.test({
name: 'the echo-number function returns the input value ok',
async fn(chain: Chain, accounts: Map<string, Account>) {
const testNum = '42';
let deployerWallet = accounts.get('deployer')!;
let block = chain.mineBlock([
Tx.contractCall(
`${deployerWallet.address}.my-contract`,
'echo-number',
[testNum],
deployerWallet.address,
),
]);
assertEquals(block.receipts.length, 1); // assert that the block received a single tx
assertEquals(block.receipts[0].result, `(ok ${testNum})`); // assert that the result of the tx was ok and the input number
assertEquals(block.height, 2); // assert that only a single block was mined
},
});
```
For more information on assertions, review [asserts][] in the Deno standard library. For more information on the
available Clarity calls in Deno, review the [Deno Clarinet library][].
## Additional reading
- [Clarinet README](https://github.com/hirosystems/clarinet#clarinet)
[clarinet]: https://github.com/hirosystems/clarinet
[console]: #testing-with-the-console
[test harness]: #testing-with-the-test-harness
[clarity language reference]: /references/language-functions
[asserts]: https://deno.land/std@0.90.0/testing/asserts.ts
[deno clarinet library]: https://github.com/hirosystems/clarinet/blob/master/deno/index.ts
[clarity visual studio code plugin]: https://marketplace.visualstudio.com/items?itemName=HiroSystems.clarity-lsp

src/pages/write-smart-contracts/counter-tutorial.md

---
title: Counter tutorial
description: Learn how to write a simple counter in Clarity.
experience: beginner
duration: 20 minutes
tags:
- tutorial
images:
large: /images/pages/counter-tutorial.svg
sm: /images/pages/counter-tutorial-sm.svg
---
## Introduction
This tutorial introduces variables in Clarity, and demonstrates how to interact with them through a simple incrementing
and decrementing counter. This tutorial builds on concepts introduced in the [hello world tutorial][] and continues to
exercise [Clarinet][] as a local development environment.
In this tutorial you will:
- Create a new Clarinet project
- Add a new Clarity contract to the project
- Populate the contract with a variable and read variable function
- Populate the contract with an increment and a decrement function
- Execute the functions in a local, simulated blockchain
- Optionally, deploy and test the contract on the testnet blockchain
## Prerequisites
For this tutorial, you should have a local installation of Clarinet. Refer to [Installing Clarinet][] for instructions
on how to set up your local environment. You should also have a text editor or IDE to edit the Clarity smart contract.
If you are using Visual Studio Code, you may want to install the [Clarity Visual Studio Code plugin][].
### Optional prerequisites
While this tutorial primarily focuses on local smart contract development, you may wish to deploy your contract to
a live blockchain. For simplicity, contract deployment is performed using the [testnet sandbox][]. If you wish to
complete the optional deployment step, you should have the [Stacks Web Wallet][] installed, and you should request
testnet STX tokens from the [testnet faucet][] on the testnet explorer. Note that requesting testnet STX from the faucet
can take up to 15 minutes, so you may wish to request the tokens before beginning the tutorial.
## Step 1: create a new project
With Clarinet installed locally, open a new terminal window and create a new Clarinet project with the command:
```sh
clarinet new clarity-counter && cd clarity-counter
```
This command creates a new directory for your smart contract project, populated with boilerplate configuration and
testing files. Creating a new project only creates the Clarinet configuration, in the next step you can add a contract
to the project.
## Step 2: create a new contract
From the `clarity-counter` directory, create a new Clarity contract with the command:
```sh
clarinet contract new counter
```
This command adds a new `counter.clar` file in the `contracts` directory, and adds a `counter_test.ts` file to
the `test` directory. This tutorial ignores the test file, but for production contracts, you can create [unit tests][]
using it.
## Step 3: define variables
Open the `contracts/counter.clar` file in a text editor or IDE. Delete the boilerplate comments, for the purpose of
this tutorial they're not necessary.
In this step, you'll add a variable to the contract, and define a read-only function to output the value of that
variable.
Start by defining the variable on the first line:
```clarity
;; define counter variable
(define-data-var counter int 0)
```
The [`define-data-var`][] statement initializes a new integer variable named `counter` and sets the initial value to
`0`. It's important to note that all definition statements in Clarity need to be at the top of the file.
The `counter` variable is stored in the data space associated with the smart contract. The variable is persisted and
acts as a global shared state.
To provide access to the `counter` variable from outside the contract that it's defined in, you should declare a
`read-only` function to get the value. Add this function below the variable definition:
```clarity
;; counter getter
(define-read-only (get-counter)
(ok (var-get counter)))
```
The [`var-get`][] statement looks for a variable in the contract's data space and returns it.
Your contract code should now look like this:
```clarity
;; define counter variable
(define-data-var counter int 0)
;; counter getter
(define-read-only (get-counter)
(ok (var-get counter)))
```
At this point, you can check your contract code to ensure the syntax is correct. In the `clarity-counter` directory in
your terminal, use the command:
```sh
clarinet check
```
If there are no errors, the command returns no output. If there are errors, verify that your contract is exactly as
listed in the preceding section. You can also use the [`clarinet console`][] to interact with the `get-counter`
function:
```clarity
(contract-call? .counter get-counter)
```
The console should return `(ok 0)`.
-> Changes to your contract will not be loaded into the Clarinet console until it is restarted. Close the console with
`Ctrl + C` before proceeding to the next step.
## Step 4: define counter functions
In this step, you'll add functions to increment and decrement the counter variable. Add the `increment` function to
the contract after the counter getter:
```clarity
;; increment method
(define-public (increment)
(begin
(var-set counter (+ (var-get counter) 1))
(ok (var-get counter))))
```
The [`begin`][] statement evaluates multiple expressions and returns the value of the last one. In this case, it
evaluates an expression to set a new value for the `counter` variable, and then returns the new value.
The first expression in the `begin` statement is the [`var-set`][] expression, which sets a new value for the counter
variable. The new value is constructed using the [`+`] (add) statement. This statement takes a number of integer
arguments and returns the sum of the integers. Along with add, Clarity provides statements to subtract, multiply, and
divide integers. Find more details in the [Clarity language reference][].
Next, implement a new public function `decrement` to subtract `1` from the counter variable:
```clarity
;; decrement method
(define-public (decrement)
(begin
(var-set counter (- (var-get counter) 1))
(ok (var-get counter))))
```
At this point the contract is complete. The full `counter.clar` file should look like this:
```clarity
;; define counter variable
(define-data-var counter int 0)
;; counter getter
(define-read-only (get-counter)
(ok (var-get counter)))
;; increment method
(define-public (increment)
(begin
(var-set counter (+ (var-get counter) 1))
(ok (var-get counter))))
;; decrement method
(define-public (decrement)
(begin
(var-set counter (- (var-get counter) 1))
(ok (var-get counter))))
```
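As a mental model (a Python sketch for intuition, not how Clarity executes), the contract behaves like a small state machine with one persisted integer and three entry points:

```python
class Counter:
    """Python analogue of the counter contract: one persisted variable
    plus get/increment/decrement entry points."""

    def __init__(self):
        self.counter = 0  # mirrors (define-data-var counter int 0)

    def get_counter(self) -> int:
        return self.counter

    def increment(self) -> int:
        self.counter += 1
        return self.counter

    def decrement(self) -> int:
        self.counter -= 1
        return self.counter

c = Counter()
print(c.increment())    # 1
print(c.decrement())    # 0
print(c.get_counter())  # 0
```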
## Step 5: interact with the contract on the Clarinet console
Run `clarinet check` again to verify that the syntax in your contract is correct. If there are no errors, the command
returns no output. Use the following command to launch the local console:
```sh
clarinet console
```
You'll use the console to interact with the functions in your contract. Call the `increment` function to increment the
counter in the console:
```clarity
(contract-call? .counter increment)
```
The console should return `(ok 1)`. Try calling the `decrement` function to decrement the counter back to 0.
```clarity
(contract-call? .counter decrement)
```
-> You have now learned the basics of working with variables in Clarity, and developed further practice with the
Clarinet development tool. You may wish to optionally deploy the contract to the testnet, described in the next and
final step.
## Optional: deploy and test the contract on the testnet
For this tutorial, you'll use the [testnet sandbox][] to deploy your smart contract. Make sure you have connected your
[Stacks web wallet][] to the sandbox using the **Connect wallet** button, then copy and paste your smart contract into
the Clarity code editor on the **Write & Deploy** page. Edit the contract name or use the randomly generated name
provided to you.
![Counter testnet sandbox](/images/counter-testnet-sandbox.png)
Click **Deploy** to deploy the contract to the blockchain. This will display the Stacks web wallet window with
information about the transaction. Verify that the transaction looks correct and that the network is set to `Testnet`,
then click **Confirm**.
The contract is added to the miner's mempool and included in the next block of the blockchain. This process can take up
to 15 minutes to complete. You can review it on the [transactions][] page of the explorer or in the activity field
of your web wallet.
When your contract is confirmed, navigate to the [call a contract][] page of the sandbox, and search for your contract.
Enter your wallet address in the top field; you can copy this address by clicking the Stacks web wallet icon and
clicking the **Copy address** button. Enter the contract name in the bottom field, in this case `counter`. Click
**Get Contract** to view the contract.
Click the `increment` function in the function summary, then click **Call Function** to perform the function call in the
sandbox. This will display the Stacks web wallet with information about the transaction. Verify the information, then
click **Confirm** to execute the function call.
The function call is added to the miner's mempool and is executed in the next block of the blockchain. This process
can take up to 15 minutes to complete. You can review it on the [transactions][] page of the explorer or in the
activity field of your web wallet.
When the transaction is complete, you can access the transaction summary page from the activity panel in your web
wallet. The transaction summary page displays the output of the function.
Try calling the other public functions from the [call a contract][] page.
=> You have now learned one method of deploying and interacting with smart contracts on Stacks. You have also learned
the strengths of performing local development without having to wait for block times.
[hello world tutorial]: /write-smart-contracts/hello-world
[clarinet]: /write-smart-contracts/clarinet
[installing clarinet]: /write-smart-contracts/clarinet#installing-clarinet
[`define-data-var`]: /references/language-functions#define-data-var
[testnet sandbox]: https://explorer.stacks.co/sandbox/deploy?chain=testnet
[stacks web wallet]: https://www.hiro.so/wallet/install-web
[testnet faucet]: https://explorer.stacks.co/sandbox/faucet?chain=testnet
[unit tests]: /write-smart-contracts/clarinet#testing-with-clarinet
[`var-get`]: /references/language-functions#var-get
[`clarinet console`]: /write-smart-contracts/clarinet#testing-with-the-console
[`begin`]: /references/language-functions#begin
[`var-set`]: /references/language-functions#var-set
[`+`]: /references/language-functions#-add
[clarity language reference]: /references/language-functions
[transactions]: https://explorer.stacks.co/transactions?chain=testnet
[clarity visual studio code plugin]: https://marketplace.visualstudio.com/items?itemName=HiroSystems.clarity-lsp
[call a contract]: https://explorer.stacks.co/sandbox/contract-call?chain=testnet

156
src/pages/write-smart-contracts/devnet.md

@ -1,156 +0,0 @@
---
title: 'Developing a frontend with DevNet'
description: 'Integrate a frontend using a locally running blockchain with Clarinet DevNet'
---
## Introduction
Once you have reached a point where your Clarity smart contract is functional, you may want to develop a web frontend
against your contract. This can be challenging, as the contract must be deployed to a live blockchain to fully
interact with it from a web application. Clarinet provides an easy method to deploy your contract to a blockchain that
runs locally on your machine that is configurable and controllable. This integration feature is called DevNet.
DevNet allows you to perform frontend development and integration testing without the need to deploy your contract to
public testnet. This is valuable if you are in the early stages of developing a product, or if you are developing a
contract and application in stealth. DevNet uses Docker to launch local instances of Bitcoin, Stacks, Stacks API, Stacks
Explorer, and Bitcoin Explorer, and provides total configuration control over all those instances. Once running, DevNet
automatically deploys your contracts and creates Stacks accounts with pre-defined balances.
The services launched by DevNet represent a full instance of the Stacks blockchain with the Proof of Transfer consensus
mechanism running against a locally running Bitcoin testnet. DevNet allows you to control block times, PoX transactions,
and contract deployments. Because DevNet is running locally, it can be reset or re-configured at any time. This allows
for rapid frontend development without the need to interact with the public blockchain.
## Prerequisites
In order to run DevNet, you must have [Clarinet installed][], and you should also have Docker installed locally. Refer
to the [Docker documentation][] for instructions on installing Docker on your development machine.
## Launching DevNet
Clarinet provides a sensible default configuration for DevNet. If you wish to use the default configuration,
you can launch DevNet from the root of your Clarinet project with the command:
```sh
clarinet integrate
```
Clarinet fetches the appropriate Docker images for the Bitcoin node, Stacks node, Stacks API node, and the Bitcoin
and Stacks Explorers. This can take several minutes on first launch. Once the images are launched, the DevNet interface
is displayed in your terminal window. The contracts in your project are deployed to the DevNet blockchain in the second
block of the chain, so you may need to wait for the third block before launching your frontend development environment.
Review the following sections for information about the DevNet interface and configuration options for DevNet.
## DevNet interface
![DevNet interface](/images/devnet-interface.png)
The DevNet interface is displayed as a terminal GUI and consists of four primary panels: the system log, service status,
mempool summary, and a minimal block explorer.
The system log provides a log of events happening throughout the DevNet stack. You can use this log to monitor the
health of the local blockchain and review any events that occur. For services that provide a web interface, the URL
for the local service is displayed next to the container name. You can connect to these URLs using a web browser to
access the service.
The service status provides a status summary for the Docker containers that make up the DevNet stack. A green icon next
to the container indicates that it is in a healthy state, a yellow icon indicates that the container is booting, and a
red icon indicates that there is a problem with the service.
The mempool summary displays a list of transactions in the mempool. These include historical transactions from the
beginning of the blockchain.
The block explorer has two sub-panels: the block summary and the block transactions. You can use the `Arrow` keys to
select a block within the chain (shown at the top of the block explorer), and the block summary and block transactions
panels display information about that block. The block summary displays the Stacks block height, the Stacks block hash,
the Bitcoin block height of the anchor block, and the PoX cycle number of the block. The block transactions panel
displays all Stacks transactions that were included in the block.
You can access the locally running Stacks Explorer and Bitcoin Explorer from the URLs in the service status window for
more detailed information about the blocks.
You can press `0` in the interface to reset the DevNet. Press `Ctrl` + `C` to stop the DevNet and shut down the
containers.
## Configuring DevNet
By default, DevNet launches a local Stacks 2.0 testnet with a fixed block time of 30 seconds. It runs Docker images
that host a Bitcoin node, a Stacks Node, the Stacks API, the Stacks Explorer, and the Bitcoin Explorer. The default
settings should be adequate for most developers, but you can change many of the settings to customize your
development environment.
DevNet settings are located in the `settings/Devnet.toml` file. The file defines the wallets that are created in the
DevNet blockchain, the Stacks miner configuration, Proof of Transfer activity, and many other options.
### Accounts configuration
By default, Clarinet generates 10 wallets in the DevNet configuration file, a deployer wallet and 9 other accounts.
The accounts are seeded with a configurable balance of STX. Each wallet is defined under the heading
`[accounts.wallet_name]` in the TOML configuration file. Each heading has the following options:
- `mnemonic`: the 24-word keyphrase used to generate the wallet address
- `balance`: the balance in micro-STX of the account when the blockchain starts
The private key (`secret_key`), Stacks address, and BTC address are provided as comments under each wallet. These are
useful for configuring [stacking orders][] on DevNet.
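For illustration, a wallet entry in `settings/Devnet.toml` might look like the following (the mnemonic placeholder and balance here are illustrative, not Clarinet's generated defaults):

```toml
# Illustrative wallet entry; Clarinet generates the real mnemonic for you.
[accounts.wallet_1]
mnemonic = "<24-word seed phrase generated by Clarinet>"
balance = 100_000_000_000  # micro-STX (100,000 STX)
```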
### Blockchain configuration
DevNet provides a sensible default configuration for the local blockchain, with a fixed block time of 30 seconds and
the latest development images for each of the Stacks and Bitcoin nodes. These parameters are defined under the
`[devnet]` heading. You can customize these defaults by setting any of the following parameters.
-> Note: if any of the parameters are not supplied in the configuration file, the default value is used.
- `pox_stacking_orders`: defined by [stacking orders][] headings later in the file
- `orchestrator_port`: the port number for the Bitcoin orchestrator service
- `bitcoin_node_p2p_port`: the port number for Bitcoin P2P network traffic
- `bitcoin_node_rpc_port`: the port number for Bitcoin RPC network traffic
- `bitcoin_node_username`: the username for the Bitcoin node container
- `bitcoin_node_password`: the password for the Bitcoin node container
- `bitcoin_controller_port`: the port number for the Bitcoin controller network traffic
- `bitcoin_controller_block_time`: the fixed block time for the testnet in milliseconds
- `stacks_node_rpc_port`: the port number for Stacks RPC network traffic
- `stacks_node_p2p_port`: the port number for Stacks P2P network traffic
- `stacks_node_events_observers`: a whitelist of addresses for observing Stacks node events
- `stacks_api_port`: the port number for Stacks API network traffic
- `stacks_api_events_port`: the port number for Stacks API events network traffic
- `bitcoin_explorer_port`: the port number for Bitcoin Explorer HTTP traffic
- `stacks_explorer_port`: the port number for Stacks Explorer HTTP traffic
- `miner_mnemonic`: the 24-word keyphrase for the STX miner wallet
- `miner_derivation_path`: the derivation path for the STX miner
- `working_dir`: the local working directory for filesystem storage for the testnet
- `postgres_port`: the port number for the Postgres DB (for running the Stacks API)
- `postgres_username`: the username for the Postgres DB
- `postgres_password`: the password for the Postgres DB
- `postgres_database`: the database name of the Postgres DB
- `bitcoin_node_image_url`: a Docker image path for the Bitcoin node container
- `stacks_node_image_url`: a Docker image path for the Stacks node container
- `stacks_api_image_url`: a Docker image path for the Stacks API node container
- `stacks_explorer_image_url`: a Docker image path for the Stacks Explorer node container
- `bitcoin_explorer_image_url`: a Docker image path for the Bitcoin Explorer node container
- `postgres_image_url`: a Docker image path for the Postgres DB container
- `disable_bitcoin_explorer`: Boolean to set if the Bitcoin Explorer container runs in the DevNet stack
- `disable_stacks_explorer`: Boolean to set if the Stacks Explorer container runs in the DevNet stack
- `disable_stacks_api`: Boolean to set if the Stacks API container runs in the DevNet stack
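As a sketch, a `[devnet]` section overriding a few of these parameters might look like this (the values are illustrative; any key you omit keeps its default):

```toml
# Illustrative overrides in settings/Devnet.toml.
[devnet]
bitcoin_controller_block_time = 60_000  # mine a block every 60 seconds
stacks_node_rpc_port = 20443
disable_bitcoin_explorer = true         # skip the Bitcoin Explorer container
```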
### Stacking orders
You can configure any of the wallets in the DevNet to participate in stacking to exercise the PoX contract
within DevNet. This can be useful if you are developing a contract that interacts with the PoX contract and you need
to set specific test conditions.
Each stacking order is defined under the heading `[[devnet.pox_stacking_orders]]`. This heading is repeated for as many
stacking orders that are necessary for your configuration.
- `start_at_cycle`: the stacking cycle that the wallet should start participating in. The wallet's stacking order
occurs at the block preceding the beginning of that cycle.
- `duration`: the stacking duration for the stacking cycle
- `wallet`: the alias of the wallet participating
- `slots`: the number of stacking slots that the wallet will participate in
- `btc_address`: the BTC address that stacking rewards should be sent to
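Putting these fields together, a single stacking order might look like the following (the wallet alias and BTC address are placeholders; take the real BTC address from the comments under the wallet's heading):

```toml
# Illustrative stacking order; repeat this table for each order you need.
[[devnet.pox_stacking_orders]]
start_at_cycle = 3  # order submitted in the block preceding cycle 3
duration = 10       # stacking duration, in cycles
wallet = "wallet_1"
slots = 2
btc_address = "<reward BTC address from the wallet comments>"
```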
[clarinet installed]: /write-smart-contracts/clarinet#installing-clarinet
[docker documentation]: https://docs.docker.com/get-docker/
[stacking orders]: #stacking-orders

269
src/pages/write-smart-contracts/hello-world-tutorial.md

@ -1,269 +0,0 @@
---
title: Hello, World
description: Learn the basics of Clarity and write a simple Hello World smart contract.
duration: 15 minutes
experience: beginner
tags:
- tutorial
images:
large: /images/pages/hello-world.svg
sm: /images/pages/hello-world-sm.svg
---
## Introduction
In the world of smart contracts, everything is a blockchain transaction. You use tokens in your wallet to deploy a
smart contract in a transaction, and each call to that contract after it's published is also a transaction. Because
block times can affect how quickly a function is executed and returned, it's advantageous to perform local development
and testing of smart contracts with a simulated blockchain, so that functions execute immediately. This tutorial
introduces you to local smart contract development with [Clarinet][], a development tool for building and testing
Clarity smart contracts.
Clarity, the smart contract language used on the Stacks blockchain, is a LISP-based language that uses parenthesized
notation. Clarity is an [interpreted language](https://en.wikipedia.org/wiki/Interpreted_language), and
[decidable](https://en.wikipedia.org/wiki/Recursive_language). To learn more basics about the language, see the
[Introduction to Clarity](/write-smart-contracts/overview) topic.
In this tutorial you will:
- Create a new Clarinet project
- Add a new Clarity contract to the project
- Populate the contract with 2 types of functions
- Execute the functions in a local, simulated blockchain
- Optionally, deploy and test the contract on the testnet blockchain
## Prerequisites
For this tutorial, you should have a local installation of Clarinet. Refer to [Installing Clarinet][] for instructions
on how to set up your local environment. You should also have a text editor or IDE to edit the Clarity smart contract.
Note that you could also complete the coding portion of this tutorial in an online REPL such as [clarity.tools][]. If
you are using the online REPL, you can skip to [step 3][] of the tutorial and enter the code into the sandbox.
If you are using Visual Studio Code, you may want to install the [Clarity Visual Studio Code plugin][].
### Optional prerequisites
While this tutorial primarily focuses on local smart contract development, you may wish to deploy your contract to
a live blockchain. For simplicity, contract deployment is performed using the [testnet sandbox][]. If you wish to
complete the optional deployment step, you should have the [Stacks Web Wallet][] installed, and you should request
testnet STX tokens from the [testnet faucet][] on the testnet explorer. Note that requesting testnet STX from the faucet
can take up to 15 minutes, so you may wish to request the tokens before beginning the tutorial.
## Step 1: create a new project
With Clarinet installed locally, open a new terminal window and create a new Clarinet project with the command:
```sh
clarinet new clarity-hello-world && cd clarity-hello-world
```
This command creates a new directory for your smart contract project, populated with boilerplate configuration and
testing files. Creating a new project only creates the Clarinet configuration; in the next step you'll add a contract
to the project.
## Step 2: create a new contract
From the `clarity-hello-world` directory, create a new Clarity contract with the command:
```sh
clarinet contract new hello-world
```
This command adds a new `hello-world.clar` file in the `contracts` directory, and adds a `hello-world_test.ts` file to
the `test` directory. This tutorial ignores the test file, but for production contracts, you can create [unit tests][]
using it.
## Step 3: add code to the hello-world contract
Open the `contracts/hello-world.clar` file in a text editor or IDE. Delete the boilerplate comments; they're not
necessary for this tutorial.
For this tutorial, you'll add two Clarity functions to the contract. Clarity functions are fully enclosed in
parentheses, and whitespace doesn't matter.
The first function is a public function called `say-hi`.
```clarity
(define-public (say-hi)
(ok "hello world"))
```
Public functions in Clarity are callable from other smart contracts, which enables you to break complex tasks into
smaller, simpler smart contracts (an exercise in [separating concerns][]).
-> To create private functions, you would use the `define-private` keyword. Private functions can only be called from
within the smart contract they're declared in. External contracts can only call public functions.
The function doesn't take any parameters and simply returns "hello world" using the [`ok`][] response constructor.
The second function is a [read-only function][] called `echo-number`.
```clarity
(define-read-only (echo-number (val int))
(ok val))
```
Read-only functions are also public functions, but as the name implies, they can't change any variables or data maps.
`echo-number` takes an input parameter of type `int` and uses an [`ok`][] response to return the value passed to the
function.
-> Clarity supports a variety of other [types](/references/language-types)
The full `contracts/hello-world.clar` file should look like this:
```clarity
(define-public (say-hi)
(ok "hello world"))
(define-read-only (echo-number (val int))
(ok val))
```
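As a rough mental model (a hypothetical TypeScript sketch, not Clarinet's API), the two functions behave like the following, where the `ok` constructor wraps each return value:

```typescript
// Hypothetical model of hello-world.clar's two functions.
type Ok<T> = { ok: true; value: T };
const ok = <T>(value: T): Ok<T> => ({ ok: true, value });

// mirrors (define-public (say-hi) (ok "hello world"))
const sayHi = (): Ok<string> => ok("hello world");

// mirrors (define-read-only (echo-number (val int)) (ok val));
// Clarity rejects a uint argument at analysis time, much like a
// TypeScript type error here.
const echoNumber = (val: number): Ok<number> => ok(val);
```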
In the following steps you can interact with this contract in the local console. You can optionally deploy this contract
to the testnet and interact with it on a live blockchain.
## Step 4: interact with the contract in the Clarinet console
In the `clarity-hello-world` directory in your terminal, use the following command to verify that the syntax in
your contract is correct:
```sh
clarinet check
```
If there are no errors, the command returns no output. If there are errors, verify that your
contract is exactly as listed in the preceding section.
In the same directory, use the following command to launch the local console:
```sh
clarinet console
```
This console is a Clarinet read-eval-print loop (REPL) that executes Clarity code instantly when a function is called.
When the Clarinet console is invoked, it provides a summary of the available contracts and the simulated wallets in
memory:
```sh
clarity-repl v0.11.1
Enter "::help" for usage hints.
Connected to a transient in-memory database.
Contracts
+-------------------------------------------------------+-------------------------+
| Contract identifier | Public functions |
+-------------------------------------------------------+-------------------------+
| ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE.hello-world | (echo-number (val int)) |
| | (say-hi) |
+-------------------------------------------------------+-------------------------+
Initialized balances
+------------------------------------------------------+---------+
| Address | STX |
+------------------------------------------------------+---------+
| ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE (deployer) | 1000000 |
+------------------------------------------------------+---------+
| ST1J4G6RR643BCG8G8SR6M2D9Z9KXT2NJDRK3FBTK (wallet_1) | 1000000 |
+------------------------------------------------------+---------+
| ST20ATRN26N9P05V2F1RHFRV24X8C8M3W54E427B2 (wallet_2) | 1000000 |
+------------------------------------------------------+---------+
| ST21HMSJATHZ888PD0S0SSTWP4J61TCRJYEVQ0STB (wallet_3) | 1000000 |
+------------------------------------------------------+---------+
| ST2QXSK64YQX3CQPC530K79XWQ98XFAM9W3XKEH3N (wallet_4) | 1000000 |
+------------------------------------------------------+---------+
| ST3DG3R65C9TTEEW5BC5XTSY0M1JM7NBE7GVWKTVJ (wallet_5) | 1000000 |
+------------------------------------------------------+---------+
| ST3R3B1WVY7RK5D3SV5YTH01XSX1S4NN5B3QK2X0W (wallet_6) | 1000000 |
+------------------------------------------------------+---------+
| ST3ZG8F9X4VKVTVQB2APF4NEYEE1HQHC2EDBF09JN (wallet_7) | 1000000 |
+------------------------------------------------------+---------+
| STEB8ZW46YZJ40E3P7A287RBJFWPHYNQ2AB5ECT8 (wallet_8) | 1000000 |
+------------------------------------------------------+---------+
| STFCVYY1RJDNJHST7RRTPACYHVJQDJ7R1DWTQHQA (wallet_9) | 1000000 |
+------------------------------------------------------+---------+
```
The console provides the ability to interact with your contract using Clarity commands. Call the `say-hi` function
with the following command:
```clarity
(contract-call? .hello-world say-hi)
```
The console immediately returns `(ok "hello world")`, the expected return value of the function.
Next, call the `echo-number` function:
```clarity
(contract-call? .hello-world echo-number 42)
```
The console immediately returns `(ok 42)`, the expected return value of the function with the parameters you called it
with.
Try calling the `echo-number` function with an incorrect type, in this case an unsigned integer:
```clarity
(contract-call? .hello-world echo-number u42)
```
The console should return `Analysis error: expecting expression of type 'int', found 'uint'`, indicating that the call
to the contract was invalid due to the incorrect type.
=> You have now learned the basics of Clarity and working with the Clarinet development tool. You may
wish to optionally deploy the contract to the testnet, described in the next and final step.
## Optional: deploy and test the contract on the testnet
For this tutorial, you'll use the [testnet sandbox][] to deploy your smart contract. Make sure you have connected your
[Stacks web wallet][] to the sandbox using the **Connect wallet** button, then copy and paste your smart contract into
the Clarity code editor on the **Write & Deploy** page. Edit the contract name or use the randomly generated name
provided to you.
![Hello world testnet sandbox](/images/hello-world-testnet-sandbox.png)
Click **Deploy** to deploy the contract to the blockchain. This will display the Stacks web wallet window with
information about the transaction. Verify that the transaction looks correct and that the network is set to `Testnet`,
then click **Confirm**.
The contract is added to the miner's mempool and included in the next block of the blockchain. This process can take up
to 15 minutes to complete. You can review it on the [transactions][] page of the explorer or in the activity field
of your web wallet.
When your contract is confirmed, navigate to the [call a contract][] page of the sandbox, and search for your contract.
Enter your wallet address in the top field; you can copy this address by clicking the Stacks web wallet icon and
clicking the **Copy address** button. Enter the contract name in the bottom field, in this case `hello-world`. Click
**Get Contract** to view the contract.
![Hello world sandbox contract](/images/hello-world-sandbox-contract.png)
Click the `say-hi` function in the function summary, then click **Call Function** to perform the function call in the
sandbox. This will display the Stacks web wallet with information about the transaction. Verify the information, then
click **Confirm** to execute the function call.
The function call is added to the miner's mempool and is executed in the next block of the blockchain. This process
can take up to 15 minutes to complete. You can review it on the [transactions][] page of the explorer or in the
activity field of your web wallet.
When the transaction is complete, you can access the transaction summary page from the activity panel in your web
wallet. The transaction summary page displays the output of the function:
![Hello world transaction summary](/images/hello-world-transaction-summary.png)
=> You have now learned one method of deploying and interacting with smart contracts on Stacks. You have also learned
the strengths of performing local development without having to wait for block times.
[clarinet]: /write-smart-contracts/clarinet
[installing clarinet]: /write-smart-contracts/clarinet#installing-clarinet
[clarity.tools]: https://clarity.tools
[testnet sandbox]: https://explorer.stacks.co/sandbox/deploy?chain=testnet
[stacks web wallet]: https://www.hiro.so/wallet/install-web
[testnet faucet]: https://explorer.stacks.co/sandbox/faucet?chain=testnet
[step 3]: #step-3-add-code-to-the-hello-world-contract
[unit tests]: /write-smart-contracts/clarinet#testing-with-clarinet
[separating concerns]: https://en.wikipedia.org/wiki/Separation_of_concerns
[`ok`]: /references/language-functions#ok
[read-only function]: /references/language-functions#define-read-only
[transactions]: https://explorer.stacks.co/transactions?chain=testnet
[call a contract]: https://explorer.stacks.co/sandbox/contract-call?chain=testnet
[clarity visual studio code plugin]: https://marketplace.visualstudio.com/items?itemName=HiroSystems.clarity-lsp

24
src/pages/write-smart-contracts/install-source.md

@ -1,24 +0,0 @@
---
title: Install Clarity from source
description: 'Stacks smart contracting language'
---
## Installation
Build using `rust` and `cargo`:
```bash
cargo build --release
```
Install globally (you may have to run as sudoer):
```bash
cargo install --path .
```
You should now be able to run the command:
```bash
blockstack-core
```

291
src/pages/write-smart-contracts/nft-tutorial.md

@ -1,291 +0,0 @@
---
title: NFT tutorial
description: Build your own NFT on Bitcoin
duration: 15 minutes
experience: intermediate
tags:
- tutorial
icon: TestnetIcon
images:
large: /images/pages/nft/nft.png
sm: /images/pages/nft/nft.png
---
![What you'll build in this tutorial](/images/pages/nft/nft-preview.png)
## Introduction
Non-fungible tokens, or NFTs, are a type of [token](/write-smart-contracts/tokens#non-fungible-tokens-nfts) that can
represent unique data. NFTs are an emerging technology in blockchain, and there are many different potential uses for
them. NFTs have desirable [characteristics](/write-smart-contracts/tokens) like uniqueness, programmability, and
verifiable ownership. Simply put, an NFT is a piece of information that's unique. A common example of an NFT might be a
piece of digital art.
Clarity offers native support for token creation and management. On top of that, the Stacks ecosystem has adopted a
[standard for NFTs](https://github.com/stacksgov/sips/blob/main/sips/sip-009/sip-009-nft-standard.md). With these two
resources, creating your own NFT on Stacks is easy.
In this tutorial you will:
- Create a new Clarinet project
- Add contracts to the project, and set dependencies for those contracts
- Define an NFT contract based on the [SIP-009](https://github.com/stacksgov/sips/blob/main/sips/sip-009/sip-009-nft-standard.md) standard
- Verify the contract using Clarinet
- Optionally, deploy and test the contract on the testnet blockchain
## Prerequisites
For this tutorial, you should have a local installation of Clarinet. Refer to [Installing Clarinet](/write-smart-contracts/clarinet#installing-clarinet)
for instructions on how to set up your local environment. You should also have a text editor or IDE to edit the Clarity
smart contracts.
If you are using Visual Studio Code, you may want to install the [Clarity Visual Studio Code plugin](https://marketplace.visualstudio.com/items?itemName=HiroSystems.clarity-lsp).
### Optional prerequisites
While this tutorial primarily focuses on local smart contract development, you may wish to deploy your contract to a
live blockchain. For simplicity, contract deployment is performed using the [testnet sandbox](https://explorer.stacks.co/sandbox/deploy?chain=testnet).
If you wish to complete the optional deployment step, you should have the [Stacks Web Wallet](https://www.hiro.so/wallet/install-web)
installed, and you should request testnet STX tokens from the [testnet faucet](https://explorer.stacks.co/sandbox/faucet?chain=testnet)
on the testnet explorer. Note that requesting testnet STX from the faucet can take up to 15 minutes, so you may wish to
request the tokens before beginning the tutorial.
![faucet](/images/pages/nft/faucet.png)
## Step 1: Create a new project
With [Clarinet installed locally](/write-smart-contracts/clarinet#installing-clarinet), open a terminal window and
create a new Clarinet project with the command:
```sh
clarinet new clarity-nft && cd clarity-nft
```
This command creates a new directory for your smart contract project, populated with boilerplate configuration and
testing files. Creating a new project only creates the Clarinet configuration; in the next step you'll add contracts
to the project.
## Step 2: Add contracts to the project
Because NFTs rely on the traits defined in [SIP-009](https://github.com/stacksgov/sips/blob/main/sips/sip-009/sip-009-nft-standard.md),
the project should have two contracts: one that defines the traits, and the other to define your specific NFT. The NFT
contract is dependent on the contract that defines the traits.
From the `clarity-nft` directory, create two new Clarity contracts with the commands:
```sh
clarinet contract new nft-trait; clarinet contract new my-nft
```
These commands add four new files: an `nft-trait.clar` and a `my-nft.clar` file in the `contracts` directory, and
corresponding test files in the `tests` directory.
Remove the `nft-trait_test.ts` file from the `tests` directory, as it's not necessary.
```sh
rm tests/nft-trait_test.ts
```
-> Remember that at any point in this tutorial, you can run `clarinet check` to check the validity of your contract.
## Step 3: Configure dependencies and define traits
The NFT standard, [SIP-009](https://github.com/stacksgov/sips/blob/main/sips/sip-009/sip-009-nft-standard.md), defines
a set of standard traits that a compliant contract must implement. This ensures that different tokens can be
supported by Stacks wallets without additional development on the wallet side. On the live blockchain, a contract
can declare that it conforms to a specific set of traits with the [`impl-trait`](/references/language-functions#impl-trait)
Clarity function. When a contract uses `impl-trait` to assert compliance with a set of standard traits, the contract can
fail deployment to the blockchain if it violates the trait specification.
In the local Clarinet REPL, you must specify the contract dependency in the configuration file. Open `Clarinet.toml`
and edit the `contracts.my-nft` heading to declare the dependency on the `nft-trait` contract:

```toml
[contracts.my-nft]
path = "contracts/my-nft.clar"
depends_on = ["nft-trait"]
```
Update the `nft-trait.clar` contract to define the required traits for [SIP-009](https://github.com/stacksgov/sips/blob/main/sips/sip-009/sip-009-nft-standard.md#trait).
You can paste the contract from this page, or from [Friedger's repository](https://github.com/friedger/clarity-smart-contracts/blob/master/contracts/sips/nft-trait.clar).
```clarity
(define-trait nft-trait
(
;; Last token ID, limited to uint range
(get-last-token-id () (response uint uint))
;; URI for metadata associated with the token
(get-token-uri (uint) (response (optional (string-ascii 256)) uint))
;; Owner of a given token identifier
(get-owner (uint) (response (optional principal) uint))
;; Transfer from the sender to a new principal
(transfer (uint principal principal) (response bool uint))
)
)
```
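Other contracts and tools can then program against this interface. As an illustration, a minimal, hypothetical consumer contract could use `use-trait` to accept any SIP-009 compliant NFT contract as an argument and dispatch calls to it dynamically:

```clarity
;; hypothetical consumer contract, for illustration only
(use-trait nft-trait .nft-trait.nft-trait)

;; look up the owner of a token on any SIP-009 compliant contract
(define-public (who-owns (nft <nft-trait>) (token-id uint))
  (contract-call? nft get-owner token-id))
```

Because `who-owns` is typed against the trait rather than a specific contract, it works with any NFT contract that implements SIP-009.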
## Step 4: Define your personal NFT
For this tutorial, you'll define an NFT contract for the Stacks testnet. Open the `my-nft.clar` file and copy the
following code into the file.
```clarity
;; use the SIP009 interface (testnet)
(impl-trait 'ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE.nft-trait.nft-trait)
;; define a new NFT. Make sure to replace MY-OWN-NFT
(define-non-fungible-token MY-OWN-NFT uint)
;; Store the last issued token ID
(define-data-var last-id uint u0)
;; Claim a new NFT
(define-public (claim)
(mint tx-sender))
;; SIP009: Transfer token to a specified principal
(define-public (transfer (token-id uint) (sender principal) (recipient principal))
(if (and
(is-eq tx-sender sender))
;; Make sure to replace MY-OWN-NFT
(match (nft-transfer? MY-OWN-NFT token-id sender recipient)
success (ok success)
error (err error))
(err u500)))
;; SIP009: Get the owner of the specified token ID
(define-read-only (get-owner (token-id uint))
;; Make sure to replace MY-OWN-NFT
(ok (nft-get-owner? MY-OWN-NFT token-id)))
;; SIP009: Get the last token ID
(define-read-only (get-last-token-id)
(ok (var-get last-id)))
;; SIP009: Get the token URI. You can set it to any other URI
(define-read-only (get-token-uri (token-id uint))
(ok (some "https://docs.stacks.co")))
;; Internal - Mint new NFT
(define-private (mint (new-owner principal))
(let ((next-id (+ u1 (var-get last-id))))
;; Make sure to replace MY-OWN-NFT
(match (nft-mint? MY-OWN-NFT next-id new-owner)
success
(begin
(var-set last-id next-id)
(ok true))
error (err error))))
```
Continue editing the file, making sure that you replace the `MY-OWN-NFT` string in the contract with your own string.
When you have finished editing the file, run `clarinet check` in the terminal to check that your Clarity code is valid.
## Step 5: Review contracts and methods in the console
If the Clarity code is valid, you can run `clarinet console` in the terminal to interact with the contract.
```
Contracts
+-----------------------------------------------------+---------------------------------+
| Contract identifier | Public functions |
+-----------------------------------------------------+---------------------------------+
| ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE.my-nft | (claim) |
| | (get-last-token-id) |
| | (get-owner (token-id uint)) |
| | (get-token-uri (token-id uint)) |
| | (transfer |
| | (token-id uint) |
| | (sender principal) |
| | (recipient principal)) |
+-----------------------------------------------------+---------------------------------+
| ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE.nft-trait | |
+-----------------------------------------------------+---------------------------------+
```
Try claiming the NFT by running the command `(contract-call? .my-nft claim)`. You should receive console output similar
to the following:
```
>> (contract-call? .my-nft claim)
Events emitted
{"type":"nft_mint_event","nft_mint_event":{"asset_identifier":"ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE.my-nft::MY-OWN-NFT","recipient":"ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE","value":"u1"}}
(ok true)
```
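You can verify the mint with the contract's read-only functions. Because the console's default `tx-sender` is the deployer address, the output should look similar to the following (your address may differ):

```
>> (contract-call? .my-nft get-owner u1)
(ok (some ST1HTBVD3JG9C05J7HBJTHGR0GGW7KXW28M5JS8QE))
>> (contract-call? .my-nft get-last-token-id)
(ok u1)
```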
## Step 6: Add tests
At this point, the contract functions as intended and can be deployed to the blockchain. However, it is good practice
to write automated tests to ensure that the contract functions always behave as expected. When adding
complexity or changing the contract, having pre-written, working tests helps you verify that your changes don't
alter the way the contract functions behave.
Open the `tests/my-nft_test.ts` file in your IDE. In this step, you will add a single automated test to verify the
`get-last-token-id` and `get-token-uri` functions of the contract.
The test uses the `chain.mineBlock()` function to simulate the mining of a block. Within that simulated block, the test
makes two contract calls (`Tx.contractCall()`), one to each of the contract functions under test.
Once the simulated block is mined, the test can make assertions about the return values of the functions under test. The
test checks that two contract calls were made in the block, and that exactly one block was mined. It then asserts
that each contract call returned `ok`, and that the value wrapped in the `ok` is the expected value.
Replace the contents of the `tests/my-nft_test.ts` file with the following code:
```ts
import { Clarinet, Tx, Chain, Account, types } from 'https://deno.land/x/clarinet@v0.12.0/index.ts';
import { assertEquals } from 'https://deno.land/std@0.90.0/testing/asserts.ts';
Clarinet.test({
name: 'Ensure that NFT token URL and ID is as expected',
async fn(chain: Chain, accounts: Map<string, Account>) {
let wallet_1 = accounts.get('wallet_1')!;
let block = chain.mineBlock([
Tx.contractCall('my-nft', 'get-last-token-id', [], wallet_1.address),
Tx.contractCall('my-nft', 'get-token-uri', [types.uint(1)], wallet_1.address),
]);
assertEquals(block.receipts.length, 2);
assertEquals(block.height, 2);
block.receipts[0].result.expectOk().expectUint(0);
block.receipts[1].result.expectOk().expectSome().expectAscii('https://docs.stacks.co');
},
});
```
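As a further exercise, you could add a second test covering the `claim` function. A sketch, using the same imports and Clarinet Deno API as the test above (assertion helpers may vary between Clarinet versions):

```ts
Clarinet.test({
  name: 'Ensure that claim mints a new NFT and increments the last token ID',
  async fn(chain: Chain, accounts: Map<string, Account>) {
    let wallet_1 = accounts.get('wallet_1')!;
    let block = chain.mineBlock([
      Tx.contractCall('my-nft', 'claim', [], wallet_1.address),
      Tx.contractCall('my-nft', 'get-last-token-id', [], wallet_1.address),
    ]);
    // claim returns (ok true), and the counter advances to u1
    block.receipts[0].result.expectOk().expectBool(true);
    block.receipts[1].result.expectOk().expectUint(1);
  },
});
```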
Run `clarinet test` in the terminal to review the output of the test.
=> You have now learned how to work with contract traits, and how to unit test a contract with Clarinet. If you would
like to try deploying your contract to the testnet, proceed with the following optional step.
## Optional: Deploy the NFT to the testnet
For this tutorial, you'll use the [testnet sandbox](https://explorer.stacks.co/sandbox/deploy?chain=testnet) to deploy
your smart contract. Make sure you have connected your [Stacks web wallet](https://www.hiro.so/wallet/install-web) to
the sandbox using the **Connect wallet** button, then copy and paste the `my-nft.clar` smart contract into the Clarity
code editor on the [Write & Deploy](https://explorer.stacks.co/sandbox/deploy?chain=testnet) page. Edit the contract name or use the randomly generated name provided to you.
Click **Deploy** to deploy the contract to the blockchain. This displays the Stacks web wallet window with
information about the transaction. Verify that the transaction looks correct and that the network is set to `Testnet`,
then click **Confirm**.
The deployment process can take up to 15 minutes to complete. You can review it on the
[transactions](https://explorer.stacks.co/transactions?chain=testnet) page of the explorer, or in the activity field of
your web wallet.
When your contract is confirmed, navigate to the
[Call a contract](https://explorer.stacks.co/sandbox/contract-call?chain=testnet) page of the sandbox, and search for
your contract. Enter your wallet address in the top field; you can copy this address by clicking the Stacks web wallet
icon and clicking the **Copy address** button. Enter the contract name in the bottom field, in this case `my-nft`. Click
**Get contract** to view the contract.
Click the `claim` function in the function summary, then click **Call function** to perform the function call in the
sandbox. This will display the Stacks web wallet with information about the transaction. Verify the information, then
click **Confirm** to execute the function call. The function call can take up to 15 minutes to complete.
When the transaction is complete, you can access the transaction summary page from the activity panel of your web
wallet. The transaction summary page displays the output of the function. You should also see your personal NFT in your
web wallet.

10 src/pages/write-smart-contracts/overview.md

@@ -11,11 +11,6 @@ images:
Clarity is a programming language for writing smart contracts on the Stacks 2.0 blockchain. It supports programmatic
control over digital assets.
Prefer to jump right in? Get started with the Hello World tutorial:
[@page-reference | inline]
| /write-smart-contracts/hello-world-tutorial
## Smart contracts
Smart contracts encode and enforce rules for modifying a particular set of data that is shared among people and entities
@@ -66,11 +61,6 @@ Note some of the key Clarity language rules and limitations.
- There is support for lists, however, the only variable length lists in the language appear as function inputs; there is no support for list operations like append or join.
- Variables are immutable.
## Try a tutorial
[@page-reference | grid]
| /write-smart-contracts/hello-world-tutorial, /write-smart-contracts/counter-tutorial, /write-smart-contracts/billboard-tutorial
## Explore more
For language details and references, see the following:

143 src/pages/write-smart-contracts/testing-contracts.md

@@ -1,143 +0,0 @@
---
title: Testing Clarity code with JS and Mocha
description: Learn to Test Clarity Contract Code with JavaScript and Mocha.
experience: advanced
duration: 15 minutes
---
## Introduction
Clarity, the smart contracting language, is based on [LISP](<https://en.wikipedia.org/wiki/Lisp_(programming_language)>). Clarity is an interpreted language, and [decidable](https://en.wikipedia.org/wiki/Recursive_language). In this tutorial, you will learn how to test Clarity contracts and how to use [Mocha](https://mochajs.org/) to test them while you develop them.
By the end of this tutorial, you will:
- Have a working Clarity starter project
- Understand how to test Clarity code using `.ts` files and Mocha
## Prerequisites
### Node environment
To complete the tutorial, you should have [NodeJS](https://nodejs.org/en/download/) installed on your workstation. To install and run the starter project, you need at least version `8.12.0`. You can verify your installation by opening your terminal and running the following command:
```bash
node --version
```
## Download a starter project
Using your terminal, run the following command to create a new folder and initialize a new project:
```bash
mkdir hello-world; cd hello-world
npm init clarity-starter
```
After the starter project is loaded up, you have to select a template and a name for your local project folder. Feel free to hit ENTER both times to accept the default suggestion.
```bash
? Template - one of [hello-world, counter]: (hello-world)
```
Finally, after the project dependencies have been installed, your project is ready for development.
The project resources are created in your current folder. Take note of the `contracts` and `test` folders. The other files are boilerplate to wire up the project.
## Run tests
The starter project comes with test tooling already set up for you using [Mocha](https://mochajs.org/). Let's run the tests and review the results:
Still in the project root directory, run the following command:
```bash
npm test
```
You should see the following response:
```bash
hello world contract test suite
✓ should have a valid syntax
deploying an instance of the contract
✓ should return 'hello world'
✓ should echo number
3 passing (412ms)
```
Great, all tests are passing. Now, let's have a look at the test implementation, which helps you understand how to interact with Clarity smart contracts.
## Interact with contracts
Tests are located in the `test` folder. Let's have a look at the tests associated with the `hello-world.clar` file.
Run the following command:
```bash
cat test/hello-world.ts
```
Take a few seconds to review the contents of the file. You should ignore the test setup functions and focus on the most relevant parts related to Clarity.
Note that we're importing modules from the `@blockstack/clarity` package:
```js
import { Client, Provider, ProviderRegistry, Result } from '@blockstack/clarity';
```
### Initializing a client
At the test start, we are initializing a contract instance `helloWorldClient` and a provider that simulates interactions with the Stacks 2.0 blockchain. If this were in a production environment, the contract instance would be the equivalent of a contract deployed to the blockchain. The provider would be the Stacks blockchain.
```js
let helloWorldClient: Client;
let provider: Provider;
(...)
provider = await ProviderRegistry.createProvider();
helloWorldClient = new Client("SP3GWX3NE58KXHESRYE4DYQ1S31PQJTCRXB3PE9SB.hello-world", "hello-world", provider);
```
Take a look at the client initialization. It requires a contract identifier in the following format: `{contract_address}.{contract_name}`. The second argument indicates the location of the smart contract file, without the `.clar` suffix. By default, the location is assumed to be relative to the `contracts` folder.
As you can see above, a sample Stacks address and contract identifier is already provided for you. You don't need to modify anything.
### Checking syntax
Next, we check the contract for valid syntax. If the smart contract implementation has syntax errors (bugs), this check fails:
```js
await helloWorldClient.checkContract();
```
Note that the `checkContract()` function returns a [Promise](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise). The `await` keyword ensures that JavaScript does not execute the next lines until the contract check completes.
### Deploying contract
Further down in the file, you find a contract deployment:
```js
await helloWorldClient.deployContract();
```
### Run public functions
Finally, you will find snippets that call the public `say-hi` function of the contract:
```js
const query = helloWorldClient.createQuery({ method: { name: 'say-hi', args: [] } });
const receipt = await helloWorldClient.submitQuery(query);
const result = Result.unwrapString(receipt);
```
As you see, smart contract calls are realized through query definitions. The `createQuery` function defines the name and arguments passed to the smart contract function. With `submitQuery`, the function is executed and the response is wrapped into a `Result` object. To obtain the readable result, we use the `unwrapString` function, which should return `hello world`.
Now, review the last test `should echo number` on your own and try to understand how arguments are passed to the `echo-number` smart contract.
---
With the completion of this tutorial, you:
- Created a working Clarity starter project
- Understood how to test Clarity contracts
Congratulations.