Commit
Remove all remaining "$ " prefixes from docs, closes #2140
Also document sqlite-utils create-view
simonw committed Aug 11, 2023
1 parent 4535568 commit 943df09
Showing 14 changed files with 108 additions and 41 deletions.
5 changes: 4 additions & 1 deletion docs/authentication.rst
@@ -32,7 +32,10 @@ The one exception is the "root" account, which you can sign into while using Dat

To sign in as root, start Datasette using the ``--root`` command-line option, like this::

$ datasette --root
datasette --root

::

http://127.0.0.1:8001/-/auth-token?token=786fc524e0199d70dc9a581d851f466244e114ca92f33aa3b42a139e9388daa7
INFO: Started server process [25801]
INFO: Waiting for application startup.
14 changes: 10 additions & 4 deletions docs/changelog.rst
@@ -924,7 +924,10 @@ Prior to this release the Datasette ecosystem has treated authentication as excl

You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new ``--root`` command-line option, which outputs a one-time use URL to :ref:`authenticate as a root actor <authentication_root>` (:issue:`784`)::

$ datasette fixtures.db --root
datasette fixtures.db --root

::

http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477
INFO: Started server process [14973]
INFO: Waiting for application startup.
@@ -1095,15 +1098,15 @@ You can now create :ref:`custom pages <custom_pages>` within your Datasette inst

:ref:`config_dir` (:issue:`731`) allows you to define a custom Datasette instance as a directory. So instead of running the following::

$ datasette one.db two.db \
datasette one.db two.db \
--metadata=metadata.json \
--template-dir=templates/ \
--plugins-dir=plugins \
--static css:css

You can instead arrange your files in a single directory called ``my-project`` and run this::

$ datasette my-project/
datasette my-project/

Also in this release:

@@ -1775,7 +1778,10 @@ In addition to the work on facets:

Added new help section::

$ datasette --help-config
datasette --help-config

::

Config options:
default_page_size Default page size for the table view
(default=100)
5 changes: 4 additions & 1 deletion docs/cli-reference.rst
@@ -151,7 +151,10 @@ This means that all of Datasette's functionality can be accessed directly from t

For example::

$ datasette --get '/-/versions.json' | jq .
datasette --get '/-/versions.json' | jq .

.. code-block:: json
{
"python": {
"version": "3.8.5",
34 changes: 25 additions & 9 deletions docs/contributing.rst
@@ -133,13 +133,19 @@ Running Black

Black will be installed when you run ``pip install -e '.[test]'``. To test that your code complies with Black, run the following in your root ``datasette`` repository checkout::

$ black . --check
black . --check

::

All done! ✨ 🍰 ✨
95 files would be left unchanged.

If any of your code does not conform to Black you can run this to automatically fix those problems::

$ black .
black .

::

reformatted ../datasette/setup.py
All done! ✨ 🍰 ✨
1 file reformatted, 94 files left unchanged.
@@ -160,11 +166,14 @@ Prettier

To install Prettier, `install Node.js <https://nodejs.org/en/download/package-manager/>`__ and then run the following in the root of your ``datasette`` repository checkout::

$ npm install
npm install

This will install Prettier in a ``node_modules`` directory. You can then check that your code matches the coding style like so::

$ npm run prettier -- --check
npm run prettier -- --check

::

> prettier
> prettier 'datasette/static/*[!.min].js' "--check"

@@ -174,7 +183,7 @@ This will install Prettier in a ``node_modules`` directory. You can then check t

You can fix any problems by running::

$ npm run fix
npm run fix

.. _contributing_documentation:

@@ -322,10 +331,17 @@ Upgrading CodeMirror

Datasette bundles `CodeMirror <https://codemirror.net/>`__ for the SQL editing interface, e.g. on `this page <https://latest.datasette.io/fixtures>`__. Here are the steps for upgrading to a new version of CodeMirror:

* Install the packages with::

npm i codemirror @codemirror/lang-sql

* Install the packages with `npm i codemirror @codemirror/lang-sql`
* Build the bundle using the version number from package.json with:
* Build the bundle using the version number from package.json with::

node_modules/.bin/rollup datasette/static/cm-editor-6.0.1.js -f iife -n cm -o datasette/static/cm-editor-6.0.1.bundle.js -p @rollup/plugin-node-resolve -p @rollup/plugin-terser
node_modules/.bin/rollup datasette/static/cm-editor-6.0.1.js \
-f iife \
-n cm \
-o datasette/static/cm-editor-6.0.1.bundle.js \
-p @rollup/plugin-node-resolve \
-p @rollup/plugin-terser

* Update version reference in the `codemirror.html` template
* Update the version reference in the ``codemirror.html`` template.
8 changes: 4 additions & 4 deletions docs/custom_templates.rst
@@ -259,7 +259,7 @@ Consider the following directory structure::
You can start Datasette using ``--static assets:static-files/`` to serve those
files from the ``/assets/`` mount point::

$ datasette -m metadata.json --static assets:static-files/ --memory
datasette -m metadata.json --static assets:static-files/ --memory

The following URLs will now serve the content from those CSS and JS files::

@@ -309,7 +309,7 @@ Publishing static assets
The :ref:`cli_publish` command can be used to publish your static assets,
using the same syntax as above::

$ datasette publish cloudrun mydb.db --static assets:static-files/
datasette publish cloudrun mydb.db --static assets:static-files/

This will upload the contents of the ``static-files/`` directory as part of the
deployment, and configure Datasette to correctly serve the assets from ``/assets/``.
@@ -442,7 +442,7 @@ You can add templated pages to your Datasette instance by creating HTML files in

For example, to add a custom page that is served at ``http://localhost/about`` you would create a file in ``templates/pages/about.html``, then start Datasette like this::

$ datasette mydb.db --template-dir=templates/
datasette mydb.db --template-dir=templates/

You can nest directories within pages to create a nested structure. To create a ``http://localhost:8001/about/map`` page you would create ``templates/pages/about/map.html``.

@@ -497,7 +497,7 @@ To serve a custom HTTP header, add a ``custom_header(name, value)`` function cal
You can verify this is working using ``curl`` like this::

$ curl -I 'http://127.0.0.1:8001/teapot'
curl -I 'http://127.0.0.1:8001/teapot'
HTTP/1.1 418
date: Sun, 26 Apr 2020 18:38:30 GMT
server: uvicorn
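The ``curl -I`` check above expects an HTTP 418 response carrying a custom header. As a rough, self-contained sketch of that behavior using Python's standard library rather than Datasette itself (the handler class, header name and value here are illustrative assumptions, not Datasette's implementation):

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


class TeapotHandler(BaseHTTPRequestHandler):
    """Respond to every request with HTTP 418 plus a custom header."""

    def do_GET(self):
        self.send_response(418)
        self.send_header("x-teapot", "I am")  # hypothetical custom header
        self.send_header("content-length", "0")
        self.end_headers()

    do_HEAD = do_GET

    def log_message(self, *args):
        pass  # keep request logging quiet


def fetch_teapot():
    # Bind to port 0 so the OS picks a free port, then serve in the background.
    server = HTTPServer(("127.0.0.1", 0), TeapotHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = "http://127.0.0.1:%d/teapot" % server.server_port
    try:
        urllib.request.urlopen(url)
        status, header = None, None
    except urllib.error.HTTPError as err:
        # urlopen raises HTTPError for 4xx responses; the headers are on the error.
        status, header = err.code, err.headers["x-teapot"]
    server.shutdown()
    return status, header


print(fetch_teapot())  # (418, 'I am')
```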
2 changes: 1 addition & 1 deletion docs/deploying.rst
@@ -56,7 +56,7 @@ Create a file at ``/etc/systemd/system/datasette.service`` with the following co
Add a random value for the ``DATASETTE_SECRET`` - this will be used to sign Datasette cookies such as the CSRF token cookie. You can generate a suitable value like so::

$ python3 -c 'import secrets; print(secrets.token_hex(32))'
python3 -c 'import secrets; print(secrets.token_hex(32))'

This configuration will run Datasette against all database files contained in the ``/home/ubuntu/datasette-root`` directory. If that directory contains a ``metadata.yml`` (or ``.json``) file or a ``templates/`` or ``plugins/`` sub-directory those will automatically be loaded by Datasette - see :ref:`config_dir` for details.

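The same one-liner works from inside a deployment script; ``secrets.token_hex(n)`` always produces ``2 * n`` hexadecimal characters, so a 32-byte secret is 64 characters long:

```python
import secrets

# Generate a value suitable for DATASETTE_SECRET.
# token_hex(32) returns 64 lowercase hexadecimal characters.
secret = secrets.token_hex(32)
print(secret)
```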
7 changes: 5 additions & 2 deletions docs/facets.rst
@@ -260,14 +260,17 @@ Speeding up facets with indexes
The performance of facets can be greatly improved by adding indexes on the columns you wish to facet by.
Adding indexes can be performed using the ``sqlite3`` command-line utility. Here's how to add an index on the ``state`` column in a table called ``Food_Trucks``::

$ sqlite3 mydatabase.db
sqlite3 mydatabase.db

::

SQLite version 3.19.3 2017-06-27 16:48:08
Enter ".help" for usage hints.
sqlite> CREATE INDEX Food_Trucks_state ON Food_Trucks("state");

Or using the `sqlite-utils <https://sqlite-utils.datasette.io/en/stable/cli.html#creating-indexes>`__ command-line utility::

$ sqlite-utils create-index mydatabase.db Food_Trucks state
sqlite-utils create-index mydatabase.db Food_Trucks state

.. _facet_by_json_array:

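For illustration, the index creation shown above amounts to a single ``CREATE INDEX`` statement, sketched here with Python's standard-library ``sqlite3`` module against a throwaway in-memory table (the ``idx_Food_Trucks_state`` name mirrors the ``idx_<table>_<columns>`` convention sqlite-utils uses by default — treat that naming detail as an assumption):

```python
import sqlite3

# Throwaway in-memory database standing in for mydatabase.db.
conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE Food_Trucks ("state" TEXT, "name" TEXT)')

# Equivalent of: sqlite-utils create-index mydatabase.db Food_Trucks state
conn.execute('CREATE INDEX "idx_Food_Trucks_state" ON "Food_Trucks" ("state")')

# The new index is now visible in sqlite_master.
indexes = [
    row[0]
    for row in conn.execute("SELECT name FROM sqlite_master WHERE type = 'index'")
]
print(indexes)  # ['idx_Food_Trucks_state']
```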
4 changes: 2 additions & 2 deletions docs/full_text_search.rst
@@ -177,14 +177,14 @@ Configuring FTS using sqlite-utils

Here's how to use ``sqlite-utils`` to enable full-text search for an ``items`` table across the ``name`` and ``description`` columns::

$ sqlite-utils enable-fts mydatabase.db items name description
sqlite-utils enable-fts mydatabase.db items name description

Configuring FTS using csvs-to-sqlite
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If your data starts out in CSV files, you can use Datasette's companion tool `csvs-to-sqlite <https://github.com/simonw/csvs-to-sqlite>`__ to convert that file into a SQLite database and enable full-text search on specific columns. For a file called ``items.csv`` where you want full-text search to operate against the ``name`` and ``description`` columns you would run the following::

$ csvs-to-sqlite items.csv items.db -f name -f description
csvs-to-sqlite items.csv items.db -f name -f description

Configuring FTS by hand
~~~~~~~~~~~~~~~~~~~~~~~
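A simplified sketch of what ``enable-fts`` sets up, using stdlib ``sqlite3`` and an FTS5 virtual table — note the real command also wires up content-sync machinery, while this sketch populates the index once by hand:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, description TEXT)")
conn.execute(
    "INSERT INTO items VALUES "
    "('datasette', 'an open source multi-tool for exploring and publishing data')"
)

# Build an FTS5 index over the name and description columns of items.
conn.execute(
    "CREATE VIRTUAL TABLE items_fts USING fts5(name, description, content='items')"
)
conn.execute(
    "INSERT INTO items_fts (rowid, name, description) "
    "SELECT rowid, name, description FROM items"
)

# Full-text search returns the rowids of matching rows.
hits = conn.execute(
    "SELECT rowid FROM items_fts WHERE items_fts MATCH 'exploring'"
).fetchall()
print(hits)  # [(1,)]
```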
38 changes: 32 additions & 6 deletions docs/installation.rst
@@ -102,11 +102,21 @@ Installing plugins using pipx

You can install additional datasette plugins with ``pipx inject`` like so::

$ pipx inject datasette datasette-json-html
pipx inject datasette datasette-json-html

::

injected package datasette-json-html into venv datasette
done! ✨ 🌟 ✨

$ datasette plugins
Then to confirm the plugin was installed correctly:

::

datasette plugins

.. code-block:: json
[
{
"name": "datasette-json-html",
@@ -121,12 +131,18 @@ Upgrading packages using pipx

You can upgrade your pipx installation to the latest release of Datasette using ``pipx upgrade datasette``::

$ pipx upgrade datasette
pipx upgrade datasette

::

upgraded package datasette from 0.39 to 0.40 (location: /Users/simon/.local/pipx/venvs/datasette)

To upgrade a plugin within the pipx environment use ``pipx runpip datasette install -U name-of-plugin`` - like this::

% datasette plugins
datasette plugins

.. code-block:: json
[
{
"name": "datasette-vega",
@@ -136,7 +152,12 @@ To upgrade a plugin within the pipx environment use ``pipx runpip datasette inst
}
]
$ pipx runpip datasette install -U datasette-vega
Now upgrade the plugin::

pipx runpip datasette install -U datasette-vega

::

Collecting datasette-vega
Downloading datasette_vega-0.6.2-py3-none-any.whl (1.8 MB)
|████████████████████████████████| 1.8 MB 2.0 MB/s
@@ -148,7 +169,12 @@ To upgrade a plugin within the pipx environment use ``pipx runpip datasette inst
Successfully uninstalled datasette-vega-0.6
Successfully installed datasette-vega-0.6.2

$ datasette plugins
To confirm the upgrade::

datasette plugins

.. code-block:: json
[
{
"name": "datasette-vega",
2 changes: 1 addition & 1 deletion docs/plugin_hooks.rst
@@ -1042,7 +1042,7 @@ Here's an example that authenticates the actor based on an incoming API key:
If you install this in your plugins directory you can test it like this::

$ curl -H 'Authorization: Bearer this-is-a-secret' http://localhost:8003/-/actor.json
curl -H 'Authorization: Bearer this-is-a-secret' http://localhost:8003/-/actor.json

Instead of returning a dictionary, this function can return an awaitable function which itself returns either ``None`` or a dictionary. This is useful for authentication functions that need to make a database query - for example:

4 changes: 2 additions & 2 deletions docs/publish.rst
@@ -131,7 +131,7 @@ You can also specify plugins you would like to install. For example, if you want

If a plugin has any :ref:`plugins_configuration_secret` you can use the ``--plugin-secret`` option to set those secrets at publish time. For example, using Heroku with `datasette-auth-github <https://github.com/simonw/datasette-auth-github>`__ you might run the following command::

$ datasette publish heroku my_database.db \
datasette publish heroku my_database.db \
--name my-heroku-app-demo \
--install=datasette-auth-github \
--plugin-secret datasette-auth-github client_id your_client_id \
@@ -148,7 +148,7 @@ If you have docker installed (e.g. using `Docker for Mac <https://www.docker.com

Here's example output for the package command::

$ datasette package parlgov.db --extra-options="--setting sql_time_limit_ms 2500"
datasette package parlgov.db --extra-options="--setting sql_time_limit_ms 2500"
Sending build context to Docker daemon 4.459MB
Step 1/7 : FROM python:3.11.0-slim-bullseye
---> 79e1dc9af1c1
12 changes: 6 additions & 6 deletions docs/settings.rst
@@ -22,7 +22,7 @@ Configuration directory mode

Normally you configure Datasette using command-line options. For a Datasette instance with custom templates, custom plugins, a static directory and several databases this can get quite verbose::

$ datasette one.db two.db \
datasette one.db two.db \
--metadata=metadata.json \
--template-dir=templates/ \
--plugins-dir=plugins \
@@ -40,7 +40,7 @@ As an alternative to this, you can run Datasette in *configuration directory* mo

Now start Datasette by providing the path to that directory::

$ datasette my-app/
datasette my-app/

Datasette will detect the files in that directory and automatically configure itself using them. It will serve all ``*.db`` files that it finds, will load ``metadata.json`` if it exists, and will load the ``templates``, ``plugins`` and ``static`` folders if they are present.

@@ -359,16 +359,16 @@ You can pass a secret to Datasette in two ways: with the ``--secret`` command-li

::

$ datasette mydb.db --secret=SECRET_VALUE_HERE
datasette mydb.db --secret=SECRET_VALUE_HERE

Or::

$ export DATASETTE_SECRET=SECRET_VALUE_HERE
$ datasette mydb.db
export DATASETTE_SECRET=SECRET_VALUE_HERE
datasette mydb.db

One way to generate a secure random secret is to use Python like this::

$ python3 -c 'import secrets; print(secrets.token_hex(32))'
python3 -c 'import secrets; print(secrets.token_hex(32))'
cdb19e94283a20f9d42cca50c5a4871c0aa07392db308755d60a1a5b9bb0fa52

Plugin authors make use of this signing mechanism in their plugins using :ref:`datasette_sign` and :ref:`datasette_unsign`.
5 changes: 4 additions & 1 deletion docs/spatialite.rst
@@ -156,7 +156,10 @@ The `shapefile format <https://en.wikipedia.org/wiki/Shapefile>`_ is a common fo

Try it now with the North America shapefile available from the University of North Carolina `Global River Database <http://gaia.geosci.unc.edu/rivers/>`_ project. Download the file and unzip it (this will create files called ``narivs.dbf``, ``narivs.prj``, ``narivs.shp`` and ``narivs.shx`` in the current directory), then run the following::

$ spatialite rivers-database.db
spatialite rivers-database.db

::

SpatiaLite version ..: 4.3.0a Supported Extensions:
...
spatialite> .loadshp narivs rivers CP1252 23032
9 changes: 8 additions & 1 deletion docs/sql_queries.rst
@@ -53,12 +53,19 @@ If you want to bundle some pre-written SQL queries with your Datasette-hosted da

The quickest way to create views is with the SQLite command-line interface::

$ sqlite3 sf-trees.db
sqlite3 sf-trees.db

::

SQLite version 3.19.3 2017-06-27 16:48:08
Enter ".help" for usage hints.
sqlite> CREATE VIEW demo_view AS select qSpecies from Street_Tree_List;
<CTRL+D>

You can also use the `sqlite-utils <https://sqlite-utils.datasette.io/>`__ tool to `create a view <https://sqlite-utils.datasette.io/en/stable/cli.html#creating-views>`__::

sqlite-utils create-view sf-trees.db demo_view "select qSpecies from Street_Tree_List"

.. _canned_queries:

Canned queries
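The ``create-view`` command documented above boils down to one ``CREATE VIEW`` statement; here is a minimal stdlib ``sqlite3`` sketch using an in-memory stand-in for ``sf-trees.db`` with invented sample data:

```python
import sqlite3

# In-memory stand-in for sf-trees.db with a minimal Street_Tree_List table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Street_Tree_List (qSpecies TEXT, qAddress TEXT)")
conn.execute(
    "INSERT INTO Street_Tree_List VALUES "
    "('Myoporum laetum :: Myoporum', '501X Baker St')"
)

# Equivalent of:
# sqlite-utils create-view sf-trees.db demo_view "select qSpecies from Street_Tree_List"
conn.execute("CREATE VIEW demo_view AS SELECT qSpecies FROM Street_Tree_List")

# The view can be queried like a table, and Datasette will list it alongside tables.
rows = conn.execute("SELECT * FROM demo_view").fetchall()
print(rows)  # [('Myoporum laetum :: Myoporum',)]
```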
