doc/book/admin/instance_config.rst (11 additions, 14 deletions)
@@ -17,7 +17,7 @@ The main steps of creating and preparing the application for deployment are:
3. :ref:`admin-instance_config-package-app`.
- In this section, a `sharded_cluster <https://github.com/tarantool/doc/tree/latest/doc/code_snippets/snippets/sharding/instances.enabled/sharded_cluster>`_ application is used as an example.
+ In this section, a `sharded_cluster_crud <https://github.com/tarantool/doc/tree/latest/doc/code_snippets/snippets/sharding/instances.enabled/sharded_cluster_crud>`_ application is used as an example.
This cluster includes 5 instances: one router and 4 storages, which constitute two replica sets.
.. image:: /book/admin/admin_instances_dev.png
@@ -82,27 +82,27 @@ In this example, the application's layout is prepared manually and looks as foll
├── distfiles
├── include
├── instances.enabled
- │   └── sharded_cluster
+ │   └── sharded_cluster_crud
│       ├── config.yaml
│       ├── instances.yaml
│       ├── router.lua
- │       ├── sharded_cluster-scm-1.rockspec
+ │       ├── sharded_cluster_crud-scm-1.rockspec
│       └── storage.lua
├── modules
├── templates
└── tt.yaml
- The ``sharded_cluster`` directory contains the following files:
+ The ``sharded_cluster_crud`` directory contains the following files:
- ``config.yaml``: contains the :ref:`configuration <configuration>` of the cluster. This file might include the entire cluster topology or provide connection settings to a centralized configuration storage.
- ``instances.yml``: specifies instances to run in the current environment. For example, on the developer’s machine, this file might include all the instances defined in the cluster configuration. In the production environment, this file includes :ref:`instances to run on the specific machine <admin-instances_to_run>`.
- ``router.lua``: includes code specific for a :ref:`router <vshard-architecture-router>`.
- - ``sharded_cluster-scm-1.rockspec``: specifies the required external dependencies (for example, ``vshard``).
+ - ``sharded_cluster_crud-scm-1.rockspec``: specifies the required external dependencies (for example, ``vshard`` and ``crud``).
- ``storage.lua``: includes code specific for :ref:`storages <vshard-architecture-storage>`.
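The external dependencies declared in the rockspec above can be installed into the application's ``.rocks`` directory with the ``tt build`` command before packaging. A minimal sketch, assuming the application name used in this example:

.. code-block:: console

    $ tt build sharded_cluster_crud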
To package the ready application, use the :ref:`tt pack <tt-pack>` command.
This command can create an installable DEB/RPM package or generate a ``.tgz`` archive.
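For example, a ``.tgz`` archive for this application could be produced as follows (a sketch; the name of the generated archive depends on the application version and platform):

.. code-block:: console

    $ tt pack tgz --app-list sharded_cluster_crud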
- The structure below reflects the content of the packed ``.tgz`` archive for the `sharded_cluster <https://github.com/tarantool/doc/tree/latest/doc/code_snippets/snippets/sharding/instances.enabled/sharded_cluster>`_ application:
+ The structure below reflects the content of the packed ``.tgz`` archive for the `sharded_cluster_crud <https://github.com/tarantool/doc/tree/latest/doc/code_snippets/snippets/sharding/instances.enabled/sharded_cluster_crud>`_ application:
.. code-block:: console
@@ -125,18 +125,15 @@ The structure below reflects the content of the packed ``.tgz`` archive for the
@@ -147,7 +144,7 @@ The application's layout looks similar to the one defined when :ref:`developing
- ``instances.enabled``: contains a symlink to the packed ``sharded_cluster`` application.
- - ``sharded_cluster``: a packed application. In addition to files created during the application development, includes the ``.rocks`` directory containing application dependencies (for example, ``vshard``).
+ - ``sharded_cluster_crud``: a packed application. In addition to files created during the application development, includes the ``.rocks`` directory containing application dependencies (for example, ``vshard`` and ``crud``).
- ``tt.yaml``: a ``tt`` configuration file.
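On the target machine, the archive produced by ``tt pack`` can be unpacked with standard tools before defining which instances to run; a sketch in which the archive name is purely illustrative:

.. code-block:: console

    $ tar -xzvf sharded_cluster_crud-0.1.0.0.x86_64.tar.gz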
@@ -178,7 +175,7 @@ define instances to run on each machine by changing the content of the ``instanc
doc/book/admin/start_stop_instance.rst (43 additions, 43 deletions)
@@ -17,7 +17,7 @@ To get more context on how the application's environment might look, refer to :r
.. NOTE::
- In this section, a `sharded_cluster <https://github.com/tarantool/doc/tree/latest/doc/code_snippets/snippets/sharding/instances.enabled/sharded_cluster>`_ application is used to demonstrate how to start, stop, and manage instances in a cluster.
+ In this section, a `sharded_cluster_crud <https://github.com/tarantool/doc/tree/latest/doc/code_snippets/snippets/sharding/instances.enabled/sharded_cluster_crud>`_ application is used to demonstrate how to start, stop, and manage instances in a cluster.
.. _configuration_run_instance:
@@ -30,20 +30,20 @@ To start Tarantool instances use the :ref:`tt start <tt-start>` command:
.. code-block:: console
- $ tt start sharded_cluster
- • Starting an instance [sharded_cluster:storage-a-001]...
- • Starting an instance [sharded_cluster:storage-a-002]...
- • Starting an instance [sharded_cluster:storage-b-001]...
- • Starting an instance [sharded_cluster:storage-b-002]...
- • Starting an instance [sharded_cluster:router-a-001]...
+ $ tt start sharded_cluster_crud
+ • Starting an instance [sharded_cluster_crud:storage-a-001]...
+ • Starting an instance [sharded_cluster_crud:storage-a-002]...
+ • Starting an instance [sharded_cluster_crud:storage-b-001]...
+ • Starting an instance [sharded_cluster_crud:storage-b-002]...
+ • Starting an instance [sharded_cluster_crud:router-a-001]...
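Once the instances are reported as started, you can check their state with ``tt status``; a brief sketch (the command prints each instance's status and PID, output omitted here):

.. code-block:: console

    $ tt status sharded_cluster_crud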
After the cluster has started and worked for some time, you can find its artifacts
in the directories specified in the ``tt`` configuration. These are the default
locations in the local :ref:`launch mode <tt-config_modes>`:
@@ -98,18 +98,18 @@ To connect to the instance, use the :ref:`tt connect <tt-connect>` command:
.. code-block:: console
- $ tt connect sharded_cluster:storage-a-001
+ $ tt connect sharded_cluster_crud:storage-a-001
• Connecting to the instance...
- • Connected to sharded_cluster:storage-a-001
+ • Connected to sharded_cluster_crud:storage-a-001
- sharded_cluster:storage-a-001>
+ sharded_cluster_crud:storage-a-001>
In the instance's console, you can execute commands provided by the :ref:`box <box-module>` module.
For example, :ref:`box.info <box_introspection-box_info>` can be used to get various information about a running instance:
- .. code-block:: console
+ .. code-block:: tarantoolsession
- sharded_cluster:storage-a-001> box.info.ro
+ sharded_cluster_crud:storage-a-001> box.info.ro
---
- false
...
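Other ``box.info`` fields can be queried the same way, for example the instance's listen URI; a sketch in which the returned value is illustrative and depends on the cluster configuration:

.. code-block:: tarantoolsession

    sharded_cluster_crud:storage-a-001> box.info.listen
    ---
    - 127.0.0.1:3302
    ...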
@@ -125,15 +125,15 @@ To restart an instance, use :ref:`tt restart <tt-restart>`:
.. code-block:: console
- $ tt restart sharded_cluster:storage-a-002
+ $ tt restart sharded_cluster_crud:storage-a-002
After executing ``tt restart``, you need to confirm this operation:
.. code-block:: console
- Confirm restart of 'sharded_cluster:storage-a-002' [y/n]: y
- • The Instance sharded_cluster:storage-a-002 (PID = 2026) has been terminated.
- • Starting an instance [sharded_cluster:storage-a-002]...
+ Confirm restart of 'sharded_cluster_crud:storage-a-002' [y/n]: y
+ • The Instance sharded_cluster_crud:storage-a-002 (PID = 2026) has been terminated.
+ • Starting an instance [sharded_cluster_crud:storage-a-002]...
.. _admin-start_stop_instance_stop:
@@ -145,18 +145,18 @@ To stop the specific instance, use :ref:`tt stop <tt-stop>` as follows:
.. code-block:: console
- $ tt stop sharded_cluster:storage-a-002
+ $ tt stop sharded_cluster_crud:storage-a-002
You can also stop all the instances at once as follows:
.. code-block:: console
- $ tt stop sharded_cluster
- • The Instance sharded_cluster:storage-b-001 (PID = 2020) has been terminated.
- • The Instance sharded_cluster:storage-b-002 (PID = 2021) has been terminated.
- • The Instance sharded_cluster:router-a-001 (PID = 2022) has been terminated.
- • The Instance sharded_cluster:storage-a-001 (PID = 2023) has been terminated.
- • can't "stat" the PID file. Error: "stat /home/testuser/myapp/instances.enabled/sharded_cluster/var/run/storage-a-002/tt.pid: no such file or directory"
+ $ tt stop sharded_cluster_crud
+ • The Instance sharded_cluster_crud:storage-b-001 (PID = 2020) has been terminated.
+ • The Instance sharded_cluster_crud:storage-b-002 (PID = 2021) has been terminated.
+ • The Instance sharded_cluster_crud:router-a-001 (PID = 2022) has been terminated.
+ • The Instance sharded_cluster_crud:storage-a-001 (PID = 2023) has been terminated.
+ • can't "stat" the PID file. Error: "stat /home/testuser/myapp/instances.enabled/sharded_cluster_crud/var/run/storage-a-002/tt.pid: no such file or directory"
.. note::
@@ -172,12 +172,12 @@ The :ref:`tt clean <tt-clean>` command removes instance artifacts (such as logs
- A sample application created in the [Creating a sharded cluster](https://www.tarantool.io/en/doc/latest/how-to/vshard_quick/) tutorial.
+ A sample application demonstrating how to configure a [sharded](https://www.tarantool.io/en/doc/latest/concepts/sharding/) cluster.
## Running
- To learn how to run the cluster, see the [Working with the cluster](https://www.tarantool.io/en/doc/latest/how-to/vshard_quick/#working-with-the-cluster) section.
+ To run the cluster, go to the `sharding` directory in the terminal and perform the following steps:
+ 1. Install dependencies defined in the `*.rockspec` file:
- ## Packaging
+    ```console
+    $ tt build sharded_cluster
+    ```
+
+ 2. Run the cluster:
- To package an application into a `.tgz` archive, use the `tt pack` command:
+    ```console
+    $ tt start sharded_cluster
+    ```
- ```console
- $ tt pack tgz --app-list sharded_cluster
- ```
+
+ 3. Connect to the router:
+
+    ```console
+    $ tt connect sharded_cluster:router-a-001
+    ```
+
+ 4. Call `vshard.router.bootstrap()` to perform the initial cluster bootstrap:
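A sketch of how this final step might look in the router console opened in step 3 (the returned value is illustrative):

```console
sharded_cluster:router-a-001> vshard.router.bootstrap()
---
- true
...
```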