Name                        Date          Size
.coveragerc                 13-Jul-2020   70
.github/                    06-Jul-2021   4 KiB
.gitignore                  06-Jul-2021   447
.gitreview                  02-Nov-2021   85
.pre-commit-config.yaml     12-May-2021   765
AUTHORS                     03-Dec-2020   492
CHANGES                     06-Jul-2021   235
doc/build/                  13-Jul-2020   4 KiB
examples/                   14-Aug-2020   4 KiB
lib/sqlalchemy/             13-Jul-2020   4 KiB
LICENSE                     04-Jan-2021   1.1 KiB
MANIFEST.in                 31-Mar-2021   555
pyproject.toml              10-Oct-2020   64
README.dialects.rst         18-Sep-2021   8.9 KiB
README.rst                  06-Jul-2021   5.6 KiB
README.unittests.rst        02-Nov-2021   13.1 KiB
reap_dbs.py                 13-Jul-2020   620
regen_callcounts.tox.ini    02-Nov-2021   1.3 KiB
setup.cfg                   02-Nov-2021   5.6 KiB
setup.py                    24-Feb-2021   5.1 KiB
test/                       02-Nov-2021   4 KiB
tox.ini                     02-Nov-2021   7 KiB

README.dialects.rst

========================
Developing new Dialects
========================

.. note::

   When studying this file, it's probably a good idea to also
   familiarize yourself with the README.unittests.rst file, which discusses
   SQLAlchemy's usage and extension of the pytest test runner.

While SQLAlchemy includes many dialects within the core distribution, the
trend for new dialects should be that they are published as external
projects.   SQLAlchemy has since version 0.5 featured a "plugin" system
which allows external dialects to be integrated into SQLAlchemy using
standard setuptools entry points.  As of version 0.8, this system has
been enhanced, so that a dialect can also be "plugged in" at runtime.

On the testing side, SQLAlchemy includes a "dialect compliance
suite" that is usable by third party libraries, in the source tree
at ``lib/sqlalchemy/testing/suite``.   There's no need for a third party
dialect to run through SQLAlchemy's full testing suite, as a large portion of
these tests do not have dialect-sensitive functionality.  The "dialect
compliance suite" should be viewed as the primary target for new dialects.


Dialect Layout
===============

The file structure of a dialect is typically similar to the following::

    sqlalchemy-<dialect>/
                         setup.py
                         setup.cfg
                         sqlalchemy_<dialect>/
                                              __init__.py
                                              base.py
                                              <dbapi>.py
                                              requirements.py
                         test/
                                              __init__.py
                                              conftest.py
                                              test_suite.py
                                              test_<dialect_specific_test>.py
                                              ...

An example of this structure can be seen in the MS Access dialect at
https://github.com/gordthompson/sqlalchemy-access .

Key aspects of this file layout include:

* setup.py - should specify setuptools entrypoints, allowing the
  dialect to be usable from create_engine(), e.g.::

        entry_points = {
         'sqlalchemy.dialects': [
              'access.pyodbc = sqlalchemy_access.pyodbc:AccessDialect_pyodbc',
              ]
        }

  Above, the entrypoint ``access.pyodbc`` allows URLs to be used such as::

    create_engine("access+pyodbc://user:pw@dsn")

* setup.cfg - this file contains the traditional contents such as
  [tool:pytest] directives, but also contains new directives that are used
  by SQLAlchemy's testing framework.  E.g. for Access::

    [tool:pytest]
    addopts= --tb native -v -r fxX --maxfail=25 -p no:warnings
    python_files=test/*test_*.py

    [sqla_testing]
    requirement_cls=sqlalchemy_access.requirements:Requirements
    profile_file=test/profiles.txt

    [db]
    default=access+pyodbc://admin@access_test
    sqlite=sqlite:///:memory:

  Above, the ``[sqla_testing]`` section contains configuration used by
  SQLAlchemy's test plugin.  The ``[tool:pytest]`` section
  includes directives to help with the pytest runner.  When using pytest
  the test/conftest.py file will bootstrap SQLAlchemy's plugin.

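  With a configuration along these lines, plus the ``test/conftest.py``
  described next, the dialect's own tests are typically run with a plain
  pytest invocation from the project root (the exact options depend on the
  project; ``-n4`` requires the pytest-xdist plugin)::

    # run the dialect's tests against the [db] "default" URL
    pytest

    # or in parallel across four processes
    pytest -n4
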
* test/conftest.py - This script bootstraps SQLAlchemy's pytest plugin
  into the pytest runner.  This
  script can also be used to install your third party dialect into
  SQLAlchemy without using the setuptools entrypoint system; this allows
  your dialect to be present without any explicit setup.py step needed.
  The other portion invokes SQLAlchemy's pytest plugin::

    from sqlalchemy.dialects import registry
    import pytest

    registry.register("access.pyodbc", "sqlalchemy_access.pyodbc", "AccessDialect_pyodbc")

    pytest.register_assert_rewrite("sqlalchemy.testing.assertions")

    from sqlalchemy.testing.plugin.pytestplugin import *

  Where above, the ``registry`` module, introduced in SQLAlchemy 0.8, provides
  an in-Python means of installing the dialect entrypoint(s) without the use
  of setuptools, using the ``registry.register()`` function in a way that
  is similar to the ``entry_points`` directive we placed in our ``setup.py``.
  (The ``pytest.register_assert_rewrite`` is there just to suppress a spurious
  warning from pytest.)

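  The same runtime registration can be used outside of the test suite as
  well; as a minimal sketch (reusing the Access names from above), an
  application that does not install the setuptools entrypoint can register
  the dialect itself before calling ``create_engine()``::

    from sqlalchemy import create_engine
    from sqlalchemy.dialects import registry

    # make "access+pyodbc://" URLs resolvable for this process only
    registry.register("access.pyodbc", "sqlalchemy_access.pyodbc", "AccessDialect_pyodbc")

    engine = create_engine("access+pyodbc://user:pw@dsn")
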
* requirements.py - The ``requirements.py`` file is where directives
  regarding database and dialect capabilities are set up.
  SQLAlchemy's tests are often annotated with decorators that mark
  tests as "skip" or "fail" for particular backends.  Over time, this
  system has been refined such that specific database and DBAPI names
  are mentioned less and less, in favor of @requires directives which
  state a particular capability.  The requirement directive is linked
  to target dialects using a ``Requirements`` subclass.  The custom
  ``Requirements`` subclass is specified in the ``requirements.py`` file
  and is made available to SQLAlchemy's test runner using the
  ``requirement_cls`` directive inside the ``[sqla_testing]`` section.

  For a third-party dialect, the custom ``Requirements`` class can
  usually specify a simple yes/no answer for a particular system. For
  example, a requirements file that specifies a database that supports
  the RETURNING construct but does not support nullable boolean
  columns might look like this::

      # sqlalchemy_access/requirements.py

      from sqlalchemy.testing.requirements import SuiteRequirements

      from sqlalchemy.testing import exclusions

      class Requirements(SuiteRequirements):
          @property
          def nullable_booleans(self):
              """Target database allows boolean columns to store NULL."""
              # Access Yes/No doesn't allow null
              return exclusions.closed()

          @property
          def returning(self):
              return exclusions.open()

  The ``SuiteRequirements`` class in
  ``sqlalchemy.testing.requirements`` contains a large number of
  requirements rules, which attempt to have reasonable defaults. The
  tests will report on those requirements found as they are run.

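  Requirements also don't have to be a hard yes/no; the ``exclusions``
  module provides helpers such as ``exclusions.only_on()`` and
  ``exclusions.skip_if()`` that key off a backend name or a callable.  A
  hypothetical sketch (the property names below are only illustrative
  examples of what ``SuiteRequirements`` defines)::

      from sqlalchemy.testing import exclusions
      from sqlalchemy.testing.requirements import SuiteRequirements

      class Requirements(SuiteRequirements):
          @property
          def schemas(self):
              # enable schema-qualified table tests only on this one backend
              return exclusions.only_on("access")

          @property
          def comment_reflection(self):
              # skip when a particular driver is in use
              return exclusions.skip_if("access+pyodbc")
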
  The requirements system can also be used when running SQLAlchemy's
  primary test suite against the external dialect.  In this use case,
  a ``--dburi`` as well as a ``--requirements`` flag are passed to SQLAlchemy's
  test runner so that exclusions specific to the dialect take place::

    cd /path/to/sqlalchemy
    pytest -v \
      --requirements sqlalchemy_access.requirements:Requirements \
      --dburi access+pyodbc://admin@access_test

* test_suite.py - Finally, the ``test_suite.py`` module represents a
  stub test suite, which pulls in the actual SQLAlchemy test suite.
  To pull in the suite as a whole, it can be imported in one step::

      # test/test_suite.py

      from sqlalchemy.testing.suite import *

  That's all that's needed - the ``sqlalchemy.testing.suite`` package
  contains an ever-expanding series of tests, most of which should be
  annotated with specific requirement decorators so that they can be
  fully controlled.  In the case that the decorators are not covering
  a particular test, a test can also be directly modified or bypassed.
  In the example below, the Access dialect test suite overrides the
  ``test_huge_int()`` test::

      from sqlalchemy import testing
      from sqlalchemy.testing.suite import *

      from sqlalchemy.testing.suite import IntegerTest as _IntegerTest

      class IntegerTest(_IntegerTest):

          @testing.skip("access")
          def test_huge_int(self):
              # bypass this test because Access ODBC fails with
              # [ODBC Microsoft Access Driver] Optional feature not implemented.
              return

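Dialect-specific tests (the ``test_<dialect_specific_test>.py`` modules in the
layout above) can make use of SQLAlchemy's own testing helpers as well.  A
minimal, illustrative sketch only (the module and test names here are
hypothetical), assuming the ``connection`` fixture provided by SQLAlchemy's
pytest plugin::

    # test/test_access_types.py  (hypothetical)

    from sqlalchemy import literal, select
    from sqlalchemy.testing import eq_, fixtures

    class SimpleRoundTripTest(fixtures.TestBase):
        def test_select_literal(self, connection):
            # a trivial "does this database work at all?" style check
            eq_(connection.scalar(select(literal(5))), 5)
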
AsyncIO dialects
----------------

As of version 1.4, SQLAlchemy also supports dialects that use
asyncio drivers to interface with the database backend.

SQLAlchemy's approach to asyncio drivers is that the connection and cursor
objects of the driver (if any) are adapted into a pep-249 compliant interface,
using the ``AdaptedConnection`` interface class. Refer to the internal asyncio
driver implementations such as that of ``asyncpg``, ``asyncmy`` and
``aiosqlite`` for examples.

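The general shape of such an adaptation layer is that each blocking pep-249
method calls into the driver's coroutine and waits for it using SQLAlchemy's
greenlet-based bridge.  A very rough, illustrative sketch only; the class and
driver method names below are hypothetical, and the real implementations live
in modules such as ``lib/sqlalchemy/dialects/sqlite/aiosqlite.py``::

    from sqlalchemy.util.concurrency import await_only

    class AsyncAdapt_somedriver_cursor:
        """Present a blocking pep-249 style cursor on top of an async driver."""

        def __init__(self, driver_cursor):
            self._cursor = driver_cursor
            self.description = None

        def execute(self, statement, parameters=None):
            # bridge the driver's coroutine into a synchronous call
            await_only(self._cursor.execute(statement, parameters or ()))
            self.description = self._cursor.description

        def fetchall(self):
            return await_only(self._cursor.fetchall())
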
Going Forward
==============

The third-party dialect can be distributed like any other Python
module on PyPI. Links to prominent dialects can be featured within
SQLAlchemy's own documentation; contact the developers (see AUTHORS)
for help with this.

While SQLAlchemy includes many dialects built in, it remains to be
seen if the project as a whole might move towards a "plugin" model for
all dialects, including all those currently built in.  Now that
SQLAlchemy's dialect API is mature and the test suite is not far
behind, it may be that a better maintenance experience can be
delivered by having all dialects separately maintained and released.

As new versions of SQLAlchemy are released, the test suite and
requirements file will receive new tests and changes.  The dialect
maintainer would normally keep track of these changes and make
adjustments as needed.

README.rst

SQLAlchemy
==========

|PyPI| |Python| |Downloads|

.. |PyPI| image:: https://img.shields.io/pypi/v/sqlalchemy
    :target: https://pypi.org/project/sqlalchemy
    :alt: PyPI

.. |Python| image:: https://img.shields.io/pypi/pyversions/sqlalchemy
    :target: https://pypi.org/project/sqlalchemy
    :alt: PyPI - Python Version

.. |Downloads| image:: https://img.shields.io/pypi/dm/sqlalchemy
    :target: https://pypi.org/project/sqlalchemy
    :alt: PyPI - Downloads


The Python SQL Toolkit and Object Relational Mapper

Introduction
-------------

SQLAlchemy is the Python SQL toolkit and Object Relational Mapper
that gives application developers the full power and
flexibility of SQL. SQLAlchemy provides a full suite
of well known enterprise-level persistence patterns,
designed for efficient and high-performing database
access, adapted into a simple and Pythonic domain
language.

Major SQLAlchemy features include:

* An industrial strength ORM, built
  from the core on the identity map, unit of work,
  and data mapper patterns.   These patterns
  allow transparent persistence of objects
  using a declarative configuration system.
  Domain models
  can be constructed and manipulated naturally,
  and changes are synchronized with the
  current transaction automatically.
* A relationally-oriented query system, exposing
  the full range of SQL's capabilities
  explicitly, including joins, subqueries,
  correlation, and most everything else,
  in terms of the object model.
  Writing queries with the ORM uses the same
  techniques of relational composition you use
  when writing SQL.  While you can drop into
  literal SQL at any time, it's virtually never
  needed.
* A comprehensive and flexible system
  of eager loading for related collections and objects.
  Collections are cached within a session,
  and can be loaded on individual access, all
  at once using joins, or by query per collection
  across the full result set.
* A Core SQL construction system and DBAPI
  interaction layer.  The SQLAlchemy Core is
  separate from the ORM and is a full database
  abstraction layer in its own right, and includes
  an extensible Python-based SQL expression
  language, schema metadata, connection pooling,
  type coercion, and custom types.
* All primary and foreign key constraints are
  assumed to be composite and natural.  Surrogate
  integer primary keys are of course still the
  norm, but SQLAlchemy never assumes or hardcodes
  to this model.
* Database introspection and generation.  Database
  schemas can be "reflected" in one step into
  Python structures representing database metadata;
  those same structures can then generate
  CREATE statements right back out - all within
  the Core, independent of the ORM.

SQLAlchemy's philosophy:

* SQL databases behave less and less like object
  collections the more size and performance start to
  matter; object collections behave less and less like
  tables and rows the more abstraction starts to matter.
  SQLAlchemy aims to accommodate both of these
  principles.
* An ORM doesn't need to hide the "R".   A relational
  database provides rich, set-based functionality
  that should be fully exposed.   SQLAlchemy's
  ORM provides an open-ended set of patterns
  that allow a developer to construct a custom
  mediation layer between a domain model and
  a relational schema, turning the so-called
  "object relational impedance" issue into
  a distant memory.
* The developer, in all cases, makes all decisions
  regarding the design, structure, and naming conventions
  of both the object model as well as the relational
  schema.   SQLAlchemy only provides the means
  to automate the execution of these decisions.
* With SQLAlchemy, there's no such thing as
  "the ORM generated a bad query" - you
  retain full control over the structure of
  queries, including how joins are organized,
  how subqueries and correlation is used, what
  columns are requested.  Everything SQLAlchemy
  does is ultimately the result of a developer-
  initiated decision.
* Don't use an ORM if the problem doesn't need one.
  SQLAlchemy consists of a Core and separate ORM
  component.   The Core offers a full SQL expression
  language that allows Pythonic construction
  of SQL constructs that render directly to SQL
  strings for a target database, returning
  result sets that are essentially enhanced DBAPI
  cursors.
* Transactions should be the norm.  With SQLAlchemy's
  ORM, nothing goes to permanent storage until
  commit() is called.  SQLAlchemy encourages applications
  to create a consistent means of delineating
  the start and end of a series of operations.
* Never render a literal value in a SQL statement.
  Bound parameters are used to the greatest degree
  possible, allowing query optimizers to cache
  query plans effectively and making SQL injection
  attacks a non-issue.

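As a small illustration of the last two points, a Core expression is
constructed in Python and renders as a parameterized SQL string (a minimal
sketch; the table here is made up for the example)::

    from sqlalchemy import column, select, table

    user = table("user", column("id"), column("name"))
    stmt = select(user.c.id).where(user.c.name == "spongebob")

    print(stmt)
    # SELECT "user".id
    # FROM "user"
    # WHERE "user".name = :name_1
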
Documentation
-------------

Latest documentation is at:

https://www.sqlalchemy.org/docs/

Installation / Requirements
---------------------------

Full documentation for installation is at
`Installation <https://www.sqlalchemy.org/docs/intro.html#installation>`_.

Getting Help / Development / Bug reporting
------------------------------------------

Please refer to the `SQLAlchemy Community Guide <https://www.sqlalchemy.org/support.html>`_.

Code of Conduct
---------------

Above all, SQLAlchemy places great emphasis on polite, thoughtful, and
constructive communication between users and developers.
Please see our current Code of Conduct at
`Code of Conduct <https://www.sqlalchemy.org/codeofconduct.html>`_.

License
-------

SQLAlchemy is distributed under the `MIT license
<https://www.opensource.org/licenses/mit-license.php>`_.

README.unittests.rst

=====================
SQLALCHEMY UNIT TESTS
=====================

Basic Test Running
==================

Tox is used to run the test suite fully.   For basic test runs against
a single Python interpreter::

    tox


Advanced Tox Options
====================

For more elaborate CI-style test running, the tox script provided will
run against various Python / database targets.   For a basic run against
Python 3.8 using an in-memory SQLite database::

    tox -e py38-sqlite

The tox runner contains a series of target combinations that can run
against various combinations of databases.  The test suite can be
run against SQLite with "backend" tests also running against a PostgreSQL
database::

    tox -e py38-sqlite-postgresql

Or to run just "backend" tests against a MySQL database::

    tox -e py38-mysql-backendonly

Running against backends other than SQLite requires that a database of that
vendor be available at a specific URL.  See "Setting Up Databases" below
for details.

The pytest Engine
=================

The tox runner uses pytest to invoke the test suite.   Within the realm of
pytest, SQLAlchemy itself adds a large series of options and
customizations to the pytest runner using plugin points, to allow for
SQLAlchemy's multiple database support, database setup/teardown and
connectivity, multi-process support, as well as lots of skip / database
selection rules.

Running tests with pytest directly grants more immediate control over
database options and test selection.

A generic pytest run looks like::

    pytest -n4

Above, the full test suite will run against SQLite, using four processes.
If the "-n" flag is not used, the pytest-xdist plugin is skipped and the tests
will run linearly, which will take a pretty long time.

The pytest command line is handier for running subsets of tests and for
quickly specifying custom database connections.  Example::

    pytest --dburi=postgresql+psycopg2://scott:tiger@localhost/test  test/sql/test_query.py

The above will run the tests in the test/sql/test_query.py file (a pretty good
file to start with for basic "does this database work at all?" testing) against
a running PostgreSQL database at the given URL.

The pytest frontend can also run tests against multiple kinds of databases at
once - a large subset of tests are marked as "backend" tests, which will be run
against each available backend, and additionally lots of tests are targeted at
specific backends only, which only run if a matching backend is made available.
For example, to run the test suite against both PostgreSQL and MySQL at the
same time::

    pytest -n4 --db postgresql --db mysql

Setting Up Databases
====================

The test suite identifies several built-in database tags that run against
a pre-set URL.  These can be seen using --dbs::

    $ pytest --dbs
    Available --db options (use --dburi to override)
                 default    sqlite:///:memory:
                 mariadb    mariadb://scott:tiger@192.168.0.199:3307/test
                   mssql    mssql+pyodbc://scott:tiger^5HHH@mssql2017:1433/test?driver=ODBC+Driver+13+for+SQL+Server
           mssql_pymssql    mssql+pymssql://scott:tiger@ms_2008
                   mysql    mysql://scott:tiger@127.0.0.1:3306/test?charset=utf8mb4
                  oracle    oracle://scott:tiger@127.0.0.1:1521
                 oracle8    oracle://scott:tiger@127.0.0.1:1521/?use_ansi=0
                  pg8000    postgresql+pg8000://scott:tiger@127.0.0.1:5432/test
              postgresql    postgresql://scott:tiger@127.0.0.1:5432/test
    postgresql_psycopg2cffi postgresql+psycopg2cffi://scott:tiger@127.0.0.1:5432/test
                 pymysql    mysql+pymysql://scott:tiger@127.0.0.1:3306/test?charset=utf8mb4
                  sqlite    sqlite:///:memory:
             sqlite_file    sqlite:///querytest.db

Note that a pyodbc URL **must be against a hostname / database name
combination, not a DSN name** when using the multiprocessing option; this is
because the test suite needs to generate new URLs to refer to per-process
databases that are created on the fly.

What these mean is that if you have a database running that can be accessed
by one of the above URLs, you can run the test suite against it using
``--db <name>``.

The URLs are present in the ``setup.cfg`` file.   You can make your own URLs by
creating a new file called ``test.cfg`` and adding your own ``[db]`` section::

    # test.cfg file
    [db]
    my_postgresql=postgresql://username:pass@hostname/dbname

Above, we can now run the tests with ``my_postgresql``::

    pytest --db my_postgresql

We can also override the existing names in our ``test.cfg`` file, so that the
same overrides apply when using the tox runner as well::

    # test.cfg file
    [db]
    postgresql=postgresql://username:pass@hostname/dbname

Now when we run ``tox -e py38-postgresql``, it will use our custom URL instead
of the fixed one in setup.cfg.

Database Configuration
======================

Step one: the **database chosen for tests must be entirely empty**.  A lot
of what SQLAlchemy tests is creating and dropping lots of tables,
as well as running database introspection to see what is there.  If there
are pre-existing tables or other objects in the target database already,
these will get in the way.  A failed test run can also be followed by
a run that includes the "--dropfirst" option, which will try to drop
all existing tables in the target database.

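For example, assuming a previous run against the built-in ``postgresql``
tag left tables behind, a cleanup-then-retest sequence might look like
(the choice of test file is just an example)::

    pytest --db postgresql --dropfirst test/sql/test_query.py
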
The above paragraph changes somewhat when the multiprocessing option
is used, in that separate databases will be created instead; however,
in the case of PostgreSQL, the starting database is used as a template,
so the starting database must still be empty.  See below for example
configurations using Docker.

The test runner will by default create and drop tables within the default
database that's in the database URL, *unless* the multiprocessing option is in
use via the pytest "-n" flag, which invokes pytest-xdist.   The
multiprocessing option is **enabled by default** when using the tox runner.
When multiprocessing is used, the SQLAlchemy testing framework will create a
new database for each process, and then tear it down after the test run is
complete.  So it will be necessary for the database user to have access to
CREATE DATABASE in order for this to work.   Additionally, as mentioned
earlier, the database URL must be formatted such that it can be rewritten on
the fly to refer to these other databases, which means for pyodbc it must refer
to a hostname/database name combination, not a DSN name.

Several tests require alternate usernames or schemas to be present, which
are used to test dotted-name access scenarios.  On some databases, such
as Oracle, these are usernames, while on others, such as PostgreSQL
and MySQL, they are schemas.   The requirement applies to all backends
except SQLite and Firebird.  The names are::

    test_schema
    test_schema_2 (only used on PostgreSQL and mssql)

Please refer to your vendor documentation for the proper syntax to create
these namespaces - the database user must have permission to create and drop
tables within these schemas.  It's perfectly fine to run the test suite
without these namespaces present; it only means that a handful of tests which
expect them to be present will fail.

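For example, on PostgreSQL these can be created ahead of time with (assuming
the ``scott`` test user from the URLs above)::

    CREATE SCHEMA test_schema;
    CREATE SCHEMA test_schema_2;
    GRANT ALL ON SCHEMA test_schema TO scott;
    GRANT ALL ON SCHEMA test_schema_2 TO scott;
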
Additional steps specific to individual databases are as follows::

    POSTGRESQL: To enable unicode testing with JSONB, create the
    database with UTF8 encoding::

        postgres=# create database test with owner=scott encoding='utf8' template=template0;

    To include tests for HSTORE, create the HSTORE type engine::

        postgres=# \c test;
        You are now connected to database "test" as user "postgresql".
        test=# create extension hstore;
        CREATE EXTENSION

    Full-text search configuration should be set to English, else
    several tests of ``.match()`` will fail. This can be set (if it isn't so
    already) with:

     ALTER DATABASE test SET default_text_search_config = 'pg_catalog.english'

    For two-phase transaction support, the max_prepared_transactions
    configuration variable must be set to a non-zero value in postgresql.conf.
    See
    https://www.postgresql.org/docs/current/runtime-config-resource.html#GUC-MAX-PREPARED-TRANSACTIONS
    for further background.

    ORACLE: a user named "test_schema" is created in addition to the default
    user.

    The primary database user needs to be able to create and drop tables,
    synonyms, and constraints within the "test_schema" user.   For this
    to work fully, including that the user has the "REFERENCES" role
    in a remote schema for tables not yet defined (REFERENCES is per-table),
    it is required that the test user be present in the "DBA" role:

        grant dba to scott;

    MSSQL: Tests that involve multiple connections require Snapshot Isolation
    ability implemented on the test database in order to prevent deadlocks that
    will occur with record locking isolation. This feature is only available
    with MSSQL 2005 and greater. You must enable snapshot isolation at the
    database level and set the default cursor isolation with two SQL commands:

     ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON

     ALTER DATABASE MyDatabase SET READ_COMMITTED_SNAPSHOT ON

Docker Configurations
---------------------

The SQLAlchemy test suite can run against databases running in Docker
containers.  This ensures that they are empty and that their configuration
is not influenced by any local usage.

The following configurations are just examples that developers can use to
quickly set up a local environment for SQLAlchemy development. They are **NOT**
intended for production use!

**PostgreSQL configuration**::

    # create the container with the proper configuration for sqlalchemy
    docker run --rm -e POSTGRES_USER='scott' -e POSTGRES_PASSWORD='tiger' -e POSTGRES_DB='test' -p 127.0.0.1:5432:5432 -d --name postgres postgres

    # configure the database
    sleep 10
    docker exec -ti postgres psql -U scott -c 'CREATE SCHEMA test_schema; CREATE SCHEMA test_schema_2;' test
    # this last command is optional
    docker exec -ti postgres sed -i 's/#max_prepared_transactions = 0/max_prepared_transactions = 10/g' /var/lib/postgresql/data/postgresql.conf

    # to stop the container; since it was started with --rm, it will also be removed
    docker stop postgres

**MySQL configuration**::

    # create the container with the proper configuration for sqlalchemy
    docker run --rm -e MYSQL_USER='scott' -e MYSQL_PASSWORD='tiger' -e MYSQL_DATABASE='test' -e MYSQL_ROOT_PASSWORD='password' -p 127.0.0.1:3306:3306 -d --name mysql mysql --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci

    # configure the database
    sleep 20
    docker exec -ti mysql mysql -u root -ppassword -w -e "CREATE DATABASE test_schema CHARSET utf8mb4; GRANT ALL ON test_schema.* TO scott;"

    # to stop the container; since it was started with --rm, it will also be removed
    docker stop mysql

**MariaDB configuration**::

    # create the container with the proper configuration for sqlalchemy
    docker run --rm -e MARIADB_USER='scott' -e MARIADB_PASSWORD='tiger' -e MARIADB_DATABASE='test' -e MARIADB_ROOT_PASSWORD='password' -p 127.0.0.1:3306:3306 -d --name mariadb mariadb --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci

    # configure the database
    sleep 20
    docker exec -ti mariadb mysql -u root -ppassword -w -e "CREATE DATABASE test_schema CHARSET utf8mb4; GRANT ALL ON test_schema.* TO scott;"

    # to stop the container; since it was started with --rm, it will also be removed
    docker stop mariadb

**MSSQL configuration**::

    # create the container with the proper configuration for sqlalchemy
    # it will use the Developer version
    docker run --rm -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -p 127.0.0.1:1433:1433 -d --name mssql mcr.microsoft.com/mssql/server

    # configure the database
    sleep 20
    docker exec -it mssql /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'yourStrong(!)Password' -Q "sp_configure 'contained database authentication', 1; RECONFIGURE; CREATE DATABASE test CONTAINMENT = PARTIAL; ALTER DATABASE test SET ALLOW_SNAPSHOT_ISOLATION ON; ALTER DATABASE test SET READ_COMMITTED_SNAPSHOT ON; CREATE LOGIN scott WITH PASSWORD = 'tiger^5HHH'; ALTER SERVER ROLE sysadmin ADD MEMBER scott;"
    docker exec -it mssql /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'yourStrong(!)Password' -d test -Q "CREATE SCHEMA test_schema"
    docker exec -it mssql /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'yourStrong(!)Password' -d test -Q "CREATE SCHEMA test_schema_2"

    # to stop the container; since it was started with --rm, it will also be removed
    docker stop mssql

NOTE: with this configuration the URL to use is not the default one configured
in setup.cfg, but ``mssql+pymssql://scott:tiger^5HHH@127.0.0.1:1433/test``.  It
can be used with pytest by using ``--db docker_mssql``.

CONFIGURING LOGGING
-------------------

SQLAlchemy logs its activity and debugging through Python's logging package.
Any log target can be directed to the console with command line options, such
as::

    $ ./pytest test/orm/test_unitofwork.py -s \
      --log-debug=sqlalchemy.pool --log-info=sqlalchemy.engine

Above, we add the pytest "-s" flag so that standard out is not suppressed.


DEVELOPING AND TESTING NEW DIALECTS
-----------------------------------

See the file README.dialects.rst for details on dialects.