There's no point in indexing the file if it hasn't been synced to
disk. Any attempts to index the file will fail if the file still has an
open data block.
If MaxScale is started without the appropriate permissions to the paths
pointed to by the default values, the debug assertion fails even though
the parameter is valid, just not usable.
The random number generator can be initialized when MaxScale's other
systems are being initialized. This removes the need to initialize it
when the function is used for the first time.
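
A minimal sketch of the approach, assuming a module-level generator and
an explicit init function called from the startup path; the names
rng_init and rng_next are hypothetical, not MaxScale's actual API:

    #include <cstdint>
    #include <random>

    // Hypothetical module-level state, seeded once during startup
    // instead of lazily on the first call.
    static std::mt19937 rng_state;

    // Called from the startup path together with the other subsystem
    // initialization calls.
    void rng_init()
    {
        std::random_device seed;
        rng_state.seed(seed());
    }

    // Callers no longer need a "first use" check; the generator is
    // already seeded by the time any request is handled.
    uint32_t rng_next()
    {
        return rng_state();
    }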
NullFilter is a filter module that does nothing except report
capabilities as defined in the configuration file. Its purpose
is only to make it simple to benchmark the performance impact
of various routing capabilities.
Note that since getCapabilities() currently does *not* take an
instance pointer as a parameter, all NullFilter instances will
report the same capabilities, namely the ones specified for the
last filter to have been loaded.
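
A rough sketch of why that happens, assuming the capabilities parsed
from the configuration are stored in a module-wide variable; the names
and signatures below are illustrative, not the exact filter API:

    #include <cstdint>

    // Module-wide value, shared by every NullFilter instance.
    static uint64_t null_capabilities = 0;

    // Hypothetical instance creation: the capabilities parsed from the
    // configuration overwrite the module-wide value, so the last
    // loaded filter "wins".
    void* createInstance(uint64_t capabilities_from_config)
    {
        null_capabilities = capabilities_from_config;
        return new int(0); // placeholder instance object
    }

    // No instance pointer, so only the module-wide value can be
    // returned.
    uint64_t getCapabilities()
    {
        return null_capabilities;
    }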
Removed unused spinlocks from DCBs, sessions and the MySQL protocol
structs. They were used in a context where only one thread has access to
the structure.
Removed unused member variables from DCBs.
The debug assertion assumes that the table definition is always in the
binlogs. If a binlog row event without a table definition is read, debug
builds would crash even though the situation is acceptable and expected.
In a configuration with multiple services, one with connection_timeout and
others without it, the connections to the services without
connection_timeout would get closed immediately due to an integer overflow.
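
An illustration of the failure mode only, not the actual code: assume
the idle check computes a "time left" value with unsigned arithmetic
and that a service without connection_timeout keeps the default
timeout of zero.

    #include <cstdint>
    #include <iostream>

    int main()
    {
        uint64_t timeout = 0; // service without connection_timeout
        uint64_t idle = 1;    // one second of idle time

        // With an unsigned zero timeout the subtraction wraps around
        // instead of meaning "never time out", so the connection looks
        // long expired and is closed at once.
        uint64_t time_left = timeout - idle;
        std::cout << time_left << '\n'; // 18446744073709551615
    }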
The backtick was copied to the field name and converted to an underscore
when the name was transformed into a valid Avro identifier. This caused
one extra character to appear in the field name in the Avro schema files.
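
A small sketch of the corrected transformation, assuming the Avro rule
of letters, digits and underscores: stripping the backtick instead of
mapping it avoids the extra character. The function name is
illustrative:

    #include <cctype>
    #include <string>

    std::string make_avro_identifier(const std::string& name)
    {
        std::string out;

        for (char c : name)
        {
            if (c == '`')
            {
                continue; // quote character, not part of the field name
            }

            // Map any other invalid character to an underscore.
            out += (std::isalnum(static_cast<unsigned char>(c)) || c == '_') ? c : '_';
        }

        return out;
    }

    // make_avro_identifier("`price`") == "price" instead of "_price_"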
As the DCBs are "owned" by threads and are handled without locks, they
must not be accessed from other threads. Because of this, the show
persistent command for maxadmin has to be changed to show only the size of
the pool.
For instance, if bob is returned an error because he does not have
the required grants and that error were cached, alice would receive
bob's error reply even though she has the required grants.
- snake_case
- member variables prefixed with m_
- static member variables prefixed with s_
- where prefixes are used (z,p,s), the following character
is capitalized
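
A short illustration of names that follow the convention; the meanings
assumed here for the z, p and s prefixes (zero-terminated string,
pointer, std::string) are examples only:

    #include <string>

    struct Config {};

    // Free functions and local variables use snake_case.
    void handle_client_request();

    class NullFilterSession
    {
    public:
        static int s_session_count; // static member variable: s_ prefix

    private:
        int         m_row_count;    // member variable: m_ prefix, otherwise snake_case
        char*       m_zName;        // 'z' prefix (assumed zero-terminated string), next char capitalized
        Config*     m_pConfig;      // 'p' prefix (assumed pointer), next char capitalized
        std::string m_sUser;        // 's' prefix (assumed std::string), next char capitalized
    };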
As the cdc_kafka_producer script is an example, it should flush the
producer after every new record. This should make it easier to see that
events from MaxScale are sent to Kafka.
The firewall filter should allow COM_PING and other similar commands to
pass through as they are mainly used to check the status of the backend
server or to display statistics. COM_PROCESS_KILL is the exception, as
it affects the state of the backend server. This is better controlled with
permissions in the server than in the firewall filter.
Commands that require special grants aren't allowed to pass, as they are
mainly for maintenance purposes and such operations should not be done
through the firewall.
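
A condensed sketch of the intended behavior, assuming the decision is
made on the MySQL command byte; the helper name and the exact set of
commands that bypass the rules are illustrative:

    #include <cstdint>

    // MySQL client/server protocol command bytes.
    constexpr uint8_t COM_STATISTICS   = 0x09;
    constexpr uint8_t COM_PROCESS_INFO = 0x0a;
    constexpr uint8_t COM_PROCESS_KILL = 0x0c;
    constexpr uint8_t COM_PING         = 0x0e;

    // Status and statistics commands pass through; COM_PROCESS_KILL
    // changes backend state and stays subject to the firewall rules.
    bool bypasses_firewall(uint8_t command)
    {
        switch (command)
        {
        case COM_PING:
        case COM_STATISTICS:
        case COM_PROCESS_INFO:
            return true;

        case COM_PROCESS_KILL:
        default:
            return false;
        }
    }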
There's no need to process the JSON twice, as the Kafka producer is
expected to be used with the Python CDC client, which already splits the
JSON records with newlines.