This API will require much less consuming code to change to accommodate
the removal of automatic type casting from Arel. As long as the
predicates are constructed using the `arel_table` off of an AR subclass,
there will be no changes that need to happen.
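For illustration, a minimal sketch (using a hypothetical `Post` model) of the kind of predicate that keeps working unchanged:

```
# arel_table[:title] carries the attribute's type information, so no
# implicit casting by Arel is needed
Post.where(Post.arel_table[:title].eq("Hello"))
```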
We only support classes which provide a no-args constructor to use as a
default value. We can provide a more helpful error message if we catch
this when `serialize` is called, rather than letting it error when you
try to assign the attribute.
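For example (a hedged sketch; the model and attribute names are hypothetical):

```
class User < ActiveRecord::Base
  # Hash provides a no-args constructor, so it can be used as the default value
  serialize :preferences, Hash
end

class Money
  def initialize(amount)
    @amount = amount
  end
end

class Account < ActiveRecord::Base
  # Money.new requires an argument, so this now raises a helpful error
  # when `serialize` is called, rather than later when the attribute is assigned
  serialize :balance, Money
end
```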
Fixes #18224
Most of the documentation very closely mirrors the matching
docs from `SchemaStatements`. I reduced duplicated copy and
added links to the underlying methods for the user to follow.
Part of the larger refactoring to remove type casting from Arel. We can
inform it that we already have the right type by wrapping the value in
an `Arel::Nodes::Quoted`. This commit can be reverted when we have
removed type casting from Arel in Rails 5.1.
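A minimal sketch of the wrapping described above (table and value are arbitrary):

```
table = Arel::Table.new(:posts)
# Quoted tells Arel the value is already of the right type, so it is not cast again
table[:id].eq(Arel::Nodes::Quoted.new(5))
```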
There are several valid cases where right now we can't determine the
association's class in a call to `where`. In these cases, we can fall
back to casting by looking up the column from the connection adapter
(which is what happens right now when we fall through to Arel)
This is ugly, and since we're trying to separate the concept of a type
from a column, I'd like to remove it in the future. The problem
basically comes down to this:
    Liquid.joins(molecules: :electrons)
          .where("molecules.name" => "something", "electrons.name" => "something")

The hash in this case will turn into:

    {
      molecules: { name: "something" },
      electrons: { name: "something" },
    }

What we actually need is:

    {
      molecules: {
        name: "something",
        electrons: { name: "something" },
      },
    }
/cc @mrgilman
This code could use some much heavier refactoring. It looks like
`build_relation` duplicates most of the logic of `Relation#where` and
`PredicateBuilder` with regards to handling associations and attribute
aliases.
Part of the larger refactoring to remove type casting from Arel. Since
we've already cast the value a few lines above, we don't need to re-cast
it later. We can inform Arel of this by wrapping it in an
`Arel::Nodes::Quoted`, which will no longer be required in Rails 5.1
We will always have the correct type for this query, so no casting is
needed. We inform Arel that we already have the right type by wrapping
it in an `Arel::Nodes::Quoted` (which we will no longer need to do in
Rails 5.1)
This will allow eager type casting to take place as needed. There
doesn't seem to be any particular reason that the `in` statement was
forced for single values, and the commit message where it was introduced
gives no context.
See
d90b4e2615
A custom object is required for this, as you cannot build a range
object out of `Arel::Nodes::Quoted` objects. Depends on the changes
introduced in
cf03bd45e3
/cc @mrgilman
As part of the larger refactoring to remove type casting from Arel, we
need to do the casting of values eagerly. The predicate builder is the
closest place that knows about the Active Record class, and can
therefore have the type information.
/cc @mrgilman
[Sean Griffin & Melanie Gilman]
This class cares far too much about the internals of other parts of
Active Record. This is an attempt to break out a meaningful object which
represents the needs of the predicate builder. I'm not fully satisfied
with the name, but the general concept is an object which represents a
table, the associations to/from that table, and the types associated
with it. Many of these exist at the `ActiveRecord::Base` class level,
not as properties of the table itself, hence the need for another
object. Currently it provides these by holding a reference to the class,
but that will likely change in the future. This allows the predicate
builder to remain wholly concerned with building predicates.
/cc @mrgilman
I'm attempting to remove `klass` as a dependency of the predicate
builder, in favor of an object that better represents what we're using
it for. The only part of this which doesn't fit nicely into that picture
is the check for an association being polymorphic. Since I'm not yet
sure what that is going to look like, I've moved this logic into another
class in an attempt to separate things that will change from things that
won't.
This reduces the number of places which will need to care about single
value or range specific logic as we introduce type casting. The array
handler is only responsible for producing `in` statements.
/cc @mrgilman
[Sean Griffin & Melanie Gilman]
This will allow us to pass the predicate builder into the constructor of
these handlers. The procs had to be changed to objects, because the
`PredicateBuilder` needs to be marshalable. If we ever decide to make
`register_handler` part of the public API, we should come up with a
better solution which allows procs.
/cc @mrgilman
[Sean Griffin & Melanie Gilman]
Construction of relations can be a hotspot, so we don't want to create one
of these in the constructor. This also allows us to do more expensive
things in the predicate builder's constructor, since it's created once
per AR::Base subclass.
I think we should deprecate this behavior and just error if you tell us
to do a case insensitive comparison for types which are not case
sensitive. Partially reverts 35592307. Fixes #18195
The way Active Record query methods handle numeric values is a special case, and is not part of Rails's standard definition of present. This update attempts to make this clearer in the docs, so that people don't expect Object#present? to return false if used on a number that is zero.
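For example, with a hypothetical numeric `upvotes` attribute:

```
post = Post.new(upvotes: 0)
post.upvotes?  # => false, the attribute query method treats zero as not present
0.present?     # => true, Object#present? only follows the normal blank? rules
```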
We were ignoring the `default_value?` escape clause in the serialized
type, which caused the default value to always be treated as changed.
Fixes #18169
The code for `TableDefinition#references` and
`SchemaStatements#add_reference` were almost identical both
structurally, and in terms of domain knowledge. This removes that
duplication into a common class, using the `Table` API as the expected
interface of its collaborator.
The code in `ConnectionPool#release` assumed that a single thread only
ever holds a single connection, and thus that releasing a connection
only requires the owning thread_id.
There is a trivial counterexample to this assumption: code that checks
out additional connections from the pool in the same thread. For
instance:
    connection_1 = ActiveRecord::Base.connection
    connection_2 = ActiveRecord::Base.connection_pool.checkout
    ActiveRecord::Base.connection_pool.checkin(connection_2)
    connection_3 = ActiveRecord::Base.connection
At this point, connection_1 has been removed from the
`@reserved_connections` hash, causing a NEW connection to be returned as
connection_3 and the loss of any tracking info on connection_1. As long
as the thread in this example lives, connection_1 will be inaccessible
and un-reapable. If this block of code runs more times than the size of
the connection pool in a single thread, every subsequent connection
attempt will timeout, as all of the available connections have been
leaked.
Reverts parts of 9e457a8654 and
essentially all of 4367d2f05c
I just wasted an absurd amount of time trying to figure out why my model
wasn't being deleted even though I was setting `_destroy` to true like
the instructions said. Making the documentation a little bit clearer so
that someone like me doesn't waste their time in the future.
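For reference, a sketch of the intended usage (hypothetical models); `_destroy` only takes effect when `allow_destroy: true` is set:

```
class Post < ActiveRecord::Base
  has_many :comments
  accepts_nested_attributes_for :comments, allow_destroy: true
end

# The existing comment is destroyed when the post is saved
post.update(comments_attributes: [{ id: comment.id, _destroy: "1" }])
```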
Update Active Record's attribute query methods documentation to clarify that whether an attribute is present is based on Object#present?. This gives people a place to go see what the exact definition of presence is. [ci skip]
This bug occurs when an attribute of an ActiveRecord model is an
ActiveRecord::Type::Integer type or an ActiveRecord::Type::Decimal type (or any
other type that includes the ActiveRecord::Type::Numeric module). When the value
of the attribute is negative and is set to the same negative value, it is marked
as changed.
Take the following example of a Person model with the integer attribute age:
    class Person < ActiveRecord::Base
      # age :integer(4)
    end
The following will produce the error:
    person = Person.new(age: -1)
    person.age = -1
    person.changes
    => { "age" => [-1, -1] }
    person.age_changed?
    => true
The problematic line is here:
    module ActiveRecord
      module Type
        module Numeric
          ...
          def non_numeric_string?(value)
            # 'wibble'.to_i will give zero, we want to make sure
            # that we aren't marking int zero to string zero as
            # changed.
            value.to_s !~ /\A\d+\.?\d*\z/
          end
        end
      end
    end
The regex match doesn't accept numbers with a leading '-'.
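One possible fix (a sketch, not necessarily the exact patch) is to allow an optional leading sign:

```
def non_numeric_string?(value)
  # accept an optional leading '-' so that -1 round-trips without being
  # flagged as a non-numeric string
  value.to_s !~ /\A-?\d+\.?\d*\z/
end
```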
`ActiveRecord::Base#[]` has overhead that was introduced in 4.2. The
`foo["id"]` working with PKs other than ID isn't really a case that we
want to support publicly, but deprecating it was painful enough that we
avoid it. `_read_attribute` was introduced as the faster alternative for
use internally. By using that, we can save a lot of overhead. We also
save some overhead by reading the attribute one fewer times in
`stale_state`.
Fixes #18151
If there is a method defined such as `find_and_do_stuff(id)`, which then
gets called on an association, we will perform statement caching and the
parent ID will not change on subsequent calls.
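A sketch of the scenario (class and method names are hypothetical):

```
class Comment < ActiveRecord::Base
  def self.find_and_do_stuff(id)
    find(id)
  end
end

post_a.comments.find_and_do_stuff(1)
# Before the fix, the cached statement kept reusing post_a's id for the
# association scope on subsequent calls:
post_b.comments.find_and_do_stuff(2)
```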
Fixes #18117
Calling `changed_attributes` will ultimately check if every mutable
attribute has changed in place. Since this gets called whenever an
attribute is assigned, it's extremely slow. Instead, we can avoid this
calculation until we actually need it.
Fixes #18029
Changes `rails g model Post user:references` from

    def change
      create_table :posts do |t|
        t.references :user, index: true
      end
      add_foreign_key :posts, :users
    end

to

    def change
      create_table :posts do |t|
        t.references :user, index: true, foreign_key: true
      end
    end

Changes `rails g migration add_user_to_posts user:references` from

    def change
      add_reference :posts, :users, index: true
      add_foreign_key :posts, :users
    end

to

    def change
      add_reference :posts, :users, index: true, foreign_key: true
    end
This has the same comments as 9af90ffa00ba35bdee888e3e1ab775ba0bdbe72c,
however it affects the `add_reference` method, and `t.references` in the
context of a `change_table` block.
There is a lot of duplication of code between creating and updating
tables. We should re-evaluate the structure of this code from a high
level so changes like this don't need to be made in two places. (Note to
self)
Rather than having to do:

    create_table :posts do |t|
      t.references :user
    end
    add_foreign_key :posts, :users

You can instead do:

    create_table :posts do |t|
      t.references :user, foreign_key: true
    end

Similar to the `index` option, you can also pass a hash. This will be
passed as the options to `add_foreign_key`. e.g.:

    create_table :posts do |t|
      t.references :user, foreign_key: { primary_key: :other_id }
    end

is equivalent to

    create_table :posts do |t|
      t.references :user
    end
    add_foreign_key :posts, :users, primary_key: :other_id
Without this option, if the test is interrupted in a way that the teardown
block fails to run, the tests will fail to run until the table is removed
manually.
PG doesn't register its types using the `int(4)` format that others do.
As such, if we alias `int8` to the other integer types, the range
information is lost. This is fixed by simply registering it separately.
The other option (which I specifically chose to avoid) is to pass the
information of the original type that was being aliased as an argument.
I'd rather avoid that, since an alias should truly be treated the same.
If we need different behavior for a different type, we should explicitly
register it with that, and not have a conditional based on aliasing.
Fixes #18144
[Sean Griffin & ysbaddaden]
1. Specify that you need to create the test databases, and that no special
Rails command needs to be run to do that.
2. Although the underscore style of `rake test_mysql` works, make the
documentation of running the tests in RUNNING_UNIT_TESTS.rdoc
consistent with the "Contributing..." guide.
3. Promote "Testing Active Record" to not be a subsection of
"Running a Single Test," since it doesn't make sense as a subsection
of that.
- Right now, there is no method to update multiple records with
validations and callbacks.
- Changed the behavior of the existing `update` method so that when an `id`
attribute is not given and the method is called on a `Relation`
object, it will execute an update for every record of the `Relation` and
will run validations and callbacks for every record (see the sketch after
this list).
- Added a test case validating that the callbacks run when `update` is
called on a `Relation`.
- Changed the test_create_columns_not_equal_attributes test in
persistence_test to include the author_name column on the topics table, as
it is used in the before_update callback.
- This change introduces performance issues when a large number of
records are to be updated, because it runs an UPDATE query for every
record of the result. The `update_all` method can be used in that case
if callbacks are not required, because it will only run a single UPDATE
for all the records.
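A sketch of the difference, assuming a hypothetical `Comment` model with callbacks:

```
# Runs validations and callbacks on every matching record, issuing one
# UPDATE per record
Comment.where(post_id: 1).update(flagged: true)

# Issues a single UPDATE and skips validations and callbacks
Comment.where(post_id: 1).update_all(flagged: true)
```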
I find that `Specifies the starting point for the batch processing.`
does not give enough information for me to understand what this
parameter actually does.
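For example (an illustrative snippet, not the doc wording):

```
# Resume batch processing at records whose primary key is >= 10_000
Person.find_each(start: 10_000, batch_size: 1_000) do |person|
  person.do_something   # hypothetical work per record
end
```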
Closes #17945
`db:test:prepare` still purges the database to always keep the test
database in a consistent state.
This patch introduces new problems with `db:schema:load`. Prior
to the introduction of foreign keys, we could run this file against
a non-empty database. Since every `create_table` contained the
`force: true` option, this would recreate tables when loading the schema.
However, with foreign keys in place, `force: true` won't work anymore and
the task will crash.
/cc @schneems
Apparently PG does not validate against RFC 4122. The intent of the original
patch is just to protect against PG errors (which potentially breaks txns, etc)
because of bad user input, so we shouldn't try any harder than PG itself.
Closes #17931
When using DATABASE_URL to configure ActiveRecord, :reaping_frequency
does not get converted from a string to a numeric value. This value is
eventually passed to 'sleep' and must be numeric to avoid exceptions.
This commit converts :reaping_frequency to a float when present.
In the case of serialized columns, we would expect the unserialized
value as input, not the serialized value. The original issue which made
this distinction, #14163, introduced a bug. If you passed serialized
input to the method, it would double serialize when it was sent to the
database. You would see the wrong input upon reloading, or get an error
if you had a specific type on the serialized column.
To put it another way, `update_column` is a special case of
`update_all`, which would take `['a']` and not `['a'].to_yaml`, but you
would not pass data from `params` to it.
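For example, with a hypothetical serialized column:

```
class User < ActiveRecord::Base
  serialize :tags, Array
end

user.update_column(:tags, ["admin"])          # pass the unserialized value
user.update_column(:tags, ["admin"].to_yaml)  # previously this was serialized
                                              # a second time on the way in
```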
Fixes #18037
Because we're only using the `connection`, passing the entire tracker
isn't necessary.
Eventually only the `connection` will be passed to `add_constraints`
with later refactoring, but currently that's not possible because of
the `construct_tables` method.
PostgreSQL, for example, allows infinity as a valid value for datetime
columns. The PG type has explicit handling for that case. However, time
zone conversion will end up trampling that handling. Unfortunately, we
can't call super and then convert time zones.
However, if we get back nil from `.in_time_zone`, it's something we
didn't expect, so we can let the superclass handle it.
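A sketch of the behavior being preserved (assuming a PostgreSQL-backed `Event` model with a time-zone-aware datetime column):

```
event = Event.create!(starts_at: Float::INFINITY)
event.reload.starts_at  # => Infinity, rather than being lost in time zone conversion
```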
Fixes #17971
To make it possible to use a custom column name to save/read the polymorphic
associated type in a has_many or has_one polymorphic association, users
can now use the option :foreign_type to specify the column in which the
associated object's type will be saved.
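A sketch of what the option enables (model and column names are hypothetical):

```
class Post < ActiveRecord::Base
  # store the owner's type in "attachable_kind" rather than "attachable_type"
  has_many :attachments, as: :attachable, foreign_type: :attachable_kind
end

class Attachment < ActiveRecord::Base
  belongs_to :attachable, polymorphic: true, foreign_type: :attachable_kind
end
```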
`freeze` will ultimately end up freezing the `AttributeSet`, which in
turn freezes its `@attributes` hash. However, we actually insert a
special object to lazily instantiate the values of the hash on demand.
When it does need to actually instantiate all of them for iteration (the
only case is `ActiveRecord::Base#attributes`, which calls
`AttributeSet#to_h`), it will set an instance variable as a performance
optimization.
Since it's just an optimization for subsequent calls, and that method
being called at all is a very uncommon case, we can just leave the ivar
alone if we're frozen, as opposed to coming up with some overly
complicated mechanism for freezing which allows us to continue to modify
ourselves.
Fixes #17960
The user is able to pass PG string literals in 4.1, and have it
converted to an array. This is also possible in 4.2, but it would remain
in string form until saving and reloading, which breaks our
`attr = save.reload.attr` contract. I think we should deprecate this in
5.0, and only allow array input from user sources. However, this
currently constitutes a breaking change to public API that did not go
through a deprecation cycle.
The validation added in 5a3dc8092d will
reject values for the `:on` option for after_commit and after_rollback
callbacks that are string values like `"create"`.
However, the error message says ":on conditions for after_commit and
after_rollback callbacks have to be one of create,destroy,update". That
looks like a string value *would* be valid.
This commit changes the error message to say ":on conditions for
after_commit and after_rollback callbacks have to be one of [:create,
:destroy, :update]", making it clearer that symbols are required.
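For reference, an illustrative snippet (hypothetical model) of the accepted and rejected forms:

```
class Order < ActiveRecord::Base
  after_commit :notify, on: [:create, :update]   # symbols are accepted
  # after_commit :notify, on: "create"           # a string is rejected with the new message
end
```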
If the tests are interrupted and the teardown block doesn't run, the
developer needs to delete these manually in order to be able to run the
tests again.
The type registration was simply looking for the OID, and eagerly
fetching/constructing the sub type when it was registered. However,
numeric types have additional parameters which are extracted from the
actual SQL string of the type during lookup, and can have their behavior
change based on the result.
We simply need to use the block form of registration, and look up the
subtype lazily instead.
Fixes #17935
Active Record defines `attribute_method_suffix :?`. That suffix will
match any predicate method when the lookup occurs in Active Model. This
will make it incorrectly decide that `id_changed?` should not exist,
because it attempts to determine if the attribute `id_changed` is
present, rather than `id` with the `_changed?` suffix. Instead, we will
look for any correct match.
All of the behavior :environment was giving (that db:schema:load needed)
was provided as well with :load_config.
This will address an issue introduced in
https://github.com/rails/rails/pull/15394. The fact that db:schema:load
now drops and creates the database causes the Octopus gem to have [an
issue](https://github.com/tchandy/octopus/issues/273) during the drop
step for the test database (which wasn't happening in db:schema:load
before). The error looks like:
    ActiveRecord::StatementInvalid: PG::ObjectInUse: ERROR: cannot drop the currently open database
    : DROP DATABASE IF EXISTS "app_test"
Because of the timing, this issue is present in master, 4-2-*, and
4.1.8.
A note to forlorn developers who might see this: "Additionally" in a
commit message means you should have a separate commit, with a separate
justification for changes. Small commits with big messages are your
friends.
This `# :nodoc:` had the effect of hiding every method that follows.
This meant that the API page for `ActiveRecord::Core` only contained
`configurations` and none of the following methods.
Furthermore this `# :nodoc:` had no effect on `maintain_test_schema`.
Those `mattr_accessor` inside the `included` block are not picked up
by rdoc.
/cc @zzak
When running the following migration:

    change_table(:table_name) { |t| t.timestamps }

The following error was produced:

    wrong number of arguments (2 for 1) .... /connection_adapters/abstract/schema_statements.rb:851:in `remove_timestamps'

This is due to `arguments` containing an empty hash as its second
argument.
[ci skip]
This is due to the fact that `.delete` is directly translated to SQL.
It tries to follow the same rules as `.delete_all` which is not able
to verify that records are `#readonly?`.
This ensures that we're handling all forms of nested tables the same way.
We're aware that the `convert_dot_notation_to_hash` method will cause a
performance hit, and we intend to come back to it once we've refactored some of
the surrounding code.
[Melissa Xie & Melanie Gilman]
The detection of in-place changes caused a weird unexpected issue with
numericality validations. That validator (out of necessity) works on the
`_before_type_cast` version of the attribute, since on an `:integer`
type column, a non-numeric string would type cast to 0.
However, strings are mutable, and we changed strings to ensure that the
post type cast version of the attribute was a different instance than
the before type cast version (so the mutation detection can work
properly).
Even though strings are the only mutable type for which a numericality
validation makes sense, special casing strings would feel like a strange
change to make here. Instead, we can make the assumption that for all
mutable types, we should work on the post-type-cast version of the
attribute, since all cases which would return 0 for non-numeric strings
are immutable.
Fixes #17852
We never actually make use of it on the table, since we're constructing
the select manager manually. It looks like if we ever actually were
grabbing it from the table, we were grossly misusing it, since it's meant
to vary by AR class.
Its existence on `Arel::Table` appears to be purely for convenience
methods that are never used outside of tests. However, in production
code it just complicates construction of the tables on the rails side,
and the plan is to remove it from `Arel::Table` entirely. I'm not
convinced it needs to live on `SelectManager`, etc either.
This fixes a regression where preloading associations would throw an
exception if one of the associations in the preloading hash didn't
exist for one record.
Fixes #16070
Since 3e30c5d, it started ignoring the given error message. This commit
changes the behavior of AR::RecordNotSaved#initialize so that it no
longer loses the given error message.
If you run a generator such as:
```
rails generate model accounts supplier:references
```
The resulting migration will now add the corresponding foreign key
constraint unless the reference was specified to be polymorphic.
The records weren't being replaced since equality in Active Record is
defined in terms of `id` only. It is reasonable to expect that the
references would be replaced in memory, even if no queries are actually
executed. This change did not appear to affect any other parts of the
code base. I chose not to execute callbacks since we're not actually
modifying the association in a way that will be persisted.
Fixes #17730
This reverts deprecations added in #13528.
The task is brought back for two reasons:
1. Give plugins a way to hook into the test database initialization process
2. Give the user a way to force a test database synchronization
While `test:prepare` is still a dependency of every test task, `db:test:prepare`
no longer hooks into it. This means that `test:prepare` runs before the schema
is synchronized. Plugins which insert data can now hook into `db:test:prepare`.
The automatic schema maintenance can't detect when a migration is rolled back,
modified, and reapplied. In this case the user has to fall back to `db:test:prepare`
to force the synchronization to happen.
[Rafael Mendonça França & Yves Senn]
This require caused the `active_record.set_configs` initializer to
run immediately, before `config/initializers`. This means that setting any
configuration on `Rails.application.config.active_record` inside of
an initializer had no effect when Rails was loaded through `rake`.
Introduced by #6518
/cc @rafaelfranca
This refactoring reduces the number of conditionals needed to build
`aliased_table_for` and removes `aliased_name_for` because it's no
longer necessary.
`aliased_name_for` was also used in `JoinDependency#initialize` so
that was replaced with `aliased_table_for` as well.
This allows us to abstract the migration from the type that is actually
used by Rails. For example, ":string" may be a varchar or something,
but the framework does that translation, and the app shouldn't need to
know.
It now contains a carefully formulated reference to the "current relation" which might help clarify that the receiver will generate its own scope, escaping the need for explicitly referencing `default_scope`, which is, after all, just another way of specifying a scope and nothing special.