Django save model error

Model instance reference¶

This document describes the details of the Model API. It builds on the
material presented in the model and database query guides, so you’ll
probably want to read and understand those documents before reading this
one.

Throughout this reference we’ll use the example blog models presented in the database query guide.

Creating objects¶

To create a new instance of a model, instantiate it like any other Python
class:

class Model(**kwargs)

The keyword arguments are the names of the fields you’ve defined on your model.
Note that instantiating a model in no way touches your database; for that, you
need to save().

Note

You may be tempted to customize the model by overriding the __init__
method. If you do so, however, take care not to change the calling
signature as any change may prevent the model instance from being saved.
Rather than overriding __init__, try using one of these approaches:

  1. Add a classmethod on the model class:

    from django.db import models
    
    class Book(models.Model):
        title = models.CharField(max_length=100)
    
        @classmethod
        def create(cls, title):
            book = cls(title=title)
            # do something with the book
            return book
    
    book = Book.create("Pride and Prejudice")
    
  2. Add a method on a custom manager (usually preferred):

    class BookManager(models.Manager):
        def create_book(self, title):
            book = self.create(title=title)
            # do something with the book
            return book
    
    class Book(models.Model):
        title = models.CharField(max_length=100)
    
        objects = BookManager()
    
    book = Book.objects.create_book("Pride and Prejudice")
    

Customizing model loading¶

classmethod Model.from_db(db, field_names, values)

The from_db() method can be used to customize model instance creation
when loading from the database.

The db argument contains the database alias for the database the model
is loaded from, field_names contains the names of all loaded fields, and
values contains the loaded values for each field in field_names. The
field_names are in the same order as the values. If all of the model’s
fields are present, then values are guaranteed to be in the order
__init__() expects them. That is, the instance can be created by
cls(*values). If any fields are deferred, they won’t appear in
field_names. In that case, assign a value of django.db.models.DEFERRED
to each of the missing fields.

In addition to creating the new model, the from_db() method must set the
adding and db flags in the new instance’s _state attribute.

Below is an example showing how to record the initial values of fields that
are loaded from the database:

from django.db.models import DEFERRED

@classmethod
def from_db(cls, db, field_names, values):
    # Default implementation of from_db() (subject to change and could
    # be replaced with super()).
    if len(values) != len(cls._meta.concrete_fields):
        values = list(values)
        values.reverse()
        values = [
            values.pop() if f.attname in field_names else DEFERRED
            for f in cls._meta.concrete_fields
        ]
    instance = cls(*values)
    instance._state.adding = False
    instance._state.db = db
    # customization to store the original field values on the instance
    instance._loaded_values = dict(
        zip(field_names, (value for value in values if value is not DEFERRED))
    )
    return instance

def save(self, *args, **kwargs):
    # Check how the current values differ from ._loaded_values. For example,
    # prevent changing the creator_id of the model. (This example doesn't
    # support cases where 'creator_id' is deferred).
    if not self._state.adding and (
            self.creator_id != self._loaded_values['creator_id']):
        raise ValueError("Updating the value of creator isn't allowed")
    super().save(*args, **kwargs)

The example above shows a full from_db() implementation to clarify how that
is done. In this case it would be possible to use a super() call in the
from_db() method.

Refreshing objects from database¶

If you delete a field from a model instance, accessing it again reloads the
value from the database:

>>> obj = MyModel.objects.first()
>>> del obj.field
>>> obj.field  # Loads the field from the database

Model.refresh_from_db(using=None, fields=None)

If you need to reload a model’s values from the database, you can use the
refresh_from_db() method. When this method is called without arguments the
following is done:

  1. All non-deferred fields of the model are updated to the values currently
    present in the database.
  2. Any cached relations are cleared from the reloaded instance.

Only fields of the model are reloaded from the database. Other
database-dependent values such as annotations aren’t reloaded. Any
@cached_property attributes
aren’t cleared either.

The reloading happens from the database the instance was loaded from, or from
the default database if the instance wasn’t loaded from the database. The
using argument can be used to force the database used for reloading.

It is possible to force the set of fields to be loaded by using the fields
argument.

For example, to test that an update() call resulted in the expected
update, you could write a test similar to this:

def test_update_result(self):
    obj = MyModel.objects.create(val=1)
    MyModel.objects.filter(pk=obj.pk).update(val=F('val') + 1)
    # At this point obj.val is still 1, but the value in the database
    # was updated to 2. The object's updated value needs to be reloaded
    # from the database.
    obj.refresh_from_db()
    self.assertEqual(obj.val, 2)

Note that when deferred fields are accessed, the loading of the deferred
field’s value happens through this method. Thus it is possible to customize
the way deferred loading happens. The example below shows how one can reload
all of the instance’s fields when a deferred field is reloaded:

class ExampleModel(models.Model):
    def refresh_from_db(self, using=None, fields=None, **kwargs):
        # fields contains the name of the deferred field to be
        # loaded.
        if fields is not None:
            fields = set(fields)
            deferred_fields = self.get_deferred_fields()
            # If any deferred field is going to be loaded
            if fields.intersection(deferred_fields):
                # then load all of them
                fields = fields.union(deferred_fields)
        super().refresh_from_db(using, fields, **kwargs)

Model.get_deferred_fields()¶

A helper method that returns a set containing the attribute names of all those
fields that are currently deferred for this model.

Validating objects¶

There are four steps involved in validating a model:

  1. Validate the model fields — Model.clean_fields()
  2. Validate the model as a whole — Model.clean()
  3. Validate the field uniqueness — Model.validate_unique()
  4. Validate the constraints — Model.validate_constraints()

All four steps are performed when you call a model’s full_clean()
method.

When you use a ModelForm, the call to is_valid() will perform these
validation steps for all the fields that are included on the form. See
the ModelForm documentation for more information. You should only need
to call a model’s full_clean() method if you plan to handle validation
errors yourself, or if you have excluded fields from the ModelForm that
require validation.

Warning

Constraints containing JSONField may not raise
validation errors as key, index, and path transforms have many
database-specific caveats. This may be fully supported later.

You should always check that there are no log messages in the
django.db.models logger like “Got a database error calling check() on
…”, to confirm the constraint was validated properly.

Changed in Django 4.1:

In older versions, constraints were not checked during the model
validation.

Model.full_clean(exclude=None, validate_unique=True, validate_constraints=True)

This method calls Model.clean_fields(), Model.clean(),
Model.validate_unique() (if validate_unique is True), and
Model.validate_constraints() (if validate_constraints is True)
in that order and raises a ValidationError that
has a message_dict attribute containing errors from all four stages.

The optional exclude argument can be used to provide a set of field
names that can be excluded from validation and cleaning.
ModelForm uses this argument to exclude fields that
aren’t present on your form from being validated since any errors raised could
not be corrected by the user.

Note that full_clean() will not be called automatically when you call
your model’s save() method. You’ll need to call it manually
when you want to run one-step model validation for your own manually created
models. For example:

from django.core.exceptions import ValidationError
try:
    article.full_clean()
except ValidationError as e:
    # Do something based on the errors contained in e.message_dict.
    # Display them to a user, or handle them programmatically.
    pass

The first step full_clean() performs is to clean each individual field.

Changed in Django 4.1:

The validate_constraints argument was added.

Changed in Django 4.1:

An exclude value is now converted to a set rather than a list.

Model.clean_fields(exclude=None)

This method will validate all fields on your model. The optional exclude
argument lets you provide a set of field names to exclude from validation.
It will raise a ValidationError if any fields
fail validation.

The second step full_clean() performs is to call Model.clean().
This method should be overridden to perform custom validation on your model.

Model.clean()¶

This method should be used to provide custom model validation, and to modify
attributes on your model if desired. For instance, you could use it to
automatically provide a value for a field, or to do validation that requires
access to more than a single field:

import datetime
from django.core.exceptions import ValidationError
from django.db import models
from django.utils.translation import gettext_lazy as _

class Article(models.Model):
    ...
    def clean(self):
        # Don't allow draft entries to have a pub_date.
        if self.status == 'draft' and self.pub_date is not None:
            raise ValidationError(_('Draft entries may not have a publication date.'))
        # Set the pub_date for published items if it hasn't been set already.
        if self.status == 'published' and self.pub_date is None:
            self.pub_date = datetime.date.today()

Note, however, that like Model.full_clean(), a model’s clean()
method is not invoked when you call your model’s save() method.

In the above example, the ValidationError
exception raised by Model.clean() was instantiated with a string, so it
will be stored in a special error dictionary key,
NON_FIELD_ERRORS. This key is used for errors
that are tied to the entire model instead of to a specific field:

from django.core.exceptions import NON_FIELD_ERRORS, ValidationError
try:
    article.full_clean()
except ValidationError as e:
    non_field_errors = e.message_dict[NON_FIELD_ERRORS]

To assign exceptions to a specific field, instantiate the
ValidationError with a dictionary, where the
keys are the field names. We could update the previous example to assign the
error to the pub_date field:

class Article(models.Model):
    ...
    def clean(self):
        # Don't allow draft entries to have a pub_date.
        if self.status == 'draft' and self.pub_date is not None:
            raise ValidationError({'pub_date': _('Draft entries may not have a publication date.')})
        ...

If you detect errors in multiple fields during Model.clean(), you can also
pass a dictionary mapping field names to errors:

raise ValidationError({
    'title': ValidationError(_('Missing title.'), code='required'),
    'pub_date': ValidationError(_('Invalid date.'), code='invalid'),
})

Then, full_clean() will check unique constraints on your model.

How to raise field-specific validation errors if those fields don’t appear in a ModelForm

You can’t raise validation errors in Model.clean() for fields that
don’t appear in a model form (a form may limit its fields using
Meta.fields or Meta.exclude). Doing so will raise a ValueError
because the validation error won’t be able to be associated with the
excluded field.

To work around this dilemma, instead override Model.clean_fields() as it receives the list of fields
that are excluded from validation. For example:

class Article(models.Model):
    ...
    def clean_fields(self, exclude=None):
        super().clean_fields(exclude=exclude)
        if self.status == 'draft' and self.pub_date is not None:
            if exclude and 'status' in exclude:
                raise ValidationError(
                    _('Draft entries may not have a publication date.')
                )
            else:
                raise ValidationError({
                    'status': _(
                        'Set status to draft if there is not a '
                        'publication date.'
                     ),
                })

Model.validate_unique(exclude=None)

This method is similar to clean_fields(), but validates
uniqueness constraints defined via Field.unique,
Field.unique_for_date, Field.unique_for_month,
Field.unique_for_year, or Meta.unique_together on your model instead of individual
field values. The optional exclude argument allows you to provide a set
of field names to exclude from validation. It will raise a
ValidationError if any fields fail validation.

UniqueConstraints defined in the
Meta.constraints are validated
by Model.validate_constraints().

Note that if you provide an exclude argument to validate_unique(), any
unique_together constraint involving one of
the fields you provided will not be checked.

Finally, full_clean() will check any other constraints on your model.

Changed in Django 4.1:

In older versions, UniqueConstraints were
validated by validate_unique().

Model.validate_constraints(exclude=None)

New in Django 4.1.

This method validates all constraints defined in
Meta.constraints. The
optional exclude argument allows you to provide a set of field names to
exclude from validation. It will raise a
ValidationError if any constraints fail
validation.

Saving objects¶

To save an object back to the database, call save():

Model.save(force_insert=False, force_update=False, using=DEFAULT_DB_ALIAS, update_fields=None)

For details on using the force_insert and force_update arguments, see
Forcing an INSERT or UPDATE. Details about the update_fields argument
can be found in the Specifying which fields to save section.

If you want customized saving behavior, you can override this save()
method. See Overriding predefined model methods for more details.

The model save process also has some subtleties; see the sections below.

Auto-incrementing primary keys¶

If a model has an AutoField — an auto-incrementing
primary key — then that auto-incremented value will be calculated and saved as
an attribute on your object the first time you call save():

>>> b2 = Blog(name='Cheddar Talk', tagline='Thoughts on cheese.')
>>> b2.id     # Returns None, because b2 doesn't have an ID yet.
>>> b2.save()
>>> b2.id     # Returns the ID of your new object.

There’s no way to tell what the value of an ID will be before you call
save(), because that value is calculated by your database, not by Django.

For convenience, each model has an AutoField named
id by default unless you explicitly specify primary_key=True on a field
in your model. See the documentation for AutoField
for more details.

The pk property¶

Model.pk

Regardless of whether you define a primary key field yourself, or let Django
supply one for you, each model will have a property called pk. It behaves
like a normal attribute on the model, but is actually an alias for whichever
attribute is the primary key field for the model. You can read and set this
value, just as you would for any other attribute, and it will update the
correct field in the model.
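
The aliasing described above can be illustrated with a minimal stand-in class (a sketch of the behavior, not Django's implementation; `Instance` and `_pk_attname` are made-up names):

```python
class Instance:
    """Illustration of how a pk-style property can alias whichever
    attribute is the primary-key field."""

    _pk_attname = "id"  # name of the primary-key attribute on this class

    def __init__(self, id=None):
        self.id = id

    @property
    def pk(self):
        # Reading pk reads the underlying primary-key attribute.
        return getattr(self, self._pk_attname)

    @pk.setter
    def pk(self, value):
        # Writing pk updates the underlying primary-key attribute.
        setattr(self, self._pk_attname, value)
```

Reading and writing `pk` and `id` on such an object always stay in sync, which is the behavior the property above describes.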

Explicitly specifying auto-primary-key values¶

If a model has an AutoField but you want to define a
new object’s ID explicitly when saving, define it explicitly before saving,
rather than relying on the auto-assignment of the ID:

>>> b3 = Blog(id=3, name='Cheddar Talk', tagline='Thoughts on cheese.')
>>> b3.id     # Returns 3.
>>> b3.save()
>>> b3.id     # Returns 3.

If you assign auto-primary-key values manually, make sure not to use an
already-existing primary-key value! If you create a new object with an explicit
primary-key value that already exists in the database, Django will assume you’re
changing the existing record rather than creating a new one.

Given the above 'Cheddar Talk' blog example, this example would override the
previous record in the database:

b4 = Blog(id=3, name='Not Cheddar', tagline='Anything but cheese.')
b4.save()  # Overrides the previous blog with ID=3!

See How Django knows to UPDATE vs. INSERT, below, for the reason this
happens.

Explicitly specifying auto-primary-key values is mostly useful for bulk-saving
objects, when you’re confident you won’t have primary-key collision.

If you’re using PostgreSQL, the sequence associated with the primary key might
need to be updated; see Manually-specifying values of auto-incrementing primary keys.

What happens when you save?¶

When you save an object, Django performs the following steps:

  1. Emit a pre-save signal. The pre_save
    signal is sent, allowing any functions listening for that signal to do
    something.

  2. Preprocess the data. Each field’s
    pre_save() method is called to perform any
    automated data modification that’s needed. For example, the date/time fields
    override pre_save() to implement
    auto_now_add and
    auto_now.

  3. Prepare the data for the database. Each field’s
    get_db_prep_save() method is asked to provide
    its current value in a data type that can be written to the database.

    Most fields don’t require data preparation. Simple data types, such as
    integers and strings, are ‘ready to write’ as a Python object. However, more
    complex data types often require some modification.

    For example, DateField fields use a Python
    datetime object to store data. Databases don’t store datetime
    objects, so the field value must be converted into an ISO-compliant date
    string for insertion into the database.

  4. Insert the data into the database. The preprocessed, prepared data is
    composed into an SQL statement for insertion into the database.

  5. Emit a post-save signal. The post_save
    signal is sent, allowing any functions listening for that signal to do
    something.
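
As an illustration of step 3 (a sketch, not Django's actual get_db_prep_save() code; the helper name is made up), a date value can be rendered as an ISO-compliant string before being written:

```python
import datetime

def prepare_date_for_db(value):
    """Sketch of the conversion in step 3: a DateField's Python date
    becomes an ISO-compliant string for insertion into the database."""
    return value.isoformat()

# prepare_date_for_db(datetime.date(2024, 5, 17)) -> "2024-05-17"
```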

How Django knows to UPDATE vs. INSERT¶

You may have noticed Django database objects use the same save() method
for creating and changing objects. Django abstracts the need to use INSERT
or UPDATE SQL statements. Specifically, when you call save() and the
object’s primary key attribute does not define a
default, Django follows this algorithm:

  • If the object’s primary key attribute is set to a value that evaluates to
    True (i.e., a value other than None or the empty string), Django
    executes an UPDATE.
  • If the object’s primary key attribute is not set or if the UPDATE
    didn’t update anything (e.g. if primary key is set to a value that doesn’t
    exist in the database), Django executes an INSERT.

If the object’s primary key attribute defines a
default then Django executes an UPDATE if
it is an existing model instance and primary key is set to a value that exists
in the database. Otherwise, Django executes an INSERT.
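
The first algorithm (primary key with no default) can be sketched in plain Python; this is an illustration of the described behavior, not Django's actual code:

```python
def statements_attempted(pk_value, row_exists):
    """Return the SQL statements save() would issue, in order, for a
    primary key with no default. `row_exists` says whether an UPDATE
    would match an existing row."""
    attempts = []
    if pk_value:  # truthy pk (not None, not ""): try an UPDATE first
        attempts.append("UPDATE")
        if row_exists:
            return attempts        # the UPDATE matched a row; done
    attempts.append("INSERT")      # no usable pk, or UPDATE matched nothing
    return attempts
```

For example, `statements_attempted(3, False)` yields `["UPDATE", "INSERT"]`, matching the fallback described above for a primary key that doesn't exist in the database.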

The one gotcha here is that you should be careful not to specify a primary-key
value explicitly when saving new objects, if you cannot guarantee the
primary-key value is unused. For more on this nuance, see Explicitly specifying
auto-primary-key values above and Forcing an INSERT or UPDATE below.

In Django 1.5 and earlier, Django did a SELECT when the primary key
attribute was set. If the SELECT found a row, then Django did an UPDATE,
otherwise it did an INSERT. The old algorithm results in one more query in
the UPDATE case. There are some rare cases where the database doesn’t
report that a row was updated even if the database contains a row for the
object’s primary key value. An example is the PostgreSQL ON UPDATE trigger
which returns NULL. In such cases it is possible to revert to the old
algorithm by setting the select_on_save
option to True.

Forcing an INSERT or UPDATE¶

In some rare circumstances, it’s necessary to be able to force the
save() method to perform an SQL INSERT and not fall back to
doing an UPDATE. Or vice-versa: update, if possible, but not insert a new
row. In these cases you can pass the force_insert=True or
force_update=True parameters to the save() method.
Passing both parameters is an error: you cannot both insert and update at the
same time!

It should be very rare that you’ll need to use these parameters. Django will
almost always do the right thing and trying to override that will lead to
errors that are difficult to track down. This feature is for advanced use
only.

Using update_fields will force an update similarly to force_update.

Updating attributes based on existing fields¶

Sometimes you’ll need to perform a simple arithmetic task on a field, such
as incrementing or decrementing the current value. One way of achieving this is
doing the arithmetic in Python like:

>>> product = Product.objects.get(name='Venezuelan Beaver Cheese')
>>> product.number_sold += 1
>>> product.save()

If the old number_sold value retrieved from the database was 10, then
the value of 11 will be written back to the database.

The process can be made robust, avoiding a race condition, as well as slightly faster by expressing
the update relative to the original field value, rather than as an explicit
assignment of a new value. Django provides F expressions for performing this kind of relative update. Using
F expressions, the previous example is expressed
as:

>>> from django.db.models import F
>>> product = Product.objects.get(name='Venezuelan Beaver Cheese')
>>> product.number_sold = F('number_sold') + 1
>>> product.save()

For more details, see the documentation on F expressions and their use in update queries.

Specifying which fields to save¶

If save() is passed a list of field names in keyword argument
update_fields, only the fields named in that list will be updated.
This may be desirable if you want to update just one or a few fields on
an object. There will be a slight performance benefit from preventing
all of the model fields from being updated in the database. For example:

product.name = 'Name changed again'
product.save(update_fields=['name'])

The update_fields argument can be any iterable containing strings. An
empty update_fields iterable will skip the save. A value of None will
perform an update on all fields.
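
The three cases can be sketched as follows (a plain-Python illustration of the semantics above, not Django's implementation; the function name is made up):

```python
def resolve_update_fields(update_fields, all_fields):
    """Return (do_save, fields) following the rules above: None means
    update every field; an empty iterable means skip the save."""
    if update_fields is None:
        return True, list(all_fields)
    update_fields = list(update_fields)
    if not update_fields:
        return False, []   # empty iterable: the save is skipped entirely
    return True, update_fields
```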

Specifying update_fields will force an update.

When saving a model fetched through deferred model loading
(only() or
defer()) only the fields loaded
from the DB will get updated. In effect there is an automatic
update_fields in this case. If you assign or change any deferred field
value, the field will be added to the updated fields.

Field.pre_save() and update_fields

If update_fields is passed in, only the
pre_save() methods of the update_fields
are called. For example, this means that date/time fields with
auto_now=True will not be updated unless they are included in the
update_fields.

Deleting objects¶

Model.delete(using=DEFAULT_DB_ALIAS, keep_parents=False)

Issues an SQL DELETE for the object. This only deletes the object in the
database; the Python instance will still exist and will still have data in
its fields, except the primary key, which is set to None. This method
returns the number of objects deleted and a dictionary with the number of
deletions per object type.

For more details, including how to delete objects in bulk, see
Deleting objects.

If you want customized deletion behavior, you can override the delete()
method. See Overriding predefined model methods for more details.

Sometimes with multi-table inheritance you may
want to delete only a child model’s data. Specifying keep_parents=True will
keep the parent model’s data.

Pickling objects¶

When you pickle a model, its current state is pickled. When you unpickle
it, it’ll contain the model instance at the moment it was pickled, rather than
the data that’s currently in the database.

Other model instance methods¶

A few object methods have special purposes.

Model.__str__()¶

The __str__() method is called whenever you call str() on an object.
Django uses str(obj) in a number of places. Most notably, to display an
object in the Django admin site and as the value inserted into a template when
it displays an object. Thus, you should always return a nice, human-readable
representation of the model from the __str__() method.

For example:

from django.db import models

class Person(models.Model):
    first_name = models.CharField(max_length=50)
    last_name = models.CharField(max_length=50)

    def __str__(self):
        return '%s %s' % (self.first_name, self.last_name)

Model.__eq__()¶

The equality method is defined such that instances with the same primary
key value and the same concrete class are considered equal, except that
instances with a primary key value of None aren’t equal to anything except
themselves. For proxy models, concrete class is defined as the model’s first
non-proxy parent; for all other models it’s simply the model’s class.

For example:

from django.db import models

class MyModel(models.Model):
    id = models.AutoField(primary_key=True)

class MyProxyModel(MyModel):
    class Meta:
        proxy = True

class MultitableInherited(MyModel):
    pass

# Primary keys compared
MyModel(id=1) == MyModel(id=1)
MyModel(id=1) != MyModel(id=2)
# Primary keys are None
MyModel(id=None) != MyModel(id=None)
# Same instance
instance = MyModel(id=None)
instance == instance
# Proxy model
MyModel(id=1) == MyProxyModel(id=1)
# Multi-table inheritance
MyModel(id=1) != MultitableInherited(id=1)

Model.__hash__()¶

The __hash__() method is based on the instance’s primary key value. It
is effectively hash(obj.pk). If the instance doesn’t have a primary key
value then a TypeError will be raised (otherwise the __hash__()
method would return different values before and after the instance is
saved, but changing the __hash__() value of an instance is
forbidden in Python).
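
The semantics can be illustrated with a minimal stand-in class (a sketch of the behavior described above, not Django's implementation):

```python
class FakeModel:
    """Stand-in demonstrating the hashing rule described above."""

    def __init__(self, pk=None):
        self.pk = pk

    def __hash__(self):
        if self.pk is None:
            # Refuse to hash unsaved instances rather than let the hash
            # value change after saving, which Python forbids.
            raise TypeError(
                "Model instances without primary key value are unhashable"
            )
        return hash(self.pk)
```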

Model.get_absolute_url()¶

Define a get_absolute_url() method to tell Django how to calculate the
canonical URL for an object. To callers, this method should appear to return a
string that can be used to refer to the object over HTTP.

For example:

def get_absolute_url(self):
    return "/people/%i/" % self.id

While this code is correct and simple, it may not be the most portable way
to write this kind of method. The reverse() function is usually the
best approach.

For example:

def get_absolute_url(self):
    from django.urls import reverse
    return reverse('people-detail', kwargs={'pk': self.pk})

One place Django uses get_absolute_url() is in the admin app. If an object
defines this method, the object-editing page will have a “View on site” link
that will jump you directly to the object’s public view, as given by
get_absolute_url().

Similarly, a couple of other bits of Django, such as the syndication feed
framework, use get_absolute_url() when it is defined. If it makes sense
for your model’s instances to each have a unique URL, you should define
get_absolute_url().

Warning

You should avoid building the URL from unvalidated user input, in order to
reduce possibilities of link or redirect poisoning:

def get_absolute_url(self):
    return '/%s/' % self.name

If self.name is '/example.com' this returns '//example.com/'
which, in turn, is a valid schema relative URL but not the expected
'/%2Fexample.com/'.
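
One way to guard against this (an illustrative sketch using the standard library, not something prescribed by Django; the helper name is made up) is to percent-encode the dynamic segment so a leading slash cannot create a scheme-relative URL:

```python
from urllib.parse import quote

def safe_segment(name):
    # safe="" also encodes "/", so user input can't add path segments
    # or turn the result into a scheme-relative URL.
    return "/%s/" % quote(name, safe="")

# safe_segment("/example.com") -> "/%2Fexample.com/"
```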

It’s good practice to use get_absolute_url() in templates, instead of
hard-coding your objects’ URLs. For example, this template code is bad:

<!-- BAD template code. Avoid! -->
<a href="/people/{{ object.id }}/">{{ object.name }}</a>

This template code is much better:

<a href="{{ object.get_absolute_url }}">{{ object.name }}</a>

The logic here is that if you change the URL structure of your objects, even
for something small like correcting a spelling error, you don’t want to have to
track down every place that the URL might be created. Specify it once, in
get_absolute_url() and have all your other code call that one place.

Note

The string you return from get_absolute_url() must contain only
ASCII characters (required by the URI specification, RFC 2396#section-2)
and be URL-encoded, if necessary.

Code and templates calling get_absolute_url() should be able to use the
result directly without any further processing. You may wish to use the
django.utils.encoding.iri_to_uri() function to help with this if you
are using strings containing characters outside the ASCII range.

Extra instance methods¶

In addition to save() and delete(), a model object
might have some of the following methods:

Model.get_FOO_display()¶

For every field that has choices set, the
object will have a get_FOO_display() method, where FOO is the name of
the field. This method returns the “human-readable” value of the field.

For example:

from django.db import models

class Person(models.Model):
    SHIRT_SIZES = (
        ('S', 'Small'),
        ('M', 'Medium'),
        ('L', 'Large'),
    )
    name = models.CharField(max_length=60)
    shirt_size = models.CharField(max_length=2, choices=SHIRT_SIZES)

>>> p = Person(name="Fred Flintstone", shirt_size="L")
>>> p.save()
>>> p.shirt_size
'L'
>>> p.get_shirt_size_display()
'Large'

Model.get_next_by_FOO(**kwargs)
Model.get_previous_by_FOO(**kwargs)

For every DateField and
DateTimeField that does not have null=True, the object will have get_next_by_FOO() and
get_previous_by_FOO() methods, where FOO is the name of the field. These
return the next and previous object with respect to the date field, raising
a DoesNotExist exception when appropriate.

Both of these methods will perform their queries using the default
manager for the model. If you need to emulate filtering used by a
custom manager, or want to perform one-off custom filtering, both
methods also accept optional keyword arguments, which should be in the
format described in Field lookups.

Note that in the case of identical date values, these methods will use the
primary key as a tie-breaker. This guarantees that no records are skipped or
duplicated. That also means you cannot use those methods on unsaved objects.
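
The tie-breaking rule can be sketched with plain tuples (an illustration of the ordering, not Django's actual query; the function name is made up):

```python
def get_next_by_date(rows, current):
    """rows and current are (date, pk) tuples. Tuple comparison orders
    by date first, then pk, so the pk breaks ties between equal dates
    and no record is skipped or duplicated."""
    later = sorted(row for row in rows if row > current)
    if not later:
        raise LookupError("DoesNotExist")  # stand-in for the model's DoesNotExist
    return later[0]
```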

Overriding extra instance methods

In most cases overriding or inheriting get_FOO_display(),
get_next_by_FOO(), and get_previous_by_FOO() should work as
expected. Since they are added by the metaclass however, it is not
practical to account for all possible inheritance structures. In more
complex cases you should override Field.contribute_to_class() to set
the methods you need.

Other attributes¶

Model._state

The _state attribute refers to a ModelState object that tracks
the lifecycle of the model instance.

The ModelState object has two attributes: adding, a flag which is
True if the model has not been saved to the database yet, and db,
a string referring to the database alias the instance was loaded from or
saved to.

Newly instantiated instances have adding=True and db=None,
since they are yet to be saved. Instances fetched from a QuerySet
will have adding=False and db set to the alias of the associated
database.
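
A minimal sketch of the shape of that object (not Django's actual ModelState class):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModelState:
    adding: bool = True        # True until the instance has been saved
    db: Optional[str] = None   # alias of the database loaded from / saved to

state = ModelState()           # a newly instantiated, unsaved instance
# After a hypothetical fetch from the "default" database:
state.adding, state.db = False, "default"
```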

Bastian, I’ll explain my code templating to you; I hope it helps:

Since Django 1.2 it has been possible to write validation code on the model. When we work with ModelForms, instance.full_clean() is called on form validation.

In each model I override the clean() method with a custom function (this method is automatically called from full_clean() on ModelForm validation):

from django.db import models

class Issue(models.Model):
    ...

    def clean(self):
        rules.Issue_clean(self)  # <-- custom function invocation

from issues import rules
rules.connect()

Then, in the rules.py file, I write the business rules. I also connect pre_save() to my custom function to prevent saving a model in an invalid state:

import datetime as dt

from django.core.exceptions import ValidationError

from issues.models import Issue

def connect():
    from django.db.models.signals import post_save, pre_save, pre_delete
    # issues
    pre_save.connect(Issue_pre_save, sender=Issue)
    post_save.connect(Issue_post_save, sender=Issue)
    pre_delete.connect(Issue_pre_delete, sender=Issue)

def Issue_clean(instance):    # <-- custom function
    errors = {}

    # day and time slot must always be filled in
    if not instance.dia_incidencia:   # <-- business rules
        errors.setdefault('dia_incidencia', []).append(u'Date missing: ...')

    if not instance.franja_incidencia:
        errors.setdefault('franja_incidencia', []).append(u'Missing data: ...')

    # incidents may only be entered up to 7 days back
    if instance.dia_incidencia < dt.date.today() - dt.timedelta(days=7):
        errors.setdefault('dia_incidencia 1', []).append(u'blah blah error desc')

    # no incidents in the future
    if instance.getDate() > dt.datetime.now():
        errors.setdefault('dia_incidencia 2', []).append(u'You cannot yet ...')
    ...

    if errors:
        raise ValidationError(errors)  # <-- raising errors

def Issue_pre_save(sender, instance, **kwargs):
    instance.clean()     # <-- custom function invocation

Then, the ModelForm calls the model’s clean() method, and my custom function checks for a valid state or raises an error that is handled by the ModelForm.

In order to show errors on form, you should include this on form template:

{% if form.non_field_errors %}
      {% for error in form.non_field_errors %}
        {{error}}
      {% endfor %}
{% endif %}  

The reason is that model validation errors are bound to the non_field_errors entry of the error dictionary.

When you save or delete a model outside of a form, remember that an error may be raised:

try:
    issue.delete()
except ValidationError as e:
    import itertools
    errors = list(itertools.chain(*e.message_dict.values()))

Also, you can add the errors to a form’s error dictionary on non-model forms:

    try:
        # trigger the errors so they can also be shown on the form
        issue.clean()
    except ValidationError as e:
        form._errors = {}
        for _, v in e.message_dict.items():
            form._errors.setdefault(NON_FIELD_ERRORS, []).extend(v)

Remember that this code is not executed by the save() method: full_clean() will not be called automatically when you call your model’s save() method, nor as a result of ModelForm validation.

2013-01-05

In what is probably my biggest WTF with Django to date, it doesn’t
validate your models before saving them to the database. All of the
necessary code exists, and when a dev sets up her models she usually adds
the relevant validations using EmailField, URLField, blank, null,
unique, …, but unless you explicitly add code, the constraints won’t be
(adequately) enforced. Some things will be caught with IntegrityErrors,
but not everything and not consistently.

Since the validation code is sitting there waiting to be hooked up, the
only reason I can imagine for not having it by default is backwards
compatibility. That seems to be the reason
given
elsewhere. It’s a big enough problem in my mind to deserve a breaking
change in a 1.x release, with a configuration variable to disable it
and a large-print warning in the release notes. Failing that, it at least
needs to be featured very prominently in the getting-started and general
documentation. If it’s there, it’s not obvious enough that I’ve run
across it, and Google doesn’t seem to point there when searching for
relevant terms either. Oh well.

So now that I’ve told you how I feel about it, let’s get to what to do
about it. You have two basic options: a signal or a base class. Both
have advantages and disadvantages, and I’ll quickly list the ones that
come to mind as we look at the necessary code.

Pre-Save Signal

from django.db.models.signals import pre_save

def validate_model(sender, instance, raw=False, **kwargs):
    if not raw:
        instance.full_clean()


pre_save.connect(validate_model, dispatch_uid='validate_model')

Ignoring the fact that the method is called full_clean, which seems a better
fit for ModelForm checking than Model enforcement, the above code will
check all models used by your app. We connect a handler to the model
pre_save
signal, and on each call it will invoke full_clean unless we’re
saving in raw mode (from fixtures).

The pre_save signal will be sent out for every object being saved
whether it’s one of ours or an upstream dependency’s. That’s both the
advantage and disadvantage of this method. If you use it from the start
all of your code will handle ValidationErrors and as you bring in
3rd-party apps/code you’ll be able to quickly see if it causes problems
for them. But you can run into problems.

You also shouldn’t use this method if you’re developing a shared app as
it would cause anyone who uses that app to unexpectedly start seeing
ValidationErrors, even if it’s for their own good. You could add senders
to the connect calls for each of your models, but at that point you’re
better off going with the mixin below.

In my use of the signal approach I’ve run into a problem with custom
Celery Task states. Celery’s docs give examples of arbitrary task
states, but when full_clean is called on them on their way to their
backing store, validation complains about non-standard
values. The easiest way I could find to deal with it was to keep a list
of opted-out models; it’s not the cleanest thing in the world, but it
gets the job done.

dont_validate = {'TaskMeta'}


def validate_model(sender, instance, raw=False, **kwargs):
    cls = instance.__class__.__name__
    if not raw and cls not in dont_validate:
        instance.full_clean()

Mixin With An Overridden save

from django.db import models


class ValidateOnSaveMixin(object):

    def save(self, force_insert=False, force_update=False, **kwargs):
        if not (force_insert or force_update):
            self.full_clean()
        super(ValidateOnSaveMixin, self).save(force_insert, force_update,
                                              **kwargs)


class Employee(ValidateOnSaveMixin, models.Model):
    name = models.CharField(max_length=128)
    # need to specify max_length here or else it'll be too short for
    # some RFC-valid emails
    email = models.EmailField(max_length=254, unique=True)

Basically the same logic, but here it’s explicit which models are going
to be validated. This is essentially the opposite of the signal
approach. You don’t have to worry about other models validating
correctly or code working with them handling ValidationErrors, but you
do have to explicitly include ValidateOnSaveMixin in each model’s
hierarchy. You’ll also have to take a bit of care if you override the
save method in any of the classes where the mixin is used to make sure
you do things in an appropriate order and that the mixin’s save method
is called.

Things to Watch For

One thing to consider with either of these approaches is that you cannot
rely on pre_save signals or field save methods to make objects valid.
Both would happen too late: in the case of the mixin, after we’ve called
full_clean and passed things up to super; with the pre_save signal,
fields’ save methods are called at a later point, and there are no
assurances on the order of signal handlers, so you can’t rely on the
fixers being called before validate_model.

Unit Testing

I’m a fan of thorough unit testing, and this is a place where it can come in
extra handy, and the tests are trivial to write. You don’t have to test
the actual validation unless you’re doing something custom; you can
hope/assume that the Django unit tests have that covered. You can/should
check that the validations are being invoked.

from django.test import TestCase
from django.core.exceptions import ValidationError


class EmployeeTest(TestCase):

    def test_validation(self):

        with self.assertRaises(ValidationError):
            Employee(name='Bob', email='this.is.not.an.email').save()

That’s enough of a smoke test to tell you whether or not the validation
mixin or signal is getting called. If 6 months down the road you tweak
the signal handler or change the inheritance hierarchy you’ll have tests
in place to make sure that things are still being validated.

Make Django Rest Framework correctly handle Django ValidationError raised in the save method of a model



"""
Sometimes in your Django model you want to raise a ``ValidationError`` in the save method, for
some reason.
This exception is not managed by Django Rest Framework because it occurs after its validation
process. So at the end, you'll have a 500.
Correcting this is as simple as overriding the exception handler, by converting the Django
``ValidationError`` to a DRF one.
"""
from django.core.exceptions import ValidationError as DjangoValidationError
from rest_framework.exceptions import ValidationError as DRFValidationError
from rest_framework.views import exception_handler as drf_exception_handler


def exception_handler(exc, context):
    """Handle Django ValidationError as an accepted exception
    Must be set in settings:
    >>> REST_FRAMEWORK = {
    ...     # ...
    ...     'EXCEPTION_HANDLER': 'mtp.apps.common.drf.exception_handler',
    ...     # ...
    ... }
    For the parameters, see ``exception_handler``
    """
    if isinstance(exc, DjangoValidationError):
        exc = DRFValidationError(detail=exc.message_dict)
    return drf_exception_handler(exc, context)
