Hi,
I managed to hunt a bug back to #32 . I use PG 9.2 on OS X, and the native JSON type has been causing me headaches in a very corner-case situation. Here is the stack trace given by Django:
http://dev.1flow.net/development/1flow-dev/group/82/
As you can see, the traditional Django message about the transaction being aborted doesn’t help at all, but when I run the culprit SQL query directly on my PG 9.2, I get:
ERROR: could not identify an equality operator for type json
LINE 1: ...mail_announcements", "base_user"."last_modified", "base_user...
^
********** Error **********
ERROR: could not identify an equality operator for type json
SQL state: 42883
Character: 181
After trying to hunt down the problem for a fairly long time, I found these:
http://michael.otacoo.com/postgresql-2/postgres-9-2-highlight-json-data-type/
http://michael.otacoo.com/postgresql-2/postgres-9-3-feature-highlight-json-operators/
These clearly state that the JSON operators only arrive in PG 9.3. I thus consider the 9.2 support for JSON somewhat incomplete. In Django, selecting any json column in a query seems to crash the PG transaction; any query on the table that does not involve the json fields is OK.
On my production machines I have PG 9.1, and luckily the JSON fields are stored as text there, so they don’t crash Django. I think that in jsonfield/fields.py:
def db_type(self, connection):
    if connection.vendor == 'postgresql' and connection.pg_version >= 90200:
        return 'json'
    else:
        return super(JSONFieldBase, self).db_type(connection)
the check should really be connection.pg_version >= 90300, to avoid messing with the incomplete 9.2 implementation. In fact, I’ve created a pull request, because reverting to text makes everything work again on my 9.2 server. Sorry, I can’t seem to find how to link the pull request to the current issue in the GitHub interface.
NOTE: I have no PG 9.3 to test if the implementation works or not.
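The proposed version gate can be sketched standalone (a hypothetical extraction of the db_type logic above; json_db_type is not a real jsonfield function, just an illustration of the version check):

```python
# Sketch of the proposed fix: only emit the native json column type on
# PostgreSQL >= 9.3, where JSON operators exist; fall back to text
# (the default column type) everywhere else, including PG 9.2.
def json_db_type(vendor, pg_version, fallback="text"):
    """Return the column type to use for a JSON field."""
    if vendor == "postgresql" and pg_version >= 90300:
        return "json"
    return fallback

print(json_db_type("postgresql", 90200))  # 9.2 falls back to text
print(json_db_type("postgresql", 90300))  # 9.3 can use native json
```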
Best regards,
If you use PG 9.4, using JSONB rather than JSON solves this problem.
Example :
-- JSON datatype test
create table t1 (id int, val json);
insert into t1 (id,val) values (1,'{"name":"value"}');
insert into t1 (id,val) values (1,'{"name":"value"}');
insert into t1 (id,val) values (2,'{"key":"value"}');
select * from t1 order by id;
select distinct * from t1 order by id;
-- JSONB datatype test
create table t2 (id int, val jsonb);
insert into t2 (id,val) values (1,'{"name":"value"}');
insert into t2 (id,val) values (1,'{"name":"value"}');
insert into t2 (id,val) values (2,'{"key":"value"}');
select * from t2 order by id;
select distinct * from t2 order by id;
Result of running the above script :
CREATE TABLE
INSERT 0 1
INSERT 0 1
INSERT 0 1
1 | {"name":"value"}
1 | {"name":"value"}
2 | {"key":"value"}
ERROR: could not identify an equality operator for type json
LINE 1: select distinct * from t1 order by id;
^
CREATE TABLE
INSERT 0 1
INSERT 0 1
INSERT 0 1
1 | {"name": "value"}
1 | {"name": "value"}
2 | {"key": "value"}
1 | {"name": "value"}
2 | {"key": "value"}
As you can see, PG succeeded in applying DISTINCT to a JSONB column while it fails on a JSON column!
Also try the following to see that keys in a JSONB value are actually stored sorted:
insert into t2 values (3, '{"a":"1", "b":"2"}');
insert into t2 values (3, '{"b":"2", "a":"1"}');
select * from t2;
1 | {"name": "value"}
1 | {"name": "value"}
2 | {"key": "value"}
3 | {"a": "1", "b": "2"}
3 | {"a": "1", "b": "2"}
Note that '{"b":"2", "a":"1"}' was inserted as '{"a":"1", "b":"2"}', therefore PG identifies the two as the same record:
select distinct * from t2;
3 | {"a": "1", "b": "2"}
2 | {"key": "value"}
1 | {"name": "value"}
Yeah, unfortunately Postgres json doesn’t implement equality, but jsonb does. So migrate the json columns to jsonb and it should work okay.
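Reusing the t1 table from the script above, such a migration can be done in place (a sketch; test on a copy first):

```sql
-- Convert the existing json column to jsonb; the USING clause
-- reparses each stored value as jsonb.
ALTER TABLE t1 ALTER COLUMN val TYPE jsonb USING val::jsonb;
-- The failing query from above now has an equality operator to use:
select distinct * from t1 order by id;
```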
Sorry I’m late on this answer, but it might help others.
As I understand your query, you’re only getting possible duplicates on profiles because of the many-to-many join to integrations (which you’re using to determine which profiles to access).
Because of that, you can use a GROUP BY feature that is new as of 9.1:
When GROUP BY is present, it is not valid for the SELECT list expressions to refer to ungrouped columns except within aggregate functions or if the ungrouped column is functionally dependent on the grouped columns, since there would otherwise be more than one possible value to return for an ungrouped column. A functional dependency exists if the grouped columns (or a subset thereof) are the primary key of the table containing the ungrouped column.
So in your case, you could get Ruby to create the query (sorry, I don’t know the Ruby syntax you’re using)…
SELECT profiles.*
FROM "profiles"
INNER JOIN "integration_profiles" ON "profiles"."id" = "integration_profiles"."profile_id"
INNER JOIN "integrations" ON "integration_profiles"."integration_id" = "integrations"."id"
WHERE "integrations"."user_id" = $1
GROUP BY "profiles"."id"
I only removed the DISTINCT from your SELECT clause and added the GROUP BY.
By referring ONLY to the id in the GROUP BY, you take advantage of that feature, because all the remaining profiles columns are “functionally dependent” on that id primary key. Wonderfully, that avoids the need for Postgres to do equality checks on the dependent columns (i.e. your json column in this case).
The DISTINCT ON solution is also great, and clearly sufficient in your case, but you can’t use aggregate functions like array_agg with it. You CAN with this GROUP BY approach. Happy days!
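For example (a sketch on the same schema as the query above), the GROUP BY form lets you collect the joined integration ids at the same time:

```sql
-- DISTINCT ON could not do this: array_agg is an aggregate,
-- so it needs the GROUP BY form.
SELECT profiles.*, array_agg(integrations.id) AS integration_ids
FROM "profiles"
INNER JOIN "integration_profiles" ON "profiles"."id" = "integration_profiles"."profile_id"
INNER JOIN "integrations" ON "integration_profiles"."integration_id" = "integrations"."id"
WHERE "integrations"."user_id" = $1
GROUP BY "profiles"."id";
```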
The reason behind this is that in PostgreSQL (up to 9.3) there is no equality operator defined for json (i.e. val1::json = val2::json will always throw this exception); in 9.4 there is one for the jsonb type.
One workaround is to cast your json field to text. But that won’t cover all json equalities: e.g. {"a":1,"b":2} should be equal to {"b":2,"a":1}, but the two won’t be equal if cast to text.
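The text-cast caveat is easy to see outside the database; a quick Python check on the same two JSON values:

```python
import json

a = '{"a":1,"b":2}'
b = '{"b":2,"a":1}'

# As raw text the two values differ, which is what a ::text cast compares...
print(a == b)  # False
# ...but parsed as JSON objects they are the same mapping.
print(json.loads(a) == json.loads(b))  # True
```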
Another workaround, if you have a primary key for that table (which you should), is to use the DISTINCT ON (&lt;expressions&gt;) form:
u.profiles.select("DISTINCT ON (profiles.id) profiles.*")
Note one known caveat for DISTINCT ON:
The DISTINCT ON expression(s) must match the leftmost ORDER BY expression(s). The ORDER BY clause will normally contain additional expression(s) that determine the desired precedence of rows within each DISTINCT ON group.
Question: How to deal with a JSON column while using GROUP BY
I’m using a query similar to the one below; the adress column is of JSON type.
SELECT id, name, MAX(salary), age, adress FROM test group by id, name, age
But getting below error:
SQL Error [42883]: ERROR: could not identify an equality operator for type json
Position: 152
I’m trying to get the data of the person who has the max salary for his age, and I need to include the adress field, which is JSON.
So, is there any way to achieve this, or is it practically possible?
Note: Postgres DB
Total Answers 2
Answer 1:
I’d go with DISTINCT ON() instead:
SELECT DISTINCT ON (age) id, name, salary, age, adress
FROM test
ORDER BY age, salary desc
The DISTINCT ON (age) will give you one row for each age: the one with the highest salary, as the ORDER BY decides.
mRahman
Answer 2:
There is no built-in equality operator for type JSON, but there is one for type JSONB. So you have a couple of immediate options:
- Change the type of column adress from JSON to JSONB (recommended). Your query will then work;
- Cast adress to type JSONB in the query:
SELECT id, name, MAX(salary), age, adress::jsonb
FROM test
GROUP BY id, name, age, adress::jsonb;
You can also define an equality operator for type JSON yourself, though rather as an exercise than for production purposes.
rohim