I am not able to log in to my Postgres database deployed in Docker. Please find below my docker-compose.yml: discountdb: image: postgres docker-compose.override.yml d
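A login failure against a stock postgres image is very often just missing credentials in the compose file. A minimal sketch, keeping the discountdb service name from the snippet; the user, password, and database values are illustrative:

```yaml
discountdb:
  image: postgres
  environment:
    POSTGRES_USER: admin          # illustrative credentials
    POSTGRES_PASSWORD: admin1234
    POSTGRES_DB: DiscountDb
  ports:
    - "5432:5432"                 # expose for host-side clients
```

Note that docker-compose.override.yml is merged on top of docker-compose.yml, so any credentials or settings defined in the override file take precedence.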
How do I use the knex db object inside other files? For example, my index.js: const app = require("express")(); const cors = require("cors"); const bodyParser
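The usual pattern is to put the configured knex instance in its own module and require it wherever it is needed. A minimal sketch; the file name db.js and the connection settings are illustrative:

```javascript
// db.js — create the knex instance once and export it
const knex = require("knex")({
  client: "pg",
  connection: {
    host: "localhost",
    user: "postgres",
    password: "secret",
    database: "mydb",
  },
});

module.exports = knex;
```

```javascript
// routes/users.js — any other file just requires the shared instance
const db = require("../db");

async function listUsers(req, res) {
  const users = await db("users").select("*");
  res.json(users);
}

module.exports = { listUsers };
```

Node caches modules, so every require("../db") returns the same instance and the connection pool is created only once.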
I have some lines of SQL that take a set of IDs within the same GROUP_ID that are not contiguous (e.g. if some rows got deleted) and make them contiguou
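For renumbering like this, row_number() over a per-group partition is the usual tool. A sketch under assumed names (table items, per-group sequence column seq), assuming seq only needs to be unique within its group_id; any unique constraint on (group_id, seq) would need to be deferrable for the duration of the update:

```sql
WITH renumbered AS (
  SELECT ctid,
         row_number() OVER (PARTITION BY group_id ORDER BY seq) AS new_seq
  FROM items
)
UPDATE items i
SET    seq = r.new_seq
FROM   renumbered r
WHERE  i.ctid = r.ctid
  AND  i.seq <> r.new_seq;   -- skip rows that are already contiguous
```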
I have a Kubernetes cluster to which I am trying to deploy different Helm charts; when the charts have no persistence, everything works great. When the Helm charts use p
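When only the persistent charts fail, the first thing worth checking is whether their PersistentVolumeClaims are actually being bound; a PVC stuck in Pending usually means the cluster has no default StorageClass. A diagnostic sketch (namespace, PVC, and class names are illustrative):

```shell
# See whether the chart's PVCs bound or are stuck Pending
kubectl get pvc -n my-namespace
kubectl describe pvc data-mychart-0 -n my-namespace   # the events explain the failure

# List storage classes; exactly one should be annotated as the default
kubectl get storageclass

# If none is marked default, annotate one (class name is illustrative)
kubectl patch storageclass standard -p \
  '{"metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"true"}}}'
```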
I am using a GitHub Actions service container to start a Postgres instance like so: name: Pull Request on: pull_request: branches: - main - sta
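For reference, the commonly used shape of a Postgres service container in a workflow looks like the sketch below; the image tag, credentials, and database name are illustrative. The health-check options matter because job steps can otherwise start before the database is ready:

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:14
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: test
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
```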
I am using TypeORM in a TypeScript project to connect to PostgreSQL 11. I have the below SQL query, which gives me the expected results: select "customerUuid", array_agg
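One way to keep the raw SQL semantics is TypeORM's query builder with getRawMany(), since array_agg has no entity-level equivalent. A sketch assuming a configured dataSource and an orders table; the aggregated orderId column is an assumption:

```typescript
const rows = await dataSource
  .createQueryBuilder()
  .select('o."customerUuid"', 'customerUuid')
  .addSelect('array_agg(o."orderId")', 'orderIds') // aggregated column is assumed
  .from('orders', 'o')
  .groupBy('o."customerUuid"')
  .getRawMany();
```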
I'm migrating an Oracle pivot query to PostgreSQL crosstab. create table x_c(cntry numeric, week numeric, year numeric, days text, day text); insert into x_c values(
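A crosstab sketch over x_c, assuming the tablefunc extension is available, that day holds weekday codes, and that days is the pivoted value; the category list and output column types should be adjusted to the real data:

```sql
CREATE EXTENSION IF NOT EXISTS tablefunc;

SELECT *
FROM crosstab(
  $$ SELECT cntry, day, days FROM x_c ORDER BY 1, 2 $$,
  $$ VALUES ('MON'),('TUE'),('WED'),('THU'),('FRI'),('SAT'),('SUN') $$
) AS ct(cntry numeric,
        mon text, tue text, wed text, thu text, fri text, sat text, sun text);
```

Unlike Oracle's pivot, crosstab needs the output row type spelled out in the AS clause, which is why the category query and column list must agree.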
I have a table which stores user information (id, name, surname, address, email, etc.). I don't want to use the database's auto-increment feature. So, we don't
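If the goal is simply to avoid database-side auto-increment, one common alternative is UUID keys, which the application can generate itself or leave to a column default. A sketch; gen_random_uuid() is built in from PostgreSQL 13 (earlier versions need the pgcrypto extension):

```sql
CREATE TABLE users (
  id      uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  name    text NOT NULL,
  surname text,
  address text,
  email   text UNIQUE
);

-- the application may also supply its own id explicitly:
INSERT INTO users (id, name, email)
VALUES ('6f1f43e0-6f7f-4a6e-9f44-2f3e9a1c2b5d', 'Ada', 'ada@example.com');
```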
I have two Postgres databases, one for prod and another for testing, in "DATABASE_URI" and "TESTDATABASE_URI" respectively. The app seems to be working perfectly, w
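The variable names suggest a Flask-style setup, so here is a sketch of switching URIs by environment; the TESTING flag and the Flask-SQLAlchemy wiring are assumptions:

```python
import os

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
# pick the test database when a TESTING flag is set in the environment
app.config["SQLALCHEMY_DATABASE_URI"] = (
    os.environ["TESTDATABASE_URI"]
    if os.environ.get("TESTING")
    else os.environ["DATABASE_URI"]
)
db = SQLAlchemy(app)
```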
I am new to PostgreSQL and need to import a set of CSV files, but some of them weren't imported successfully. I got the same error with each of these files: ERROR: e
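The error text is truncated above, but CSV import failures frequently come down to encoding, headers, or delimiters, all of which COPY lets you state explicitly. A sketch using psql's \copy (the path and options are illustrative):

```sql
\copy my_table FROM 'data.csv' WITH (FORMAT csv, HEADER true, DELIMITER ',', ENCODING 'UTF8')
```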
The issue: I'm not sure how to begin troubleshooting this or where to look for errors. Setup: running a Node.js backend via GCR, and it's hitting a Postgres db o
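The snippet cuts off before saying where the database lives, but if GCR means Cloud Run and the database is Cloud SQL, the usual connection path is the instance's unix socket rather than TCP. A sketch with node-postgres; every value is illustrative:

```javascript
const { Pool } = require("pg");

const pool = new Pool({
  user: process.env.DB_USER,
  password: process.env.DB_PASS,
  database: process.env.DB_NAME,
  // Cloud Run mounts the Cloud SQL socket under /cloudsql/<connection name>
  host: "/cloudsql/my-project:us-central1:my-instance",
});

module.exports = pool;
```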
I'm trying to understand MVCC and can't get it. For example: transaction 1 (T1) tries to read some data row. At the same time, T2 updates the same row. The flow of
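A concrete two-session walkthrough may make the flow clearer. Under MVCC, T2's UPDATE does not overwrite the row T1 is reading; it creates a new row version, and what T1 sees depends on its snapshot (table and values are illustrative):

```sql
-- Session T1
BEGIN ISOLATION LEVEL REPEATABLE READ;
SELECT balance FROM accounts WHERE id = 1;   -- returns 100 (snapshot taken at this first statement)

-- Session T2, concurrently
BEGIN;
UPDATE accounts SET balance = 50 WHERE id = 1;  -- writes a NEW row version; T1's read is not blocked
COMMIT;

-- Session T1 again
SELECT balance FROM accounts WHERE id = 1;   -- still 100: same snapshot
COMMIT;
-- A transaction started after T2's commit sees 50. Under READ COMMITTED (the
-- default), the second SELECT would already see 50, because each statement
-- takes a fresh snapshot.
```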
How can I move a bigint array value from one index to another? For example, I have an array ARRAY[1, 2, 3, 4] of unique bigint values and want to move value 1
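Array slicing plus concatenation covers simple moves. A sketch moving the first element to the end and the last to the front; the positions are illustrative, since the question is cut off:

```sql
-- move the value at index 1 to the end: {1,2,3,4} -> {2,3,4,1}
SELECT arr[2:4] || arr[1]
FROM (SELECT ARRAY[1, 2, 3, 4]::bigint[] AS arr) t;

-- move the value at index 4 to the front: {1,2,3,4} -> {4,1,2,3}
SELECT arr[4] || arr[1:3]
FROM (SELECT ARRAY[1, 2, 3, 4]::bigint[] AS arr) t;
```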
Requirement: load millions of rows into a table from S3 using Python while avoiding memory issues. I see there are two methods, psycopg2's copy_from and copy_expert
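copy_expert is the more flexible of the two (it takes a full COPY statement, so CSV options work), and it accepts any file-like object, which is what keeps memory flat: COPY reads the stream in chunks. A sketch under assumed bucket, key, and table names:

```python
import boto3
import psycopg2

s3 = boto3.client("s3")
body = s3.get_object(Bucket="my-bucket", Key="rows.csv")["Body"]
# botocore's StreamingBody is file-like, so COPY pulls it in chunks
# instead of loading the whole object into memory.

conn = psycopg2.connect("dbname=mydb user=me")
with conn, conn.cursor() as cur:
    cur.copy_expert(
        "COPY my_table FROM STDIN WITH (FORMAT csv, HEADER true)",
        body,
    )
conn.close()
```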
I am trying to create a Spark DataFrame from a Presto table which has a few columns of Array data type. I tried multiple ways but am getting the same exception: java.s
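The exception is truncated, but JDBC generally has no mapping for Presto's ARRAY type, so a common workaround is to cast the array columns to JSON (or VARCHAR) in the pushed-down query and parse them on the Spark side. A sketch; the URL, driver class, and table/column names are assumptions, and the driver class depends on the Presto distribution in use:

```python
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:presto://presto-host:8080/hive/default")  # illustrative
    .option("driver", "com.facebook.presto.jdbc.PrestoDriver")     # varies by distro
    .option("query", "SELECT id, CAST(tags AS JSON) AS tags FROM my_table")
    .option("user", "spark")
    .load()
)
```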
I have a date field in one of my tables and the column name is from_dt. Now I have to compare a month and year combination against this from_dt field and check
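One way to do the comparison is to truncate from_dt to the month and compare against a constructed date; the table name and the 2024/03 values are illustrative:

```sql
SELECT *
FROM   my_table
WHERE  date_trunc('month', from_dt) = make_date(2024, 3, 1);

-- equivalent, if separate month and year values are easier to bind:
-- WHERE EXTRACT(YEAR FROM from_dt) = 2024 AND EXTRACT(MONTH FROM from_dt) = 3
```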
[NOTE: This is using Django 4.0.2, Python 3.8.2, and Postgres 14.2.] I have successfully set up Django and Postgres, and I can get them to work together when I
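The snippet stops before the failing case, but for reference, the settings.py wiring that typically connects Django to Postgres looks like this sketch (all values illustrative):

```python
# settings.py — requires the psycopg2 (or psycopg2-binary) package
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "localhost",
        "PORT": "5432",
    }
}
```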
I'm using PostgREST to generate an HTTP-accessible API (great stuff). There is a little bug that I'm trying to work around. For whatever reason, when calling a
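The snippet stops mid-sentence, but since it mentions "calling a", here is how PostgREST exposes database function calls, in case the workaround involves the /rpc/ endpoint (host, function, and argument names are illustrative):

```shell
curl -X POST http://localhost:3000/rpc/my_function \
  -H "Content-Type: application/json" \
  -d '{"arg1": "value"}'
```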
I am trying to migrate a large portion of one postgres database to another postgres database that has a slightly different layout/table names/column names. But
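Because the layouts differ, a plain dump/restore won't line up; one option is postgres_fdw, which lets the target database SELECT from the source and remap tables and columns in ordinary SQL. A sketch; every server, schema, table, and column name is illustrative:

```sql
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

CREATE SERVER src FOREIGN DATA WRAPPER postgres_fdw
  OPTIONS (host 'old-host', dbname 'olddb');
CREATE USER MAPPING FOR CURRENT_USER SERVER src
  OPTIONS (user 'olduser', password 'secret');

CREATE SCHEMA staging;
IMPORT FOREIGN SCHEMA public LIMIT TO (customers)
  FROM SERVER src INTO staging;

-- remap the old layout onto the new one in plain SQL
INSERT INTO clients (client_id, full_name)
SELECT id, first_name || ' ' || last_name
FROM   staging.customers;
```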
I'm trying to run Postgres in a Docker container on Windows. I also want to keep the data in a Windows folder, so I tried this: mkdir c:\pgdata PS > docker r
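For what it's worth, the postgres image's data directory is known to be troublesome on Windows bind mounts (the NTFS ownership/permission semantics don't match what initdb expects), and the usual workaround is a named Docker volume instead of a host folder. A sketch in PowerShell; the volume and container names are illustrative:

```powershell
docker volume create pgdata
docker run -d --name pg `
  -e POSTGRES_PASSWORD=secret `
  -v pgdata:/var/lib/postgresql/data `
  -p 5432:5432 postgres
```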