B M
Valentina Studio
Saturday, January 26, 2019, 04:33 AM
I have noticed recently that a table with many rows using a column of the 'bytea' type in PostgreSQL (the version does not seem to matter, but for the record it is 10 and 11) takes a long time to load. Looking at the system monitor, it appears that each time I click on such a table, it tries to download all blobs (bytea) before displaying any row content. I think that this is an issue. Is there a way to delay the pre-loading of blob objects so it does not impact performance? When I double-click on a cell in a table with a blob, I do not mind waiting, because I know that this is the action I wanted to take. However, it seems excessive to be "penalized" for merely clicking on a table to see its rows, doesn't it?

Do you have any comments/insights on this? I would greatly appreciate it.
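For reference, a workaround on the SQL side is to browse rows without transferring the blob payloads at all. A minimal sketch, assuming a hypothetical table "attachments" with an "id" primary key and a "data" bytea column:

-- Select only the blob sizes instead of their contents; octet_length()
-- is computed server-side, so no blob bytes cross the network.
SELECT "id", octet_length("data") AS blob_bytes
FROM "public"."attachments"
ORDER BY "id"
LIMIT 500;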
Sergey Pashkov Accepted Answer
Hello,

It loads the first 32 bytes of each blob value on the current page (512 rows) to try to detect the type of the stored data.
We'll make it adjustable.

Please add a few details so we'll be able to test the Data Editor under the same conditions.
Is there a primary key in this table? How many rows and fields are there?
What is the average size of the blob values?
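Reading only a short prefix is enough to identify common formats by their leading "magic" bytes. A hypothetical illustration of that kind of detection, using made-up table and column names:

-- Detect the stored file type from the first few bytes of each blob.
SELECT "id",
       CASE
         WHEN substring("data" from 1 for 4) = '\x25504446'::bytea THEN 'PDF'   -- "%PDF"
         WHEN substring("data" from 1 for 3) = '\xffd8ff'::bytea   THEN 'JPEG'
         ELSE 'unknown'
       END AS detected_type
FROM "public"."attachments";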
B M Accepted Answer
I will start with an easy one. One of my tables currently has 40 rows, all of them (except for one) PDF files, and one a JPG image. The PDF files vary in size, with the biggest one being about 38 MB. If I read your message correctly, then 32 bytes × 40 = 1,280 bytes (roughly 1 KB). That should be instant, I think. However, this is not the case. Looking at my network monitor, it appears it pulls everything, so the table appears to be frozen for maybe 10 to 20 seconds.

Things get more difficult with a table that has more rows. On a table with 418 rows, where the blobs are mostly images (small screenshots), it takes 53 seconds before Studio becomes responsive again and returns all the rows. In case you were wondering, I do this through the Schema Editor, by clicking on the 'Tables' folder and then on the table of interest.

To be fair, I use Studio to connect to my PostgreSQL 11 over a network. My database is on a Linux system while Studio runs on Windows. In any case, if what you said above is indeed true, then 32 × 418 = 13,376 bytes, which is only about 13 KB. Transferring 13 KB should be effectively instant on any network, yet it takes 53 seconds here. So either my calculations are wrong or you are loading much more than 32 bytes initially.
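For what it's worth, the actual blob sizes are easy to check on the server side. A quick sketch, again with hypothetical table and column names:

-- Row count plus average and total blob size, computed server-side
-- (octet_length() returns the size of a bytea value in bytes).
SELECT count(*)                                          AS row_count,
       pg_size_pretty(avg(octet_length("data"))::bigint) AS avg_blob,
       pg_size_pretty(sum(octet_length("data")))         AS total_blobs
FROM "public"."screenshots";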
B M Accepted Answer
To add to my answer, the problem I am describing does not seem to exist when running Studio on the same server as the database. When running the same examples using the Studio 9 beta on the same Linux server as my databases, results returned instantly. So it seems that the issue is limited to a network connection between Studio and a remote database.
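One way to confirm that the time goes into network transfer rather than the query itself is to time the same SELECT from psql on both machines. A minimal sketch, with hypothetical names:

-- Run once in psql on the database host and once from the remote machine;
-- \timing reports client-side elapsed time, so the difference is network cost.
\timing on
SELECT "id", "data" FROM "public"."screenshots" ORDER BY "id" LIMIT 500;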
Sergey Pashkov Accepted Answer
Thank you, and in both tables, there is a primary key, yes?
B M Accepted Answer
> Thank you, and in both tables, there is a primary key, yes?
Yes. In both cases there is a primary serial key.
Sergey Pashkov Accepted Answer
Hello,

Can't reproduce this problem with a remote server.

What if you open the Data Editor for such a table and then go to the Query Log: will you see any long-running queries?

For example, the following query takes 0.080 s in my test database:
SELECT "pk", substring( "data" from 0 for 32 ) AS "data" FROM "public"."books" ORDER BY "pk" ASC OFFSET 0 LIMIT 500

Also, if you wish, you can try to connect to my test instance (it's on Heroku):
Host: ec2-54-247-101-191.eu-west-1.compute.amazonaws.com
Database: d9cfmoqqa1lvf3
User: lksoeoxpcsujvb
Port: 5432
Password: 64f928c2b36a12e81a2aa5009b1f25b0c2ccc258b7123f74ff6a06e6b6da3400
SSL: enabled
B M Accepted Answer
> Hello,
>
> Can't reproduce this problem with a remote server. [...]

I am including a screenshot of what I see when trying to access a table with a blob. I also tried the database you provided. I have the same issue there.
Attachments
Sergey Pashkov Accepted Answer
Thanks for the screenshot. So it is in the Schema Editor, not the Data Editor, which I tested.
A fix will be available in version 9.
B M Accepted Answer
> Thanks for the screenshot. So it is in the Schema Editor, not the Data Editor, which I tested.
> A fix will be available in version 9.

Thanks, Sergey. Do you know the ETA for the release?
Sergey Pashkov Accepted Answer
It should be available in 10 days.
Sergey Pashkov Accepted Answer
A beta version with this fix is available for download:
http://valentina-db.com/download/beta/9.0b20/win_64/vstudio_x64_9_win.exe