  • It really depends on many factors. If everything fits in RAM, almost nothing else matters.

    If your dataset outgrows RAM, various things start to matter depending on your workload. Random reads need good indices (as do writes to columns with unique constraints), OLAP queries benefit from a larger work_mem, tables past ~100M rows need good partitioning, and OLTP may even need some custom solutions if you need to keep a long history, but not for every transaction (there is a rough sketch of the indexing/partitioning side at the end of this comment).

    But even with over a billion rows, Postgres can handle it with relative ease if you know what you’re doing, usually even on hardware you would consider absolutely inadequate (last year I migrated our company DB from MySQL to Postgres, and even with more data and more complex workflows we downsized our RAM by more than half).
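
    To make the indexing/partitioning point concrete, here is a minimal sketch of the kind of thing I mean. The orders table and its columns are made up for illustration, not our actual schema, and the work_mem value is just an example you would tune for your own hardware:

    -- Range-partition a large table so queries and maintenance
    -- only touch the partitions they actually need.
    CREATE TABLE orders (
        order_id    bigint      NOT NULL,
        customer_id bigint      NOT NULL,
        created_at  timestamptz NOT NULL,
        total_cents bigint      NOT NULL
    ) PARTITION BY RANGE (created_at);

    CREATE TABLE orders_2024 PARTITION OF orders
        FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');

    -- Index the columns your random reads actually filter on;
    -- on a partitioned table the index is created on each partition.
    CREATE INDEX ON orders (customer_id, created_at);

    -- For a heavy analytical query, give sorts and hashes more memory,
    -- for the current session only.
    SET work_mem = '256MB';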