Category "performance"

Entity Framework Core performance tuning when filtering one DbContext object using another DbContext object

I asked something similar recently and it got solved by using AsQueryable: How to save time when pulling data from DB using Entity Framework Core in controller

Binary Search with results returned as Indices

I'm working on an implementation of Binary Search in Python as part of a course (Algorithmic Toolbox on Coursera). The challenge is to create an implementation
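
A minimal sketch of the usual approach, assuming the goal is to return the position of each query in a sorted list (or -1 when it is absent); the names here are illustrative, not the course's starter code:

```python
def binary_search(keys, query):
    """Return the index of query in the sorted list keys, or -1 if it is absent."""
    low, high = 0, len(keys) - 1
    while low <= high:
        mid = (low + high) // 2
        if keys[mid] == query:
            return mid
        elif keys[mid] < query:
            low = mid + 1
        else:
            high = mid - 1
    return -1

# One index per query, reported in query order
print([binary_search([1, 5, 8, 12, 13], q) for q in [8, 1, 23, 1, 11]])  # [2, 0, -1, 0, -1]
```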

Fastest way to fill multiple columns by a given condition on other columns in pandas

I'm working with a very long dataframe, so I'm looking for the fastest way to fill several columns at once given certain conditions. So let's say you have this
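
For reference, one common vectorised pattern is to build a single boolean mask and assign several columns from it with numpy.where, instead of a row-wise apply; the column names below are made up for illustration:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [1, 5, 10, 3], "b": [4, 2, 2, 9]})

# One mask, reused for every target column (vectorised, no per-row Python loop)
mask = df["a"] > df["b"]
df["winner"] = np.where(mask, "a", "b")
df["diff"] = np.where(mask, df["a"] - df["b"], df["b"] - df["a"])
print(df)
```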

What is the best practice for quickly accessing a large amount of data in the web browser?

I have this huge array of strings, saved in a JSON file on a remote server (the file size is 2MB and increasing). In my front-end code, I need to constantly lo

What is the performance of 100 processors capable of 2 GFLOPs running 2% sequential and 98% parallelizable code?

This is how I solved the following question; I want to be sure my solution is correct. A multiprocessor consists of 100 processors, each capable of a peak ex
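
The excerpt is cut off above, but this type of question is normally answered with Amdahl's law, speedup = 1 / (s + (1 - s) / N); a quick check under those numbers:

```python
# Amdahl's law: 100 processors, 2 GFLOPS each, 2% sequential / 98% parallelizable
n, per_cpu_gflops, serial = 100, 2.0, 0.02

speedup = 1 / (serial + (1 - serial) / n)   # ~33.56x over one processor
effective = speedup * per_cpu_gflops        # ~67.1 GFLOPS delivered
peak = n * per_cpu_gflops                   # 200 GFLOPS theoretical peak
print(f"speedup {speedup:.2f}x, ~{effective:.1f} GFLOPS of {peak:.0f} GFLOPS peak")
```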

Performance issue in PowerShell 5.1

I have a PowerShell script that reads and parses a text file. The file is read into memory and then processed line by line. When I switched from PowerShell 4.0

InnoDB Write Log efficiency is too high, more than 100% (1953.15%)?

I have MariaDB on my server with 16/32 CPU cores. Everything seems to be OK when running mysqltuner except InnoDB Write Log efficiency, taking 1953.15%, wonderi

What is the fastest way to split an Integer into digits?

I do a lot of operations that involve splitting numbers into separate digits, putting the digits in an ArrayList and passing these digits one by one to another ArrayList for furt
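
The question itself is about Java and ArrayList, but the trade-off being asked about (arithmetic divmod versus string conversion) is easy to sketch; here it is in Python, the language used for the other examples on this page:

```python
def digits_arithmetic(n):
    """Split a non-negative integer into digits with divmod, no string conversion."""
    if n == 0:
        return [0]
    out = []
    while n:
        n, d = divmod(n, 10)
        out.append(d)
    return out[::-1]

def digits_string(n):
    """Split via string conversion; shorter to write, not always faster."""
    return [int(c) for c in str(n)]

print(digits_arithmetic(90210), digits_string(90210))  # [9, 0, 2, 1, 0] [9, 0, 2, 1, 0]
```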

Performance improvements for split screen with scroll overflow

Edit: Turns out the performance problems only happen during development. Once Gatsby builds the project, everything runs well. I'd be curious to hear if anyone

Can I make my scraping (pandas read_html) script faster?

I have a very simple script; it just scrapes some tables off the internet and inserts them into a db. However, tickerlist contains about 8000 rows, and the script
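
A hedged sketch of the usual speed-ups for this pattern: fetch the pages concurrently and write to the database in one batch, instead of one read_html call and one INSERT per ticker. The URL pattern and table layout below are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor
import sqlite3
import pandas as pd

tickerlist = ["AAA", "BBB", "CCC"]  # stands in for the ~8000-row list

def fetch(ticker):
    url = f"https://example.com/quotes/{ticker}"        # hypothetical URL pattern
    return pd.read_html(url)[0].assign(ticker=ticker)   # first table on the page

# Network-bound work parallelises well with threads
with ThreadPoolExecutor(max_workers=16) as pool:
    frames = list(pool.map(fetch, tickerlist))

# One batched insert instead of thousands of tiny ones
con = sqlite3.connect("quotes.db")
pd.concat(frames, ignore_index=True).to_sql("quotes", con, if_exists="append", index=False)
```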

DELETE CASCADE in PostgreSQL extremely slow

I have a table "studies" with 14 subtables that are connected to "studies" via a foreign key that refers to the primary key in "studies". I need to delete all r

Profile CPU Usage of DSP objects in Realtime

Description: Our application is based on a DSP synthesizer, mostly used to create music, written in C, and I want to create a system-wide feature to give v

PySpark performance tuning - to cache or not to cache?

I am trying to speed up the calculations from multiple operations that I am adding as columns in a PySpark DataFrame, when I found the sparkbyexamples article
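
For what it's worth, the usual rule of thumb is that cache() only pays off when the same DataFrame feeds more than one action; a minimal sketch:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.range(1_000_000).withColumn("x", F.rand())

# Several derived columns on one DataFrame that will be used by multiple actions
enriched = df.withColumn("x2", F.col("x") * 2).withColumn("x3", F.col("x") * 3)
enriched.cache()                            # materialised lazily, on the first action

print(enriched.count())                     # first action: computes and caches
print(enriched.agg(F.sum("x2")).first())    # second action: served from the cache
enriched.unpersist()                        # release the memory when finished
```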

Lighthouse Performance improvement for Flutter [Scroll]

So I've built a static website using Flutter (web app) which is hosted on Firebase Hosting. I checked my Lighthouse report using DevTools in Chrome, and I get t

Speed up millions of regex replacements in Python 3

I have two lists: a list of about 750K "sentences" (long strings) and a list of about 20K "words" that I would like to delete from my 750K sentences. So, I have to l
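
The classic answer here is to compile one alternation with word boundaries and apply it once per sentence, instead of looping over 20K separate re.sub calls; a small sketch with placeholder data:

```python
import re

banned_words = ["foo", "bar", "baz"]                           # stands in for the ~20K words
sentences = ["foo went to the bar", "nothing to remove here"]  # stands in for the 750K sentences

# One compiled regex: \b(word1|word2|...)\b, with every word escaped
pattern = re.compile(r"\b(?:" + "|".join(map(re.escape, banned_words)) + r")\b")
cleaned = [pattern.sub("", s) for s in sentences]
print(cleaned)
```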

Google Lighthouse Performance Calculator Formula

I'm trying to mimic the Lighthouse performance calculator, but I don't know the formula for it. Say the weighted percentages from their website are: FCP: 10%, SI: 10%
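
Assuming the question is about the final aggregation step: the overall Performance score is a weighted average of the individual metric scores (each already 0-100). The per-metric scores come from Lighthouse's log-normal scoring curves, which the sketch below does not reproduce; the weights and metric scores are example values:

```python
# Example weights (FCP/SI 10% as quoted in the question; the rest are illustrative)
weights = {"FCP": 0.10, "SI": 0.10, "LCP": 0.25, "TTI": 0.10, "TBT": 0.30, "CLS": 0.15}
metric_scores = {"FCP": 90, "SI": 80, "LCP": 70, "TTI": 95, "TBT": 60, "CLS": 100}

overall = sum(weights[m] * metric_scores[m] for m in weights)
print(round(overall))  # 77 for these example inputs
```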

Loading CSV with fread stops because of a too-large string

This is the command I'm using: dallData <- fread("data.csv", showProgress = TRUE, colClasses = c(rep("NULL", 2), "character", rep("NULL", 37))) but I get t

Python: Print string in reverse

Write a program that takes in a line of text as input, and outputs that line of text in reverse. The program repeats, ending when the user enters "Done", "done"
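
A minimal sketch of the exercise as quoted; the list of stop words is cut off above, so only the two that are visible are handled here:

```python
# Echo each input line reversed until a stop word is entered
while True:
    line = input()
    if line in ("Done", "done"):
        break
    print(line[::-1])
```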

Group multiple HTTP samplers as one transaction, and have constant throughput

In my application, I would like to combine a group of HTTP samplers as one transaction and have a constant throughput on the group, i.e. when I created a threa

I have a problem with compressing my images; please advise me on the best way. This is my code

I am trying to develop an image_compressor web project. I am confused about the best image type for faster page loading speeds and best compression practices. P
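
One common server-side approach, sketched with Pillow under the assumption that re-encoding as WebP with a quality cap is acceptable; the paths, sizes and quality below are placeholders:

```python
from PIL import Image

def compress(src_path, dst_path, max_width=1600, quality=80):
    """Downscale oversized images and re-encode them as lossy WebP."""
    img = Image.open(src_path)
    if img.width > max_width:
        ratio = max_width / img.width
        img = img.resize((max_width, int(img.height * ratio)))
    img.save(dst_path, "WEBP", quality=quality)

compress("photo.png", "photo.webp")
```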