I asked something similar recently and it got solved by using AsQueryable: How to save time when pulling data from DB using Entity Framework Core in controller
I'm working on an implementation of Binary Search in Python as part of a course (Algorithmic Toolbox on Coursera). The challenge is to create an implementation
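For reference, a minimal iterative sketch of such an implementation (the function name and the assumption that the input list is already sorted ascending are mine, not taken from the course):

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1      # target can only be in the right half
        else:
            high = mid - 1     # target can only be in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
print(binary_search([1, 3, 5, 7, 9], 4))  # -1
```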
I'm working with a very long dataframe, so I'm looking for the fastest way to fill several columns at once given certain conditions. Let's say you have this
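Since the example frame is cut off in the excerpt, here is a minimal sketch with a hypothetical single-column frame; the general idea is vectorised, column-by-column assignment rather than any row-wise loop or apply():

```python
import numpy as np
import pandas as pd

# Hypothetical data; the real frame and conditions are not shown in the excerpt.
df = pd.DataFrame({"score": np.random.randint(0, 100, 1_000_000)})

# Compute the condition once, then fill each derived column in a vectorised way.
passed = df["score"] >= 50
df["passed"] = passed
df["grade"] = np.where(passed, "pass", "fail")
```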
I have this huge array of strings, saved in a JSON file on a remote server (the file size is 2MB and increasing). In my front-end code, I need to constantly lo
This is how I solved the following question; I want to be sure my solution is correct. A multiprocessor consists of 100 processors, each capable of a peak ex
I have a PowerShell script that reads and parses a text file. The file is read into memory and then processed line by line. When I switched from PowerShell 4.0
I have MariaDB on my server with 16/32 CPU cores. Everything seems to be OK when running mysqltuner except InnoDB Write Log efficiency, at 1953.15%, wonderi
I do a lot of operations that involve splitting numbers into separate digits, putting the digits in an ArrayList, and passing these digits one by one to another ArrayList for furt
Edit: Turns out the performance problems only happen during development. Once Gatsby builds the project, everything runs well. I'd be curious to hear if anyone
I have a very simple script; it just scrapes some tables off the internet and inserts them into a DB. However, tickerlist contains about 8000 rows, and the script
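A common fix for this pattern is to batch all inserts into a single transaction instead of committing once per scraped row; a minimal sketch, assuming sqlite3 and placeholder table and column names (the real schema is not shown in the excerpt):

```python
import sqlite3

def insert_rows(rows):
    """Insert every scraped row in one transaction instead of committing per row."""
    with sqlite3.connect("quotes.db") as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS quotes (ticker TEXT, price REAL)")
        conn.executemany("INSERT INTO quotes VALUES (?, ?)", rows)
    # the 'with' block commits exactly once on success

# placeholder rows standing in for the ~8000 scraped entries
insert_rows([("AAPL", 195.3), ("MSFT", 420.1)])
```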
I have a table "studies" with 14 subtables that are connected to "studies" via a foreign key that refers to the primary key in "studies". I need to delete all r
Description: Our application is based on a DSP synthesizer, mostly used to create music, written in C, and I want to create a system-wide feature to give v
I am trying to speed up the calculations from multiple operations that I am adding as columns to a PySpark DataFrame, which is when I found the sparkbyexamples article
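One common way to speed this up is to add all derived columns in a single select() instead of chaining many withColumn() calls; a minimal sketch with placeholder column names and expressions (the real ones are not shown in the excerpt):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, 2.0), (3, 4.0)], ["a", "b"])

# One select() carrying every derived expression, instead of chained withColumn().
result = df.select(
    "*",
    (F.col("a") + F.col("b")).alias("a_plus_b"),
    (F.col("a") * F.col("b")).alias("a_times_b"),
)
result.show()
```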
So I've built a static website using Flutter (web app), which is hosted on Firebase Hosting. I checked my Lighthouse report using Chrome DevTools, and I get t
I have two lists: a list of about 750K "sentences" (long strings) and a list of about 20K "words" that I would like to delete from my 750K sentences. So, I have to l
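One common approach is to compile the 20K words into a single alternation regex once and apply it to each sentence, rather than looping over every word for every sentence; a minimal sketch with stand-in data:

```python
import re

sentences = ["the quick brown fox", "lorem ipsum dolor"]  # stand-ins for the 750K strings
banned = {"quick", "ipsum"}                               # stand-ins for the 20K words

# Build one word-boundary alternation pattern up front.
pattern = re.compile(r"\b(?:" + "|".join(map(re.escape, sorted(banned))) + r")\b")
cleaned = [pattern.sub("", s) for s in sentences]
print(cleaned)  # ['the  brown fox', 'lorem  dolor']
```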
I'm trying to mimic the Lighthouse performance calculator, but I don't know the formula for it. Say the weighted percentages from their website are: FCP: 10%, SI: 10%
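The overall performance score is a weighted average of the per-metric scores (each metric's raw value is first mapped to a 0-100 score); a minimal sketch in which the weights beyond FCP and SI, and all the metric scores, are made-up placeholders:

```python
# Only the FCP and SI weights come from the question; the rest, and all of the
# metric scores, are placeholder values for illustration.
weights = {"FCP": 0.10, "SI": 0.10, "LCP": 0.25, "TBT": 0.30, "CLS": 0.25}
metric_scores = {"FCP": 92, "SI": 88, "LCP": 75, "TBT": 60, "CLS": 100}

overall = sum(weights[m] * metric_scores[m] for m in weights)
print(round(overall))  # 80
```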
This is the command I'm using: dallData <- fread("data.csv", showProgress = TRUE, colClasses = c(rep("NULL", 2), "character", rep("NULL", 37))), but I get t
Write a program that takes in a line of text as input, and outputs that line of text in reverse. The program repeats, ending when the user enters "Done", "done"
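A minimal sketch of the exercise, reversing each line with slicing and stopping on a sentinel word (the excerpt is cut off, so only the two sentinel values it shows are handled here):

```python
# Read lines until a sentinel is entered; add any other sentinels the full
# prompt lists to the tuple below.
while True:
    line = input()
    if line in ("Done", "done"):
        break
    print(line[::-1])  # slicing with a step of -1 reverses the string
```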
In my application, I would like to combine a group of HTTP samplers as one transaction, and have a constant throughput on the group, i.e. when I created a threa
I am trying to develop an image_compressor web project. I am confused about the best image type for faster page loading speeds and best compression practices. P
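As a server-side illustration of one compression practice, a minimal Pillow sketch that re-encodes an image as lossy WebP, which typically comes out smaller than a comparable-quality JPEG or PNG (the file names and quality setting are placeholders):

```python
from PIL import Image

# Placeholder file names; 'quality' trades file size against visual fidelity.
with Image.open("photo.png") as img:
    img.save("photo.webp", format="WEBP", quality=80)
```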