Goroutines in a loop - function call
Description of the problem
I want to make a concurrent call on each iteration of my for loop.
Current Code
This program tries to find valid directories on the server:
package main

import (
    "bufio"
    "fmt"
    "net/http"
    "os"
)

func main() {
    var site string
    if len(os.Args) > 2 {
        site = os.Args[1]
    } else {
        fmt.Println("Need the host. Example > http://www.google.com/")
        fmt.Println("wfuzz http://www.google.com/ wordlist.txt")
        os.Exit(1)
    }
    f, err := os.Open(os.Args[2])
    if err != nil {
        fmt.Println("error opening file!", err)
        os.Exit(1)
    }
    scanner := bufio.NewScanner(f)
    for scanner.Scan() {
        a := site + scanner.Text()
        get_page(a)
    }
    if err := scanner.Err(); err != nil {
        fmt.Println("error", err)
    }
    // wait for a keypress before exiting
    var input string
    fmt.Scanln(&input)
    fmt.Println("done")
}

// get_page fetches the URL and reports it when the path exists (non-404)
func get_page(site string) string {
    a := "\n"
    res, err := http.Get(site)
    if err != nil {
        fmt.Println("error:", err)
        return a
    }
    defer res.Body.Close()
    if res.StatusCode != 404 {
        fmt.Println("Page Found!! +::", site, " ::+", res.StatusCode)
    }
    return a
}
Goal to Achieve
What I want to achieve is to make the calls inside my for loop concurrent; that would speed up the process, which is the end goal for the current program.
Solution 1:[1]
Using a combination of the Go standard library's sync.WaitGroup and channels, you can accomplish this without having to explicitly keep track of the number of words in the file you wish to read from.
Modify your get_page func (side note: it is not idiomatic Go to use underscores in func names -> https://golang.org/doc/effective_go.html#names) to accept an additional channel and wait group, and you could do something like the following:
package main

import (
    "bufio"
    "fmt"
    "net/http"
    "os"
    "sync"
)

func main() {
    // get the command line arguments from os.Args as in your previous code
    if len(os.Args) < 3 {
        fmt.Println("usage: wfuzz http://www.google.com/ wordlist.txt")
        os.Exit(1)
    }
    site := os.Args[1]
    f, err := os.Open(os.Args[2])
    if err != nil {
        fmt.Println("error opening file!", err)
        os.Exit(1)
    }

    // create a WaitGroup
    wg := new(sync.WaitGroup)
    // create a channel that you can write your results to
    resultChan := make(chan string)
    scanner := bufio.NewScanner(f)
    for scanner.Scan() {
        // increment the waitgroup
        wg.Add(1)
        a := site + scanner.Text()
        // call getPage with the http target, the result chan, and the waitgroup
        go getPage(a, resultChan, wg)
    }
    if err := scanner.Err(); err != nil {
        // just report the error; the goroutine below closes the result chan once all
        // in-flight calls finish (for more robust cancellation of currently running
        // goroutines, take a look at https://blog.golang.org/pipelines)
        fmt.Println("error", err)
    }
    // start another goroutine to wait for all concurrent calls to getPage to complete,
    // then close the result channel so the range loop below can exit
    go func() {
        wg.Wait()
        close(resultChan)
    }()
    for result := range resultChan {
        // deal with the result in some way
        fmt.Println(result)
    }
    fmt.Println("done")
}

// side note - it is not idiomatic Go to use underscores in func names
func getPage(site string, ch chan<- string, wg *sync.WaitGroup) {
    // decrement the waitgroup when the operation completes
    defer wg.Done()
    res, err := http.Get(site)
    if err != nil {
        ch <- fmt.Sprintf("error: %s", err)
        return
    }
    defer res.Body.Close()
    if res.StatusCode != 404 {
        ch <- fmt.Sprintf("Page Found!! +:: %s ::+ %d", site, res.StatusCode)
    }
}
Solution 2:[2]
You can use sync.WaitGroup.
The only problem is that you have to know how many concurrent calls you are going to make.
Example: if wordlist.txt contains this
admin.php
admin
administrator
administrator
you know that you will make 4 concurrent calls.
If the content is
admin.php
admin/
administrator/
administrator/
administrator/
administrator/
you know you will make 6 concurrent calls.
The code below was written for a file of 3 words:
package main

import (
    "bufio"
    "fmt"
    "net/http"
    "os"
    "sync"
)

func main() {
    var site string
    var wg sync.WaitGroup
    if len(os.Args) > 2 {
        site = os.Args[1]
    } else {
        fmt.Println("Need the host. Example > http://www.google.com/")
        fmt.Println("wfuzz http://www.google.com/ wordlist.txt")
        os.Exit(1)
    }
    f, err := os.Open(os.Args[2])
    if err != nil {
        fmt.Println("error opening file!", err)
        os.Exit(1)
    }
    wg.Add(3) // TODO check the number of words in the file
    scanner := bufio.NewScanner(f)
    for scanner.Scan() {
        a := site + scanner.Text()
        go get_page(a, &wg)
    }
    if err := scanner.Err(); err != nil {
        fmt.Println("error", err)
    }
    wg.Wait()
    fmt.Println("done")
}

func get_page(site string, wg *sync.WaitGroup) string {
    a := "\n"
    defer wg.Done()
    res, err := http.Get(site)
    if err != nil {
        fmt.Println("error:", err)
        return a
    }
    defer res.Body.Close()
    if res.StatusCode != 404 {
        fmt.Println("Page Found!! +::", site, " ::+", res.StatusCode)
    }
    return a
}
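One way to address the TODO above, while keeping this solution's approach of sizing the WaitGroup up front, is to read the wordlist into a slice first and call wg.Add(len(words)). This is only a sketch under that assumption; the simpler alternative is to call wg.Add(1) inside the loop, as Solution 1 does.

package main

import (
    "bufio"
    "fmt"
    "net/http"
    "os"
    "sync"
)

func main() {
    if len(os.Args) < 3 {
        fmt.Println("usage: wfuzz http://www.google.com/ wordlist.txt")
        os.Exit(1)
    }
    site := os.Args[1]

    f, err := os.Open(os.Args[2])
    if err != nil {
        fmt.Println("error opening file!", err)
        os.Exit(1)
    }
    defer f.Close()

    // read the whole wordlist first so we know exactly how many calls we will make
    var words []string
    scanner := bufio.NewScanner(f)
    for scanner.Scan() {
        words = append(words, scanner.Text())
    }
    if err := scanner.Err(); err != nil {
        fmt.Println("error", err)
    }

    var wg sync.WaitGroup
    wg.Add(len(words)) // one Done() expected per word, no hard-coded count
    for _, w := range words {
        go get_page(site+w, &wg)
    }
    wg.Wait()
    fmt.Println("done")
}

func get_page(site string, wg *sync.WaitGroup) {
    defer wg.Done()
    res, err := http.Get(site)
    if err != nil {
        fmt.Println("error:", err)
        return
    }
    defer res.Body.Close()
    if res.StatusCode != 404 {
        fmt.Println("Page Found!! +::", site, " ::+", res.StatusCode)
    }
}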
Solution 3:[3]
You can use go get_page(a), which starts a goroutine for each word in the wordlist. To cap the number of concurrent calls, use a worker pool (https://gobyexample.com/worker-pools); a sketch follows.
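Here is a minimal sketch of that worker-pool idea, assuming the same command-line interface as the code above; numWorkers = 10 is an illustrative cap, not a value from the original answer. A fixed set of worker goroutines reads URLs from a jobs channel, so at most numWorkers requests are in flight at any time.

package main

import (
    "bufio"
    "fmt"
    "net/http"
    "os"
    "sync"
)

// worker reads URLs from jobs and checks each one; the fixed number of workers
// caps how many requests are in flight at once
func worker(jobs <-chan string, wg *sync.WaitGroup) {
    defer wg.Done()
    for url := range jobs {
        res, err := http.Get(url)
        if err != nil {
            fmt.Println("error:", err)
            continue
        }
        if res.StatusCode != 404 {
            fmt.Println("Page Found!! +::", url, " ::+", res.StatusCode)
        }
        res.Body.Close()
    }
}

func main() {
    if len(os.Args) < 3 {
        fmt.Println("usage: wfuzz http://www.google.com/ wordlist.txt")
        os.Exit(1)
    }
    site := os.Args[1]

    f, err := os.Open(os.Args[2])
    if err != nil {
        fmt.Println("error opening file!", err)
        os.Exit(1)
    }
    defer f.Close()

    const numWorkers = 10 // assumed cap on concurrent requests
    jobs := make(chan string)

    var wg sync.WaitGroup
    for i := 0; i < numWorkers; i++ {
        wg.Add(1)
        go worker(jobs, &wg)
    }

    scanner := bufio.NewScanner(f)
    for scanner.Scan() {
        jobs <- site + scanner.Text()
    }
    if err := scanner.Err(); err != nil {
        fmt.Println("error", err)
    }
    close(jobs) // no more work; workers exit once the channel drains
    wg.Wait()
    fmt.Println("done")
}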
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | Carlos Andrés García |
| Solution 3 | Nima Ghotbi |
