I'm beginning to learn Go and I would like some advice on the following program.
package main
import (
    "fmt"
    "net/http"
    "time"
)
const BenchmarkTry = 1000
type PageBenchmark struct {
    url  string
    time int64 // milliseconds
}
func execBenchmark(url string, channel chan PageBenchmark) {
    totalExecTimeChan := make(chan int64, BenchmarkTry) // set size to prevent blocked goroutine
    totalExecTime := int64(0)
    // start all the goroutines
    for i := 0; i < BenchmarkTry; i++ {
        go execHttpRequest(url, totalExecTimeChan)
    }
    // collect the values sent on totalExecTimeChan by execHttpRequest() and add them to the total
    for i := 0; i < BenchmarkTry; i++ {
        totalExecTime += <-totalExecTimeChan // waiting to get a value from one of the goroutines started in the previous for loop
    }
    channel <- PageBenchmark{url, totalExecTime / BenchmarkTry}
}
// exec http request and attach exec time to channel
func execHttpRequest(url string, channel chan int64) {
    begin := time.Now()
    resp, err := http.Get(url)
    if err == nil {
        resp.Body.Close() // close the body so the underlying connection can be reused
    }
    channel <- time.Since(begin).Nanoseconds() / 1000000 // convert to milliseconds
}
func main() {
    sites := [...]string{
    }
    pages := [...]string{
    }
    benchmarkChan := make(chan PageBenchmark, len(sites)*len(pages)) // set size to prevent blocked goroutine
    begin := time.Now()
    fmt.Println("Beginning !")
    // start all the goroutines
    for _, site := range sites {
        for _, page := range pages {
            go execBenchmark(site+page, benchmarkChan)
        }
    }
    // receive each PageBenchmark from benchmarkChan and print it
    for i := 0; i < len(sites)*len(pages); i++ {
        benchmark := <-benchmarkChan
        fmt.Printf("Url : %v\nResponse Time : %d ms\n\n", benchmark.url, benchmark.time)
    }
    // print execution time
    fmt.Println("End.")
    fmt.Printf("%d ms\n", time.Since(begin).Nanoseconds()/1000000)
}
Basically, I'm making HTTP GET requests to multiple URLs across multiple websites: 1000 requests per URL in this example.
For now, I just want to start the goroutines in this order:
- Main: start a benchmark goroutine for each page of each website, then collect and print the average execution times.
- Page goroutine: start 1000 goroutines (benchmark tries), read the execution times from a channel, and send the average execution time on another channel.
- Request goroutine: execute an HTTP request against the page and send its execution time on a channel.
This piece of code works, but there may be things I'm missing.
- Is the execution scheduling valid?
- Is it appropriate to use defined size channels here?
- Is there a better/more effective way to achieve this task?
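For context on the buffered-channel question: a send on an unbuffered channel blocks until a receiver is ready, which is the behaviour the "set size to prevent blocked goroutine" comments are avoiding. A standalone illustration (not taken from the program above):

```go
package main

import "fmt"

func main() {
	// Buffered channel of capacity 2: both sends succeed even though
	// no receiver is waiting yet. With an unbuffered channel the first
	// send would block forever here and the program would deadlock.
	ch := make(chan int, 2)
	ch <- 1
	ch <- 2
	fmt.Println(<-ch, <-ch) // prints "1 2" (channels are FIFO)
}
```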