
[Pprof profiling snapshot]

I have a critical user-facing Go API that calls an internal API. The internal API's response time is under 200ms, but unmarshalling its response with goccy/go-json takes about 500ms. The response contains roughly 1k objects, so I don't think it should take this long, though I'm not sure.

NOTE: I was previously using Go's default encoding/json, which took 1.5s just to unmarshal. Migrating to goccy reduced that to 500ms, but I think it is still quite high given that the API response time is 200ms. The API call code is attached below.

func (a *AllePimGatewayImpl) GetProductsByIDsBulk(productIDs []int) []models.AlleProductDTO {
    if len(productIDs) == 0 {
        return []models.AlleProductDTO{}
    }

    headers := map[string]string{
        "Content-Type":  "application/json",
        "Authorization": fmt.Sprintf("Bearer %s", a.authToken),
    }

    req := PimBulkIDRequest{IDs: productIDs}

    resp, err := a.client.R().
        SetHeaders(headers).
        SetBody(req).
        Post(a.config.PimConfig.BaseUrl + "/pim/api/v1/products/bulk")

    if err != nil {
        log.WithError(err).Errorf("Error calling PIM API to get products by IDs bulk: %d IDs", len(productIDs))
        return []models.AlleProductDTO{}
    }

    if resp.StatusCode() != http.StatusOK {
        log.Errorf("Received non-200 status from PIM API: %s for products bulk request", resp.Status())
        return []models.AlleProductDTO{}
    }
    var response PimProductsBulkResponse
    err = json.Unmarshal(resp.Body(), &response)
    if err != nil {
        log.WithError(err).Errorf("Error unmarshaling products bulk response")
        return []models.AlleProductDTO{}
    }

    return response.Products
}
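
To isolate the decode cost from the HTTP call, json.Unmarshal can be benchmarked against a captured response body along these lines (a sketch; testdata/bulk_response.json is a placeholder path, and PimProductsBulkResponse is the same type as in the snippet above):

package gateway // placeholder: whatever package GetProductsByIDsBulk lives in

import (
    "os"
    "testing"

    json "github.com/goccy/go-json"
)

// BenchmarkUnmarshalBulkResponse measures only the unmarshal step, so the
// ~200ms HTTP call is excluded from the numbers.
func BenchmarkUnmarshalBulkResponse(b *testing.B) {
    body, err := os.ReadFile("testdata/bulk_response.json") // captured API response
    if err != nil {
        b.Fatal(err)
    }
    b.ReportAllocs()
    b.ResetTimer()
    for i := 0; i < b.N; i++ {
        var response PimProductsBulkResponse
        if err := json.Unmarshal(body, &response); err != nil {
            b.Fatal(err)
        }
    }
}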

I tried goccy/go-json, easyjson, and stream decoding; goccy was the fastest.
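
The stream-decoding attempt looked roughly like this (a sketch using net/http directly, since I'm not sure how to get an unbuffered body out of the resty client; goccy's Decoder mirrors the encoding/json API):

import (
    "bytes"
    "fmt"
    "net/http"

    json "github.com/goccy/go-json"
)

// Variant of the call above that decodes straight from the response body
// instead of buffering it into a []byte first.
func (a *AllePimGatewayImpl) getProductsByIDsBulkStreaming(productIDs []int) ([]models.AlleProductDTO, error) {
    payload, err := json.Marshal(PimBulkIDRequest{IDs: productIDs})
    if err != nil {
        return nil, err
    }

    httpReq, err := http.NewRequest(http.MethodPost,
        a.config.PimConfig.BaseUrl+"/pim/api/v1/products/bulk",
        bytes.NewReader(payload))
    if err != nil {
        return nil, err
    }
    httpReq.Header.Set("Content-Type", "application/json")
    httpReq.Header.Set("Authorization", "Bearer "+a.authToken)

    resp, err := http.DefaultClient.Do(httpReq)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        return nil, fmt.Errorf("non-200 status from PIM API: %s", resp.Status)
    }

    var response PimProductsBulkResponse
    if err := json.NewDecoder(resp.Body).Decode(&response); err != nil {
        return nil, err
    }
    return response.Products, nil
}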

This is an internal API that I have control over, so if there is a change that can be made on the other side of the API to reduce this overhead, please suggest that too.

  • Can you include the definitions of the structures? Otherwise this is all just guesswork based on a very small snippet of code, to be honest. - Commented Oct 8 at 10:10
  • This does not appear to be a request for a code review. It should not have been migrated here. - Commented Oct 8 at 12:08
  • I'm voting to close this question because this does not seem to be a request for a code review. - Commented Oct 8 at 12:21

1 Answer


I am not providing a code review because you are clearly not asking for one.

It is great to see that you tried alternative decoders; that shows you are applying some software engineering to the problem. Since implementation-level improvements are not getting you far enough, it may be worth stepping back to the design phase. Some promising directions are:

  • design a query method other than "bulk", e.g. paginate or let callers request only the fields they actually need,
  • cache the data,
  • use a different encoding than JSON (see the sketch after this list).
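
As an illustration of the last point: since both sides of this internal API are under your control, it could speak a binary codec such as encoding/gob from the standard library instead of JSON. A rough sketch of both sides follows; ProductDTO stands in for models.AlleProductDTO, and the handler and function names are illustrative, not existing code.

package sketch

import (
    "encoding/gob"
    "io"
    "net/http"
)

// ProductDTO stands in for models.AlleProductDTO; gob only needs the
// fields to be exported, which DTO fields usually are.
type ProductDTO struct {
    ID   int
    Name string
}

// lookupProducts is a stub for the existing lookup logic on the PIM side.
func lookupProducts(r *http.Request) []ProductDTO { return []ProductDTO{} }

// Server side: write the bulk response as gob instead of JSON.
func bulkHandler(w http.ResponseWriter, r *http.Request) {
    products := lookupProducts(r)
    w.Header().Set("Content-Type", "application/x-gob")
    if err := gob.NewEncoder(w).Encode(products); err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
    }
}

// Client side: decode straight from the response body.
func decodeBulk(body io.Reader) ([]ProductDTO, error) {
    var products []ProductDTO
    err := gob.NewDecoder(body).Decode(&products)
    return products, err
}

The same idea applies to other binary codecs (protobuf, msgpack, and so on); the point is to stop parsing roughly 1k objects of JSON text on every request. Caching would sit in front of the whole call and is independent of the wire format.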