The usual practice is to make a distinction between OLTP for day-to-day transaction processing, where users need fast response times, and OLAP for in-depth analytics (dashboards, reporting), which requires more resources but can tolerate longer response times.
OLAP generally uses different data structures (e.g. cubes, snowflake schemas) that aim at boosting the performance of analytical queries. A simpler approach is to replicate the OLTP database for reporting purposes. While this may have performance limits for analytics due to sub-optimal data structures, it is relatively inexpensive (especially if you already have the dashboards) and somewhat scalable, as you could envisage more replicas (ideally feeding the additional replicas from a first replica, to keep the OLTP resource consumption for replication to a minimum).
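One way to exploit such a replica without touching the OLTP side is to route queries by workload at the application level. Here is a minimal sketch; the DSNs, host names and the `dsn_for` helper are illustrative assumptions, not part of any particular library:

```python
import itertools

# Hypothetical connection strings: one primary for transactional work,
# read-only replicas for reporting/dashboard queries.
PRIMARY_DSN = "postgresql://primary.example/app"
REPLICA_DSNS = [
    "postgresql://replica1.example/app",
    "postgresql://replica2.example/app",
]

_replica_cycle = itertools.cycle(REPLICA_DSNS)

def dsn_for(workload: str) -> str:
    """Return the DSN to use: writes and transactions hit the primary,
    reporting reads are spread round-robin over the replicas."""
    if workload == "reporting":
        return next(_replica_cycle)
    return PRIMARY_DSN
```

The dashboards then open their connections via `dsn_for("reporting")`, so adding a replica only means extending the list.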
Data warehouses are fed periodically (every night, every four hours, every hour), preferably by incrementally loading new/changed data instead of performing a full load. But real-time replication is also possible.
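An incremental load is often implemented with a high-water mark. The sketch below assumes the source table carries an `updated_at` column usable for change detection; the table and column names are illustrative, and SQLite stands in for both systems:

```python
import sqlite3

# Source (OLTP) side, with a timestamp we can use as a high-water mark.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?,?,?)",
                [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-01-02")])

# Warehouse side: same table plus a watermark recording the last load.
dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at TEXT)")
dwh.execute("CREATE TABLE etl_watermark (last_loaded TEXT)")
dwh.execute("INSERT INTO etl_watermark VALUES ('1970-01-01')")

def incremental_load() -> int:
    """Pull only rows newer than the last successful load; return the count."""
    (wm,) = dwh.execute("SELECT last_loaded FROM etl_watermark").fetchone()
    rows = src.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
        (wm,)).fetchall()
    if rows:
        dwh.executemany("INSERT INTO orders VALUES (?,?,?)", rows)
        dwh.execute("UPDATE etl_watermark SET last_loaded = ?",
                    (max(r[2] for r in rows),))
    return len(rows)
```

A first run loads both rows; a second run with no new source data loads nothing, which is exactly what makes this cheaper than a full reload.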
Another approach is to use the CQRS architecture. But in your case, this would require substantial reengineering, as you'd have to feed, in real time, a second data model optimised for the dashboards. And since you may frequently enrich your dashboard sets, this is not necessarily the best option for you.
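To make the reengineering cost concrete, here is a hedged toy sketch of the CQRS idea: commands append events on the write side, and a projector maintains a separate, denormalised read model shaped for the dashboard. All names (`SaleRecorded`, `sales_by_region`, etc.) are illustrative assumptions:

```python
from collections import defaultdict

events = []                           # write side: append-only event log
sales_by_region = defaultdict(float)  # read side: dashboard-ready totals

def handle_record_sale(region: str, amount: float) -> None:
    """Command handler: append an event to the write model."""
    event = {"type": "SaleRecorded", "region": region, "amount": amount}
    events.append(event)
    project(event)  # in a real system this projection runs asynchronously

def project(event: dict) -> None:
    """Projector: keep the denormalised read model up to date."""
    if event["type"] == "SaleRecorded":
        sales_by_region[event["region"]] += event["amount"]

def dashboard_totals() -> dict:
    """Query handler: read the precomputed model directly, no joins."""
    return dict(sales_by_region)
```

The catch this illustrates is the maintenance burden: every new dashboard metric means a new projection (and usually a replay of past events), which is why frequent dashboard changes make CQRS less attractive here.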