The problem with the term "cloud-based grid computing" is that it conflates two concepts that IT professionals often treat as distinct.
In common usage, the terms "cloud" and "grid" are seen as two separate and competing ideas. For example, the term "pre-cloud grid" refers to the trend during the 1990s and early 2000s when IT operations started to move away from big mainframe systems by linking together large sets of Internet-connected or LAN-connected computers. Grid computing, in general, involves using these distributed pieces of hardware to pursue common goals that require a high degree of processing power — think of large statistical research projects or the real-time collection of complex data.
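The grid idea described above — many machines splitting one large computation and pooling their results — can be sketched in miniature. In this illustrative Python example, local threads stand in for what, in a real grid, would be separate LAN- or Internet-connected machines; the function names (`partial_sum`, `grid_style_sum`) are invented for the sketch and do not come from any particular grid framework.

```python
# A miniature sketch of the grid idea: split one large computation
# across many workers, then combine the partial results.
# In a real grid, each worker would be a separate networked machine;
# here, local threads stand in for those nodes.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Work unit handed to a single 'node': sum the squares of its chunk."""
    return sum(x * x for x in chunk)

def grid_style_sum(data, workers=4):
    """Split the data into chunks, farm them out, and merge the results."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(grid_style_sum(list(range(1_000))))  # → 332833500
```

The answer is the same as a single machine would produce; the point is only that the work units are independent, which is what lets a grid spread them across many boxes.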
Some take the juxtaposition of grid and cloud a step further and weigh the benefits and disadvantages of each. In general, the consensus is that grid computing as a whole can offer high performance, but public cloud computing and other cloud computing systems can offer more scalability — the ability to respond to changes in demand. Some IT experts argue that the cloud has eclipsed the grid because, instead of building out hardware systems, clients can simply order on-demand resources from cloud providers.
Technically, cloud-based grid computing may feature a number of hardware elements working together on large projects on a public cloud platform. It is also worth noting that many people who discuss cloud vs. grid point out that grid computing is often done by large institutions or government agencies, whereas cloud computing is a go-to method for the average enterprise.