Larry Ullman's Book Forums

earflappin

Larry, I am loving your book. I learned PHP on my own a couple of years ago and have been quite productive, but I never felt I had a full grasp of OOP. Your book quickly helped me understand the core concepts. Thanks!

So here is my question. I have an application where a user is allowed to configure complex queries using a point-and-click configurator. This eliminates the need for the user to understand SQL and allows us to keep the underlying schema confidential. The user input is used on the backend to generate the actual MySQL query, which can involve multiple table joins. The output can be (1) downloaded to the browser as a CSV file, (2) sent to an SFTP or CIFS mount as a CSV file, or (3) sent to a remote database. Before the data can be exported, various manipulations must be performed on it. Some of these queries can generate very large data sets, which would preclude holding them in memory.

So my question is this: what is considered PHP best practice for this type of use case? Should the large data set be written to a file using a single query, after which it can be manipulated and then exported in the appropriate format? Or should a temporary database table be used? A slice technique (i.e., running the query in batches) seems problematic to me, as it looks expensive compared to a single query. Thanks for your insights on this.
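For context, here is a minimal sketch of what I mean by the single-query, stream-to-file approach, assuming PDO against MySQL with buffering disabled; the connection details, query, column names, and file path are just placeholders:

```php
<?php
// Sketch: stream a large result set to a CSV file without holding
// it all in memory. Assumes the PDO MySQL driver; DSN, credentials,
// query, and output path below are hypothetical.
$pdo = new PDO('mysql:host=localhost;dbname=example;charset=utf8mb4', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Disable buffering so rows are pulled from the server as they are
// fetched, instead of the whole result set being loaded into PHP.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT id, name, total FROM orders'); // hypothetical query

$out = fopen('/tmp/export.csv', 'w');
$headerWritten = false;

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    if (!$headerWritten) {
        // Write column headers once, taken from the first row's keys.
        fputcsv($out, array_keys($row));
        $headerWritten = true;
    }
    // Any per-row manipulation would happen here before writing.
    fputcsv($out, $row);
}

fclose($out);
```

With buffering disabled, memory use stays roughly constant regardless of the result size, since only one row is in PHP at a time; the trade-off is that the connection is tied up until the result set has been fully read.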