Publication events must be initiated from the Blueprint Analyst. This happens automatically, once per day, per the schedule in the publication settings, or manually through the Blueprint Administrator application. Publication is not a simple SQL task; it is a 'chained' event that requires interaction between the Blueprint Analyst and the SQL database.
Publication does not need to occur more than once per day, especially given that downstream collection agents (e.g., Print Scouts, Collectors) only upload data once per day to the Analyst. If you want today's publication data to be an accurate reflection of yesterday's activity, arrange upload times in a logical flow: Print Scouts should upload data throughout the business day; Collectors should upload data in the evening, after normal business hours; publication can then be scheduled to begin early the next morning to aggregate all the data received the previous day.
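To make the timing relationship concrete, here is a minimal sketch of that daily flow as plain scheduling logic. The times and names below are illustrative assumptions, not Blueprint defaults or any Blueprint API; the point is only the ordering: Scout uploads finish before the Collector upload on day N, and publication runs in the small hours of day N+1, before the next business day begins.

```python
from datetime import time

# Hypothetical example times (NOT Blueprint defaults):
schedule = {
    "print_scout_uploads": time(9, 0),   # Scouts upload throughout the business day
    "collector_upload": time(20, 0),     # Collector uploads in the evening, after hours
    "publication": time(4, 0),           # publication runs early the next morning
}

def publication_covers_previous_day(sched):
    """Check the recommended ordering: Scout uploads precede the evening
    Collector upload, and next-morning publication precedes the start of
    the following business day."""
    return (sched["print_scout_uploads"] < sched["collector_upload"]
            and sched["publication"] < sched["print_scout_uploads"])

print(publication_covers_previous_day(schedule))  # True for this example
```

With this ordering, a report run at any point during the business day reflects the complete, aggregated activity of the previous day.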
On a final note, avoid publishing data too frequently, especially during normal business hours, when the operational overhead on both the SQL server and the Analyst server may create observable delays in other operations, for example, secure print operations. Too-frequent publication may also fragment the database, causing queries to take incrementally longer to execute.
In summary, Blueprint is not intended to be a real-time reporting tool. When schedules are configured properly, Blueprint does provide very accurate reporting for activity that occurred one day prior. In other words, reports run today can accurately reflect the activity that occurred yesterday.
Can you help to explain the challenge you are trying to solve? Perhaps we may be able to offer a better approach that does not require publication.
Thank you for such detailed information!
Too bad Blueprint is not real-time software. Your competitors offer nice-looking features that allow for instant monitoring.
Having worked with IBM Lotus/Domino, I see many pros in the way you handle data (the whole Analyst/Collector design), but when Blueprint is shown in demo rooms it looks like an old disco phone next to a modern smartphone.
You have not provided the "what problem are you trying to solve?" answer that Tim (and I) were looking for. Real-time data collection is not typically necessary when the objective is cost reduction, cost recovery (billing back to individuals or departments), data exposure (documents left in output trays or inadvertently picked up by the wrong individual), or general device access control.

I have been in the Document Accounting space (as a consumer of said data, as a reseller of said data, and now as a provider of said data) for 18 years, across many different solutions (not just Pharos), and if a customer is attempting to manage any of the previously stated objectives in a "just in time" model, they are trying to manage the wrong thing and will soon be frustrated by any software package.

However, if the customer wants immediate data for forensic/security purposes, the "just in time" model is appropriate. But in truth, no document accounting solution (no matter how "disco" or "smart") is really designed for that type of need; a data loss prevention (DLP) package will better fit that bill, because it can also capture the document contents, whereas accounting software captures only the document name.
That being said, there are ways to speed up the process built in by the Blueprint development team. They are:
- Force immediate upload of Print Scout data. Use the Print Scout Settings options to set the job batch size to "1" and force an upload whenever the batch limit is reached. At the same time, you can expand the upload window to run from 12:00 am to 11:59 pm, computer local time.
- Use the Collector's "Manual Batch Send" option. On the Collector(s) hosting Print Scouts, launch Blueprint Administrator and go to Collector > Statistics. Click the Export button in the top right corner; this causes the Collector to gather its current batch file and ready it for sending. Once the export item appears in the panel, click the Transfer All button to send it to the Analyst, where it is imported.
- Manually publish the data. On the Analyst (or a remote Blueprint Administrator installation), go to Reporting > Publications and choose Publish to Data Warehouse. Select the "...equivalent of the nightly analysis" option and start it. Since the default republication period is 7 days, you may want to reduce it to the minimum 4-day publication range before starting the manual publication.
Again, a proper vetting of the customer need is the appropriate first step in determining how best to approach this topic. Often, a "just in time" request comes after the customer sees the type of data we capture (and the ways we represent it). I was recently involved in a customer interaction where "something bad" potentially happened regarding print, and we were able to get the data needed to complete the investigation quickly (within an hour). I look forward to hearing more about your customers' needs and how we can help.
Thank you, Scott, for the comments on the pros and cons of real-time vs. decentralized systems!
Without getting into lengthy and mostly unnecessary existential philosophy on how the choice is made, let me just say: this is not a problem any customer has. It is just that in a demo room, where an all-in-one hardware/software solution is presented, there is no wow factor when it comes to Blueprint. Or maybe I am just not aware of one. Other products, meanwhile, brag about their real-time graphs, which makes them stand out. Again, this is not a crucial feature for the final decision, but it gets the word around.
P.S. All I asked for was a way to start the publication process with ps.