Friday, December 17, 2010

2 bits 4 bits 64 bits a dollar

The good news: "Mr Newby your new 64 bit Windows-7 8 gig package is ready for you."
The bad news: "Program Files or Program Files (x86)" and "How about that WOW6432Node Eh!"
Sooooooo, I've been installing and uninstalling, tweaking vbproj files, futzing with Visual Build scripts, messing about with InstallShield 2010 scripts, and repeating, ad nauseam.
But if you are a developer loading up Manage 2000 7.3 sp2 web tools on a 64-bit Windows 7 box and you run into issues, be thankful; it could be much, much worse. The sp2 installs are now much more solid.
The most likely annoyance will be dataset references pointing into "Program Files" that can't be resolved because the dataset libraries are now in "Program Files (x86)". You can, of course, simply delete them and then re-add them from the correct location on your box.
Or you can copy the typed libraries en masse from Program Files (x86) to Program Files. You just need to remember to copy over the DLLs anytime you regenerate them through the dataset utility.
Another option is to use the Path Conversion Utility to do a mass change on all the vbproj files in a Manage 2000 web site. This makes sense if all of your production and development web servers are 64-bit.
I have also added a PreBuild.cmd script at the Manage2000.vbproj level that you can wire in under advanced compile options as "..\..\..\PreBuild.cmd" for a more surgical approach. As you build projects it converts references from hardcoded paths to "$(ProgramFiles)", which will load and build in either a 64- or 32-bit environment. If it finds references that must be changed it will prompt you to reload the project during the build; otherwise it lets the build continue normally.
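To give a feel for what the conversion does, here is a hypothetical before-and-after of a single reference in a vbproj file (the assembly name and folder are made up for illustration; your actual references will vary):

<!-- before: hardcoded path, resolves only where the libraries actually live -->
<Reference Include="SomeDataSetLib">
  <HintPath>C:\Program Files\ROI\DataSets\SomeDataSetLib.dll</HintPath>
</Reference>

<!-- after: $(ProgramFiles) resolves correctly in either environment -->
<Reference Include="SomeDataSetLib">
  <HintPath>$(ProgramFiles)\ROI\DataSets\SomeDataSetLib.dll</HintPath>
</Reference>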

Saturday, November 6, 2010

Manage 2000 7.4 Web - The Beginnings

Manage 2000 7.3 field release is well underway and I have finally gotten some of the bug reports I have been begging for since Beta. Yes, be careful what you wish for. But each one is a treasure that will make 7.3 sp2 a better, more solid product when it launches in Q1 of 2011.

Perspectives 2010 is done and now I can turn my attention to some serious development work for Manage 2000 7.4. My first major task is to build a foundation for web update functions that will be as nice for keyboard-oriented users as PWS functions, and as quick for the developer to build.

There have been 4 or 5 update web functions in Manage 2000 since release 7.1, and the BTO framework has always supported update business object processes. But it is not an easy task to build an update web function, and the user experience falters, particularly for keyboard-oriented folks in scrolled set scenarios.

It is remarkable how effective one level of de-normalization in the user interface can be. When you look around at the thousands of Manage 2000 PWS functions, almost all of the screens are a collection of single-valued prompts with a few scrolled sets. The application developer can express the modeling of complex business processes quite efficiently with this one level of de-normalization, taking the user through multiple screens/pages traversing deeper business processing levels. And I wonder if this does not reflect a characteristic of human perception: to be interested in a set of data and its immediate context, but no further.

Using grids to implement scrolled sets is like converting an all-terrain vehicle into a riding lawn mower; sure, it can be done, but is it the most practical approach? The programming model is too complex and the performance too poor. What is needed is a much leaner model that does only what is required for scrolled set entry and does not try to be a generalized, do-everything two-dimensional container.

So here I am re-casting the SCROLL.MAINT logic Doug wrote 20 years ago into an HTML/JavaScript mold. Am I going backwards, or am I bringing those things that worked and added value into the Dot World?

Tuesday, September 21, 2010

REST Web Services from UniBasic

In the midst of working up some labs for Perspectives I started thinking about REST web service access from UniBasic. The lab I happened to be working up is on SOAP-based RPC web service usage, for which we have a number of working examples and implemented projects.

But in creating services for Manage 2000 internal consumption I have all but abandoned SOAP RPC in favor of the far more elegant JSON REST model. I am usually working in ASP.NET and IE DOM client land, thankfully with the prototype.js library. So the question of the day was what would REST JSON access look like from UniBasic and what are the central issues in using it?

Well, in addition to all the SOAPRequest support added with the UniBasic Extensions, there are a couple of simple little functions for retrieving the content from a URL. Now URLs often return HTML intended for human consumption, which is not very easy to parse. The JSON REST model, however, returns very nice, orderly string serializations.

With something as simple as:

* Build an HTTP request against the service URL plus query string
CREATE.REQUEST.RTN.CODE = createRequest(URL:"?":QS, HTTP.METHOD, HTTP.REQUEST.HANDLE)
* Submit it; the serialized response lands in HTTP.RESPONSE.DATA
SUBMIT.REQUEST.RTN.CODE = submitRequest(HTTP.REQUEST.HANDLE, '', '', HTTP.RESPONSE.HEADERS, HTTP.RESPONSE.DATA, HTTP.RESPONSE.STATUS)

you can get orderly responses like:

HTTP.RESPONSE.DATA = {'oValItem':{'FileName':'CM', 'TableNbr':'', 'ItemId':'1024', 'NewItemId':'1024', 'Valid':'True', 'Display':'Sears Systems, Incorporated', 'ErrorFlag':'False', 'ErrorMsg':''}}

UniData does not yet include a JSON DOM to match the XML DOM of the UniBasic Extensions, but the parsing difficulties of JSON serializations are relatively minor. And if you need to directly access JSON REST services in the middle of UniBasic code, this seems like a nice, direct approach.

The real stumbling blocks aren't coding difficulties, but the security of your application server if you open up the HTTP or HTTPS ports for UniBasic to "see" services on the Internet. One answer is to go through a proxy server. This also points out one of the benefits of the Manage 2000 architecture: its ASP.NET arm may be used in a similar manner to delegate interaction with the messy world away from your closely guarded business database and into a DMZ area.

Thursday, September 2, 2010

Dynamic Dropdowns and TM JSON arrays

Whether you call it RIA or Web 2.0 or AJAX-enabled, there is an evolutionary process taking place on the web these days, with UIs morphing from BLOCK-TERM-like full-page postbacks to much more granular, dynamic changes in the page as the user interacts with it. And you can certainly see the effects of this progression in Manage 2000 release 7.3.

A developer friend asked if I preferred coding in VB in the code-behind ("code-beside" in the new parlance) or on the client using javascript and such. This is one of those questions that makes my brain churn for a while.

The central conclusion I finally came to was that if dynamically changing control configurations as the user progresses through a page (like reloading dropdown options based on previous answers they've selected) makes the UI more convenient, then I prefer taking manual control on the client using javascript. I still use code-behind and aspx templates to push the HTML out in the first place, but then shift to javascript-AJAX-JSON-DHTML to make the user interface reactive and dynamic, so that it responds more intelligently to the conversation the user is having with it without pausing to re-run major Page construction code.

So how does one go about dynamically loading an HTML SELECT with options from a Manage 2000 TM table?

I have added a new service called GetTMTable in /mt/JSONServices for just this purpose, using much of the same code as my last post:

Private Function GetTMTableAsJSONArray(ByVal context As HttpContext) As System.Text.StringBuilder
    Dim TableNbr As String = context.Request.QueryString("TableNbr")
    Dim result As New System.Text.StringBuilder
    Dim arTableEntries As New System.Collections.Generic.List(Of Array)
    Dim ds As ROISystems.Components.roiDataSet
    Dim TableMaster As New ROISystems.WebControls.roiTableMaster
    ds = TableMaster.GetTable(TableNbr)
    For Each entry As DataRow In ds.Tables("VALIDATION_Validation_Info").Rows
        Dim row() As String = {entry.Item("Code"), entry.Item("Desc")}
        arTableEntries.Add(row)
    Next
    Dim JSONSerializer As New System.Web.Script.Serialization.JavaScriptSerializer
    result.Append(JSONSerializer.Serialize(arTableEntries))
    Return result
End Function
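For context, here is a minimal sketch of the kind of .ashx plumbing that hands this result back to the browser. The class wiring below is illustrative, not the actual JSONServices code; note that serving the application/json content type is what lets prototype.js populate transport.responseJSON for you on the client.

Public Class GetTMTable
    Implements System.Web.IHttpHandler

    Public Sub ProcessRequest(ByVal context As System.Web.HttpContext) Implements System.Web.IHttpHandler.ProcessRequest
        ' emit the serialized table entries as JSON
        context.Response.ContentType = "application/json"
        context.Response.Write(GetTMTableAsJSONArray(context).ToString())
    End Sub

    Public ReadOnly Property IsReusable() As Boolean Implements System.Web.IHttpHandler.IsReusable
        Get
            Return False
        End Get
    End Property
End Class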

Here is the js portion of my test code.
function LoadTable() {
    var Site = document.location.pathname.Field('/', 2, 1);
    var svcUrl = document.location.protocol
        + '//' + document.location.host
        + '/' + Site + '/MT/JSONServices/GetTMTable.ashx';
    var TableNbr = $F('TableNbr');
    var qs = 'TableNbr=' + TableNbr + '&Cid=' + $F('hedtcid');
    new Ajax.Request(svcUrl + '?' + qs, {
        method: 'get', asynchronous: false,
        onSuccess: function(transport) {
            var arTableEntries = transport.responseJSON;
            // clear and reload the dropdown with the new table
            $('ddlTMTable').options.length = 0;
            arTableEntries.each(function(item) {
                var opt = document.createElement('option');
                opt.text = item[1];
                opt.value = item[0];
                $('ddlTMTable').options.add(opt);
            });
        }
    });
}


The result is blindingly fast reloads of the dropdown list from various tables.

When you do finally post back you will run into some MS security checking unless you disable event validation, either in the page declaration in the aspx file:

<%@ Page EnableEventValidation="false" %>

or in a web.config setting:

<pages enableEventValidation="false" />

You may also run into occasional confusion during postback on the part of the webcontrols code trying to figure out why what is coming back doesn't match what was sent out. If it gets to be a problem, you can either just use plain ol' HTML controls or check the Request.Form("lbID") array directly.
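For instance, a hypothetical one-liner for your postback handling code, reading the posted selection straight off the form collection (the ddlTMTable id comes from the example above; adjust to your control's rendered name):

' hypothetical: bypass the webcontrol and read the raw posted value
Dim SelectedCode As String = Request.Form("ddlTMTable")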

Unfortunately I had to change the roiTableMaster control to remove a dependency on roiPage so that it would work out of JSONServices. This makes it difficult to patch, but it is all better for 7.3 sp2.

In the meantime you could, of course, implement a version of /mt/JSONServices that descends from roiPage rather than IHttpHandler; it would just have all the application overhead that roiPage carries around.

Wednesday, July 21, 2010

JSON Conversions

So, I am integrating a provided web page into a Manage 2000 site and I need to supply this external page with a JSON array of data on the querystring based on the contents of a Manage 2000 TM Table.

How to get a JSON serialization out to the client world?

There is a very nice little namespace that I have not previously run across, System.Web.Script.Serialization. And in it you will find a JavaScriptSerializer class (read JSON serializer!).

With the JavaScriptSerializer you can convert a .Net Hashtable or Dictionary to or from a JSON object, a .Net System.Array to or from a JSON array, and a bunch of other mappings, including your own.
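For example, a minimal round trip with a generic dictionary looks something like this (the table code and description here are made up for illustration):

Dim Serializer As New System.Web.Script.Serialization.JavaScriptSerializer
Dim Entry As New System.Collections.Generic.Dictionary(Of String, String)
Entry.Add("Code", "10")          ' hypothetical TM table code
Entry.Add("Desc", "Net 10 Days") ' hypothetical description
' serialize to a JSON object: {"Code":"10","Desc":"Net 10 Days"}
Dim JSON As String = Serializer.Serialize(Entry)
' and deserialize back into a dictionary
Dim RoundTrip As System.Collections.Generic.Dictionary(Of String, String) = _
    Serializer.Deserialize(Of System.Collections.Generic.Dictionary(Of String, String))(JSON)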

In my case I want to end up with a JSON array of elements with each element comprised of an array of code description pairs.

Private Function GetTMTableAsJSONArray(ByVal TableNbr As String) As System.Text.StringBuilder
    Dim result As New System.Text.StringBuilder
    Dim TM As New System.Collections.Generic.List(Of Array)
    Dim TableMaster As New ROISystems.WebControls.roiTableMaster
    Dim ds As ROISystems.Components.roiDataSet = TableMaster.GetTable(TableNbr)
    For Each entry As DataRow In ds.Tables("VALIDATION_Validation_Info").Rows
        Dim row() As String = {"", ""}
        row(0) = entry.Item("Code")
        row(1) = entry.Item("Desc")
        TM.Add(row)
    Next
    Dim JSONSerializer As New System.Web.Script.Serialization.JavaScriptSerializer
    result.Append(JSONSerializer.Serialize(TM))
    Return result
End Function
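With a hypothetical two-entry table, the serialized result is exactly the shape the external page wants, a JSON array of code/description pairs:

[["10","Net 10 Days"],["30","Net 30 Days"]]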

Yes, the JavaScriptSerializer is my new favorite toy for transforming data during client side AJAX activity.

Friday, July 9, 2010

Hyper Activity

A recent treasure from our first live Manage 2000 7.3 site led me back to researching performance issues in the Pegged Detail page of ItemActivity. This has been a long-standing issue in the field for certain customers, on certain parts, under certain circumstances.

Previous optimizations have included converting dynamic arrays to dimensioned arrays when handling the TD_ITEM_PEG_ACT_RESULT file items. But even with attributes stored in separate dimensioned elements, the immense number of values that may be generated in the real world overwhelms the UniData box, and then moving the gigantic business object to the web server overwhelms that box. The resulting user experience "just sucks".

The real detail work of analyzing pegged detail is done in SUB.BUILD.ITEM.ACT.PEGGED, and entails building a number of attributes for each detail and sorting by date and by a peculiar transaction type order. And there is no good way of separating the selection and sorting of keys from the generation of detail data, as described in the PAGED.BTO document, because the whole point is to keep a running total of availability and a number of calculated and summarized values.

To meet these requirements and scale to many thousands of lines of detail, I created a process work file keyed by date and sequence number and wrote simple flat records for each detail. Then SSELECT the work file and update the small, simple records with the running-total fields. This replaces a LOCATE and INSERT loop that hockey-sticks as the value count exceeds a thousand (each LOCATE scans, and each INSERT reshuffles, an ever-larger dynamic array, so the cost grows quadratically). And finally, write the results in pages to the WEB_COOKIE_DATA file, where each page record contains only, say, 25 (the user's current items-per-page preference) values for each attribute.

The business object returns only the 1st page of pegged detail and the cookie where the rest of the pages may be found. The web page may then use another business object to read any page of the result the user wishes to view.

Where the business object is handling 3,000-4,000 details, the whole-view multi-valued set approach was taking 6+ seconds on a modern Itanium UNIX box. The work file approach reduced this to less than half a second. And testing shows a performance curve up to 20,000 details running at about 7,000 details per second on our development box.

On the web side the performance improvement is even more dramatic, as the dataset and grid processing on large local datasets just overwhelms the web server processor. Eliminating the large business object result, and asking the web server to create objects representing only a single page of the pegged detail, results in near-instantaneous page changes even with the trip back to the application server to pick up the page data.

Conventional wisdom says memory is fast, disk is slow. But in this particular scenario it is much more efficient to create a work file, populate it, SSELECT it and work with small flat dynamic arrays than to follow the standard path and attempt in memory sorting of large deep dynamic arrays.

Thursday, June 3, 2010

Sub Valued Level Prompting Details

How do you fix up the screen and do cross referencing and other validations when you use SUB.VALUE.PROMPTING on a SUB.MT500 screen?

Here is an example where we want to prompt for multiple file names and within each file for multiple item ids.

Instead of calling SUB.VALUE.PROMPTING directly from the SCREEN, call your subroutine which in turn will call SUB.VALUE.PROMPTING.

Remember to add right-justified fields using the PWS screen in SCREEN.BUILD if you want the count of sub-valued items to line up nicely.


You can add logic to execute only at the sub-valued level by checking X_Data_2 for the FROM.SCROLL.MAINT flag.

*
2200* Before prompt logic for Item_Attachments
*
   IF INDEX(Prompt.X_Data_2, 'FROM.SCROLL.MAINT', 1) > 0 THEN
      * Handle inner event from SUB.VALUE.PROMPTING execution of SCROLL.MAINT
      * Reset SIP.VAL.FILE to enable cross referencing
      FILE.VALUE = FIELD(Prompt.Display_Text_2, ",", 2)
      FILE.LIST = MAIL.FILE.File_Attachments
      CURRENT.FILE = FILE.LIST<1,FILE.VALUE>
      CALL GET.DB.FILE(SIP.VAL.FILE, CURRENT.FILE)
      ERROR = 0
      RETURN
   END
   * Handle MAILBOX.ATT prompt event
   GOSUB 2210 ;* Fixup Prompt Label with SUB.VALUE level display text
   * Setup XREF stuff
   Prompt.Conversions_Edits = '0X':@SVM:Prompt.Conversions_Edits
   * remove compiled edits to force recompile
   Prompt.Conversions_Edits = FIELD(Prompt.Conversions_Edits,roiDataMark4,1)
   CALL GET.DB.FILE(SIP.VAL.FILE, CURRENT.FILE)
   CALL SUB.VALUE.PROMPTING(ANSWER,SUB.DATA,P.NBR,PMT,SAVE.FN,VALUE)
END
*
RETURN

If you want to fix up details like the screen labels and prompt text while at the sub-valued level, you can do this sort of thing:

*
2210* Display SubValue Level Label Text
*
   SET.NBR = Prompt.Scroll_Field_Type[2,2]
   LOCATE SET.NBR IN SCROLL.DATA<9,1> SETTING PRIMARY.IDX ELSE RETURN
   PRIMARY.PMT.NBR = SCROLL.DATA<1,PRIMARY.IDX>
   PRIMARY.PMT = PID(PRIMARY.PMT.NBR)
   STARTING.ROW = PRIMARY.PMT<1,7>
   HEAD.ROW = STARTING.ROW-1
   DISP.MASK = "L#":Prompt.Display_Length
   ID.LABEL = XLATE("DICT ":CURRENT.FILE, "F0", 61, "X")
   IF ID.LABEL = "" THEN ID.LABEL = "Item Id"
   LABEL.TEXT = FMT(P.NBR,"2\0R"):".":VALUE:' ':ID.LABEL
   Prompt.Text = "Enter ":ID.LABEL
   REDIS.MISC<10> = L(HEAD.ROW):C(PMT<1,6>):LABEL.TEXT DISP.MASK
   PRINT REDIS.MISC<10>:
   RETURN

For validation after prompting you can check the X_Data_2 flag again: if you are at the multi-valued level, simply request a repaint with ERROR = 1000; at the sub-valued level, actually carry out input validations programmatically. Remember that your sub-valued level validation logic is being called from SCROLL.MAINT, so do not use ERROR codes like 200 or 1000 as you would for SUB.MT500; simply return 0 or 1.

*
3200* After prompt logic for Item_Attachments
*
   IF INDEX(Prompt.X_Data_2, 'FROM.SCROLL.MAINT', 1) = 0 THEN
      * Remove SV Label Text
      REDIS.MISC<10> = ''
      ERROR = 1000
   END ELSE
      IF NOT(SIPDATA) THEN RETURN
      FILE.VALUE = FIELD(Prompt.Display_Text_2, ",", 2)
      FILE.LIST = MAIL.FILE.File_Attachments
      CURRENT.FILE = FILE.LIST<1,FILE.VALUE>
      REC = XLATE(CURRENT.FILE, ANSWER, -1, 'X')
      IF REC = '' THEN
         * Message 502: %1 item %2 not on file
         CALL SCREEN.MSG(GetMessageText(502,CURRENT.FILE:@VM:ANSWER,0):";H;#M446065")
         ERROR = 1
      END
   END
   RETURN

Monday, April 26, 2010

7.3 sp1 GA Cutoff Week

Well, midnight Thursday is the end of the line for 7.3 sp1 development. I've been cleaning and testing and sweeping up crumbs. My latest clean-up binge has been the DEMO.DATA. The lack of examples of web feeds has been bothering me for the last couple of years, so I bit the bullet and started creating a series of example web feeds for customers, and sales reps, and executives. It's not the be-all and end-all of demo material, but at least it will be there in the SSP account. And it has brought out a number of crumbs to clean up :)

I've got one or two more bugs (legitimately called treasures if I can fix them before they get to the field) to clean up before the Thursday cutoff. But I am feeling really good about the Manage 2000 7.3 release. It just has a ton of good stuff.

Wednesday, March 17, 2010

The Wonderful World of Wizards

It has been an exhilarating and somewhat exhausting spring here in Minneapolis; from 2+ feet of snow pack to clear yards and 60-degree sunshine in 2-3 weeks. Getting betas underway has not been fast enough or clean enough for my impatient expectations. But that's why you have betas: to find the stumbling blocks. While waiting to enhance as-yet unidentified pre-enhancement conditions, I have been working on a pet project to create web function wizards with more specific generation capabilities.

My first revisit to IWizard has resulted in a modest little wizard that will help you generate a HyperQuery web function. The HyperQuery control allows configuration of a REPORT.BUILD-like, query-based web function.

The second undertaking turned out to be a lot more interesting and a lot more work. The BTO Inquiry Wizard lets you select a business object, select from its available fields, and generate a working inquiry with all the data access components and controls configured, with a FormView containing labels and textboxes for the single-valued fields, and GridView controls for each set.

The great part about wizards is, of course, that you can take the results and enhance the heck out of them. They provide RAD starts to developing your own web functions while still leaving you in total control.