Sunday, May 31, 2009

Lent my iPhone to my sister; nephew threw hers down the toilet, literally... Ouch!

Saturday, May 30, 2009

designing new CDISC Builder logo by combining two clipart images as a mashup. Do I need to get permissions? http://ping.fm/4PWq9

Thursday, May 28, 2009

Did not know SAS gives out real awards when you submit a paper... very nice :) http://ping.fm/T33zC
Implementing Serious AE Software for a client in Ontario CA and visiting in Sept, first time, is it cold? http://ping.fm/NSpgw

Wednesday, May 27, 2009

Defining architecture of how an iPhone application can access SAS data and programs. http://ping.fm/tlwy2
Just joined CDISC Metadata Team, subteam of SDS, yahoo! Looking forward to Monday's meeting. http://ping.fm/Ir7nW

Tuesday, May 26, 2009

Attended my first CDISC SDS team meeting. Just the new kid on the block http://ping.fm/pmJSI

Sunday, May 24, 2009

just invited to join CDISC SDS team to work with SDTM and related submission models, very exciting... http://ping.fm/S8lfI
ah... my mistake. scheduled to teach CDISC class for pre-conference WUSS on Sunday, August 30th. http://ping.fm/9VLQk

Saturday, May 23, 2009

am scheduled to teach CDISC class at WUSS San Jose on Sunday, Aug 20 :) http://ping.fm/7Vtts
working on iPhone App to access SAS server and user manual, I should get a life... http://ping.fm/YCRQj

Monday, May 11, 2009

Data Integrity through DEFINE.XML

You can use the DEFINE.PDF/DEFINE.XML files that are created for electronic submission to review your own data. This review is usually performed by an independent reviewer outside of the development team. The fresh perspective of the reviewer adds a layer of redundancy that helps ensure the accuracy and integrity of your data, allowing you to catch discrepancies before they would otherwise be flagged during a review by regulatory agencies. The following steps help verify that your domain documentation is accurate and that the data it describes is consistent with it.

Step 1: Verify that hyperlinks, such as those to the external transport (XPT) files, point to the correct files. This confirms that the domain document itself contains accurate links.
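
A minimal sketch of one supporting check, confirming that the transport files referenced by the define document actually exist on disk; the submission folder and file names below are illustrative assumptions only:

data _null_;
   length xptfile $40 path $200;
   input xptfile $;
   /* assumed submission folder; adjust to the location the define file links to */
   path = cats("C:\submission\datasets\", xptfile);
   if fileexist(path) then put "NOTE: found " path;
   else put "CHECK: missing file " path;
   datalines;
dm.xpt
ae.xpt
lb.xpt
;
run;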

Step 2: In the dataset-level list at the top of the document, verify the key fields. Ensure that the following criteria are met:

  1. The key fields exist and are listed first in the list of variables.

  2. The dataset is sorted by the key fields (a quick programmatic check is sketched after this list).
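
A minimal sketch of that check, assuming an AE domain in a library named SDTM with keys STUDYID, USUBJID and AESEQ; these names are used here for illustration only:

/* VARNUM lists variables in stored order so the keys can be confirmed to be first */
proc contents data=sdtm.ae varnum;
run;

/* re-sort by the documented keys and compare against the original;    */
/* no differences reported means the source dataset was already sorted */
proc sort data=sdtm.ae out=work.ae_sorted;
   by studyid usubjid aeseq;
run;

proc compare base=sdtm.ae compare=work.ae_sorted;
run;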

Step 3: Verify all decoded formats. Verify that the values of the decodes match what was defined in the analysis plan or original case report form. Review the data for any values that do not match the formatted codes and therefore were not properly decoded.
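
A minimal sketch of one way to review this, assuming the collected code and its decode are both kept on the dataset; the variable names AESEVCD and AESEV are assumptions for illustration:

/* each code should pair with exactly one decode; the LIST and MISSING    */
/* options make stray or unmapped values easy to spot in a single listing */
proc freq data=sdtm.ae;
   tables aesevcd * aesev / list missing;
run;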


Step 4: All derived variables need to be verified. You can choose to do some or all of the following recommended verification tasks to ensure the integrity of the derived variables:

Code Review

Systematic review of program code pertaining to the derivation according to a predetermined checklist of verification criteria.

Code Testing

Test the SAS programs pertaining to the derivation by supplying valid and invalid inputs and verifying the expected output.

Log Evaluation

Evaluate the SAS log for errors, warnings and other unexpected messages.

Output Review

Visual or programmatic review of report outputs related to the derivation as compared to expected results.

Data Review

Review attributes and contents of output data for accuracy and integrity.

Duplicate Programming

Independent programming to produce the same derivation and output for comparison (a sketch combining this with log evaluation follows this list).
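
A minimal sketch of the duplicate-programming and log-evaluation tasks; the production dataset DERIVED.ADAE, the independently programmed copy WORK.QC_ADAE and the saved log file name are all assumptions for illustration:

/* compare the independently programmed dataset against production;    */
/* both datasets are assumed to be sorted by the documented key fields */
proc compare base=derived.adae compare=work.qc_adae listall criterion=1e-8;
   id usubjid aeseq;
run;

/* scan a saved log for problem messages */
data _null_;
   infile "C:\project\logs\adae.log" truncover;
   input logline $char256.;
   if index(logline, "ERROR") = 1 or index(logline, "WARNING") = 1 then
      put "Problem message found: " logline;
run;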



There are many tasks performed in the process of verifying and validating SAS programs to ensure the quality of your data. The significance of many of these tasks in maintaining the accuracy and integrity of the program logic and its output is easily overlooked. Their repetitive nature gives them a reputation as unglamorous grunt work that must be done to meet departmental SOPs. However, verification is an essential step, and it can be performed and directed by what is documented in the domain documentation.


File Formats
The domain documentation was originally specified as a DEFINE.PDF file. The PDF format is good in that it is not intended to be edited and can be viewed both on screen and on paper across many computing platforms. This makes it a good file format for the final electronic submission, but if it is used for other purposes, other formats may be more suitable. PDF is limited in that it is not extensible; you cannot add extra information. For example, if you wanted to store the user name and the date and time of the last update to a particular variable, you could not easily do this within the current DEFINE.PDF. The XML file format, however, is extensible, which allows you to add more information. The new standard therefore calls for the documentation to be stored in the more vendor-neutral, universal and extensible structure of DEFINE.XML.


If you are going to use the domain documentation for project management, the information can be stored in either an Excel spreadsheet or a Word document. For electronic submissions, DEFINE.XML or DEFINE.PDF is the better choice. There may be slight variations, but the main core information is the same across these files.
Since the content of the data is similar, file format becomes less significant. The file can be converted from one format to the other while maintaining all the same information. There are tools that will make this a transparent process. The goal is to make use of the information stored within the domain documentation and not be restricted by being forced to use one particular file format.

complete paper found at "Data Integrity through DEFINE.PDF and DEFINE.XML" and related DEFINE.XML Software.


Tuesday, May 5, 2009

Validation of the SAS System

Validating a new version of SAS on a production server used to be a daunting task. SAS System versions 9.1.3 and 9.2 ship with user-friendly installation qualification tools, and these are coupled with existing tools that make it easier to validate SAS. Besides qualifying the installation, there are other tasks and components of the system that need to be validated or verified. Some of these components include:
  1. Backward compatibility issues with older versions of datasets and format catalogs (a sketch of one such check follows this list).

  2. Validating multi-use macros and standardized code templates.

  3. Verifying stand-alone or project-specific programming and output.

  4. Effects on standard operating procedures and programming practices.
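
A minimal sketch of a backward-compatibility check for item 1, assuming the legacy SAS 8.2 data have been copied to one folder and re-saved under the new version in another; the paths, dataset name and catalog name are assumptions for illustration:

libname oldlib v8 "D:\migration\v8data";    /* read the version 8 datasets directly */
libname newlib    "D:\migration\v9data";    /* the same data saved under SAS 9      */

/* flag any value or attribute drift introduced by the migration */
proc compare base=oldlib.demog compare=newlib.demog listall;
run;

/* confirm the converted format catalog still contains every entry */
proc catalog catalog=newlib.formats;
   contents;
run;
quit;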

The interconnectedness of the SAS computing environment does require considerable effort when validating a SAS System. However, if this is executed successfully, it can allow for greater traceability between output, programs and source data. The performance qualification also sheds light on ways of optimizing the work and data flow of your computing environment. The many benefits of performing validation of the SAS System will outweigh the costs. In addition, it is a requirement within a regulated environment, so it pays to be prepared.

Validation Overview
Validation of a SAS system most commonly occurs during an upgrade from an older version of SAS or a move to a new platform. The examples used in this paper include migrating from SAS 8.2 to SAS 9.1.3 and moving from a legacy operating system to the Windows platform. In either case, similar validation challenges are confronted. It is recommended that you first acquire a global view of the system and identify the architecture. Only after gaining this perspective is it useful to zoom in on individual components. This allows you to assess the scope and interconnectedness of each component so that your validation efforts are balanced and thorough. Once the architecture is clearly understood, the requirements and functional specifications of each component are documented. These functional specifications then drive the validation testing.
It is important to follow these steps in a systematic and orderly fashion since they are interdependent. Documentation of each step in the validation process is also essential in capturing and proving that the validation effort was done properly. Besides documenting each step, it is also important to capture the traceability of each validation task. For each test case that is performed, there is an associated functional specification which then is connected to the requirements for a particular component of the system as a whole. The map or traceability matrix that ties all these validation components together is pivotal to an auditor. Proper documentation will make the difference between a successful validation audit and a complete failure.
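
A minimal sketch of such a traceability matrix kept as a SAS dataset; the variable names and entries below are illustrative assumptions, not a prescribed structure:

data work.trace_matrix;
   length req_id func_spec test_case result $ 20;
   input req_id $ func_spec $ test_case $ result $;
   datalines;
REQ-001 FS-001 TC-001 Pass
REQ-001 FS-002 TC-002 Pass
REQ-002 FS-003 TC-003 Fail
;
run;

/* the listing an auditor would review to trace tests back to requirements */
proc print data=work.trace_matrix noobs;
run;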

The main goal of the validation effort is to ensure that the installation and implementation of the SAS system and its associated tools function as intended by the vendor (SAS Institute) and your organization. In addition, the documentation of your validation effort will ensure the integrity of your computing environment and demonstrate compliance with regulatory requirements such as 21 CFR Part 11 within the biotechnology and pharmaceutical industry.

System Components
The first step in your validation effort is to understand what it is that you are working with. The SAS System, as delivered to you on a series of CDs, contains modules such as Base, Stat, Graph and other components of SAS. This, however, makes up only part of the system that you are implementing in your organization. The SAS software fits into a computing environment that interacts with other software and hardware. If you take into account all the associated hardware and software that SAS interacts with, that is what is considered the “SAS System” from a validation perspective. It is therefore important to take the right steps to identify and document all of these components.

Step 1: Identify all the hardware components of your computing environment. For example:

Hardware Component        Name
SAS Application Server    SASAPPSRV
SAS File Server           SASFILESRV
Client Desktops           CLIENTDSK


complete paper at "Validating Your SAS System" and related SAS Validation...



Monday, May 4, 2009

Bringing SAS ODS Output to the Web

The SAS® System gives you the ability to create a wide range of web-ready reports. This paper walks through a series of examples showing what you can do with Base SAS and when you need SAS/IntrNet. Starting with the simplest HTML reports, this paper shows how you can jazz up your output by using the STYLE= options, traffic-lighting and hyper-linking available in the reporting procedures: PRINT, REPORT, and TABULATE. With the use of SAS/IntrNet you can add functionality to reports with features such as drill-down links that are data-driven, and you can produce dynamic reports created on-the-fly for individual users. Using this technique, your clients can navigate to the exact information needed to fulfill your business objective.

SAS ODS Overview
In the SAS System, both the Output Delivery System (ODS) and SAS/IntrNet produce documents for viewing over the Internet. All SAS users have ODS because it is part of Base SAS, but SAS/IntrNet is a separate product which you must have installed in addition to Base SAS.

If all you want to do is produce reports and post them on the Internet for people to view, then you probably don't need SAS/IntrNet. With a few ODS statements you can send any SAS output to the HTML (Hyper Text Markup Language) destination. You can also change the way HTML output looks by choosing one of the built-in style definitions that come with Base SAS, or by creating your own style definition using the TEMPLATE procedure. (Unfortunately, we don't have room to cover PROC TEMPLATE in this paper. For more information on PROC TEMPLATE or ODS basic concepts, see Slaughter and Delwiche (2001).) Using the STYLE= option in the TABULATE, REPORT, and PRINT procedures, you can change the color, font, and many other features of reports. You can even insert images and hyperlinks.
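
A minimal sketch of that idea; the output file name, the SASWEB style and the SASHELP.CLASS example data are illustrative choices, not part of the paper:

/* open the HTML destination with one of the styles shipped with Base SAS */
ods html body="basic_report.html" style=sasweb;

proc print data=sashelp.class noobs;
run;

/* close the destination so the HTML file is complete */
ods html close;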

Using ODS to insert hyperlinks, you can create a pseudo-dynamic effect. When a person clicks on one of these hyperlinks, the browser takes them to a new page. While this has a dynamic feel, the new page is in fact static because you have created it in advance. To create truly dynamic reports, you need SAS/IntrNet.

With SAS/IntrNet you can create reports on the fly, based on the needs of individual users. The advantage of combining SAS/IntrNet with your ODS programs is that your program is dynamically executed when the user clicks on your hyperlink. This means that if your data is changing, the drill-down will capture the most up-to-date results. If your reports are not time dependent, the static approaches of generating HTML reports with ODS will suffice. However, if your report helps decision makers decide upon time-sensitive information, the marriage of ODS and SAS/IntrNet is the perfect solution.
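
A minimal sketch of an Application Dispatcher program under SAS/IntrNet; the PRODUCTION dataset and the &flavor parameter are assumptions for illustration. Name/value pairs from the URL arrive as macro variables, and output is returned to the browser through the reserved _WEBOUT fileref:

/* write the report back to the requesting browser */
ods html body=_webout style=sasweb;

proc print data=production noobs;
   where flavor = "&flavor";    /* subset built on the fly for this user */
run;

ods html close;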

Basic Table
For the first part of this paper we will be using basically the same table produced from PROC TABULATE to show you how you can use ODS and the various STYLE options to modify the look of the table. Here are the SAS statements that produce this basic table and the listing output is shown in Table 2.


/* define a format that converts flavor codes to their full names */
PROC FORMAT;
   VALUE $flav
      'P' = 'Pecan Pie'
      'B' = 'Banana Bash'
      'A' = 'Apple Spice'
      'M' = 'Mango'
      'C' = 'Choco Mint';
RUN;

TITLE 'Jelly Bean Production in 2001';
TITLE2 'Millions of Pounds';

/* summarize production in millions of pounds by factory and flavor */
PROC TABULATE DATA=production FORMAT=4.1;
   CLASS Factory Flavor;
   VAR MPounds;
   FORMAT Flavor $flav.;
   TABLE Flavor ALL,
         Factory*SUM=''*MPounds='' ALL*SUM=''*MPounds='';
RUN;
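
As a hedged illustration of the STYLE= and traffic-lighting ideas mentioned above (not necessarily the exact example in the full paper), the same data could be printed to HTML with a format driving the cell background; the 3.0 cutoff, the colors and the output file name are assumptions:

/* background color is chosen per data cell by the PRODFMT. format */
proc format;
   value prodfmt
      low -< 3.0 = 'white'
      3.0 - high = 'yellow';
run;

ods html body="production.html" style=sasweb;

proc print data=production noobs;
   var Factory Flavor;
   var MPounds / style=[background=prodfmt.];   /* traffic-lighting on the data cells */
   format Flavor $flav.;
run;

ods html close;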

complete paper found at "ODS Meets SAS/IntrNet?", related CDISC Software and CDISC Standards...

Friday, May 1, 2009

Effective Ways to Manage Coding Dictionaries

Coding dictionaries such as MedDRA and WHO Drug can be a challenge to manage with new versions and change control. This becomes even more difficult when different collaborators such as CROs deliver coded data with different coding decisions from various dictionaries. Reconciling these differences can prove to be very resource intensive. This paper will address these challenges and suggest techniques and tools that compare and report on the differences among dictionaries. It will demonstrate strategies for reconciling and managing changes for consistent coding of adverse events, concomitant medications and medical history.

Controlled Terminology Overview
Coding decisions for adverse events and medications are part science and part art. There is room for interpretation left up to the person deciding which preferred term or hierarchical System Organ Class (SOC) is associated with the verbatim term. This may differ slightly between projects with different drugs and indications. The difference in coding decisions is compounded when there is more than one person making the decision, and it is further exacerbated when the individuals work in different organizations, such as various CROs with different operating procedures. There are many variables contributing to different coding decisions, which creates a challenge for the data manager who needs to pull all these coding decisions into one coherent and consistent set of coded data for analysis and submission. This paper describes an approach to managing and reconciling these differences referred to as “ThesQA”, or Thesaurus Quality Assurance. The workflow of this methodology is shown here:



The first and pivotal step in the workflow is to manage all the dictionaries centrally by registering them. This is also referred to as “Setup”. Setup gives you the ability to track change control and manage the metadata pertaining to each dictionary. Once you have identified all the versions of the dictionaries and their related coding decisions and stored the information centrally, you can start to work towards reviewing and reconciling their differences. The goal is to manage all the changes that take place during updates while maintaining change control.



Dictionary Setup and Management
The first step in managing your dictionaries is to capture the metadata pertaining to each set of data. The metadata is stored in a SAS dataset so that it can be easily updated by SAS tools. An example view of the data would look like:


The SAS dataset named DICTDB, which stands for dictionary database, does not contain the actual values of the dictionary, but rather it captures information about each thesaurus dictionary to be managed. The following steps describe the approach towards setting up the dictionaries.
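
A minimal sketch of what the DICTDB metadata dataset might contain; the variable names, versions and sources below are illustrative assumptions rather than the paper's exact structure:

/* one row per registered dictionary version */
data work.dictdb;
   length dictname $20 version $10 source $30;
   informat effective date9.;
   format effective date9.;
   input dictname $ version $ source $ effective;
   datalines;
MedDRA  12.0   CRO_A    01MAR2009
MedDRA  11.1   In-house 15SEP2008
WHODrug 2009Q1 CRO_B    01JAN2009
;
run;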

complete paper at: "Effective Ways to Manage Thesaurus Dictionaries", AE Coding Software and Coding Dictionary.

