Friday, October 31, 2008

Video Blogging: a tool for development

By Brenda Zulu

Creating local language video with a translation in the national language text is important for increasing participation and sharing, observed Prince Deh, Assistant Country Director of the Ghana Information Network for Knowledge Sharing (GINKS).
Talking about the knowledge-sharing Web 2.0 participatory tool called video blogging, or vlogging, Deh observed that local language was important and that this was an area GINKS was going to explore, as all its videos were in English. Video blogging, or vlogging, means making videos and posting them on the Internet with the intention of getting a response from viewers.

The major challenges of vlogging were listed as connectivity or access, getting people to share information and knowledge, and the cost of equipment.
From his own view, Deh said Web 2.0 tools were important, even more so because of the deeper impact the tools would have on marginalized societies, even if that impact is not immediately felt.
He observed that many more rural communities have stories to share with the larger public and voices to amplify, and saw Web 2.0 tools as perfect applications to project the voices of the rural poor in the future.

“How do we solve the problem of rural connectivity in order to extend the benefits of Web2.0 tools much wider beyond the scope of the cities?” he asked.
He pointed out that it was important to have knowledge of video editing and innovativeness in order to create storytelling videos.

It was interesting to note that vlogs provided an alternative medium for presenting otherwise long stories or presentations in a very simple and attractive manner.
Deh explained that short videos have an added advantage in view of the visual component, which helps promote deeper understanding of stories, especially for people with less formal education or for people who do not have a reading habit. He added that videos also have the power to reach an unlimited audience at minimal cost, as the vlogging process does not require specialised expertise and can be easily managed by any non-technical person.

“It is technologically simpler and cheaper to maintain than a website. Video is an interactive medium, often encouraging viewers to comment,” said Deh.
GINKS' experience in using the video blog, and the importance of the tool in promoting information and knowledge sharing, has led the network to extend the use of the tool to capture some of the interesting experiences from a two-year research project the network undertook with a community in rural Ghana in collaboration with the International Development Research Centre (IDRC).

HBG Upgrades Digital Asset Management System with TeleScope Enterprise Solution

HBG USA (Hachette Book Group), one of the largest trade publishers in the U.S., has announced that it will upgrade its digital asset management (DAM) system with a solution powered by North Plains Systems Corp., a leading provider of digital asset management software.

The company plans to use the system called TeleScope Enterprise to manage rich media assets, such as Quark and Adobe InDesign files, for its base publishing business, and to integrate other content, such as audio and video, over time.

“We are streamlining workflow related to managing digital content, enabling our team to focus on creating and selling our content without extraneous processes impeding their work. In addition, as we develop new business ventures, TeleScope Enterprise will be an excellent resource, allowing us and our business partners to take full advantage of our digital content,” said Beth Ford, Chief Operating Officer of Hachette Book Group USA.

Ms. Ford added that the upgrade of HBG’s digital asset management system is one initiative in its ongoing digital strategy. This strategy includes investments in a state-of-the-art redesign of HBG’s website to be launched early next year, and its OpenBook program, which leverages technology to create online book widgets, offering readers a simple way to showcase their favorite book titles on their blogs and social networking sites without jeopardizing the copyrights of the authors.

“North Plains has a long legacy of success in providing rich media management solutions for businesses in the publishing industry,” said Hassan Kotob, President & CEO of North Plains Systems. “We’re extremely pleased that Hachette Book Group USA has selected TeleScope Enterprise to help streamline their publishing workflows, reduce overall costs, and provide foundational support for the growth of their business.”

The industry’s most advanced, feature-rich, scalable digital asset management platform, TeleScope Enterprise serves as the central repository for an organization’s digital media content, metadata and the business logic that turns content into currency. Leading organizations today rely on the TeleScope solution to provide an end-to-end digital media supply chain that supports the creation, integration and repurposing of rich media assets.

Founded in 1994, North Plains Systems Corp. is the leading provider of digital asset management solutions. Its pioneering technology, focus and vision have been recognized throughout the industry and are evident in such innovative products as TeleScope Enterprise and TeleScope Video Manager. Utilizing North Plains' platform-agnostic approach to managing digital assets, more than 450 customers benefit from North Plains' expertise in delivering industry-defining solutions for video asset management, digital media management and distribution, centralization, workflow optimization, and virtual collaboration.

About IBM, Web 2.0 and Microsoft. Part I

January 28, 2008: IBM's Lotus Development Corp. unit has shifted its integration, social software and unified communications story into high gear as it prepares for a Web 2.0 scuffle that likely will dwarf its past e-mail clashes against Microsoft Corp.

The Web 2.0 battle will encompass many foes beyond Microsoft, including Cisco Systems Inc., and traditional telephony vendors and online giants such as Google Inc. and Yahoo Inc. It will also produce product and vendor options that are sure to test the strategic investment skills of IT executives who told Network World in 2007 that they view collaboration technologies as "important" or "somewhat important" to their future productivity goals.

At its annual Lotusphere show, IBM hammered away at the way it will integrate its product portfolio that includes messaging, real-time communication, and new social software and rapid application-development tools.

But compared with past editions of the conference, in which Lotus seemed to be steering the course of collaboration evolution, the company now seems to be playing from behind in many areas, including messaging, Web conferencing, unified communications and software-as-a-service, while Microsoft, Cisco and others are grabbing headlines.

But IBM has its gems as well.

The company's move last year into social software with Lotus Connections and this year's expansion of the platform give it perhaps the strongest set of tools built for corporate users in comparison with those of competitors that are working with adaptations of consumer products.

In addition, the delivery with Notes 8 of the company's open-client framework built on Lotus Expeditor and Eclipse, a container for executing XML-based application components, provides client integration. This is designed for users who want to buy and run only the components they need, dictate the pace of their adoption, and retain options to fill in any gaps with homegrown software.

In addition to Notes 8, the framework is the front end for Sametime 8 and Lotus Symphony productivity applications. It will eventually front every back-end server and service so users can get functionality a la carte while maintaining a single interface.

In addition, IBM said that integrating those same servers with other clients, such as Microsoft Outlook and partner software such as Carestream Health Inc.'s imaging tools, won't lock users into the Lotus platform and will expand its range of potential sales.

Lotus last week also announced partnerships with SocialText Inc. and Atlassian Software Systems Pty. to integrate wiki technology from each vendor into Lotus Connections.

"The most beneficial part for customers is the integration," said Dwight Davis, an analyst at Ovum. "It's the fact I don't know that I am using Quickr [content management]; it's just a plug-in to my client."

Zambians fail to communicate on New Year's Day

By Brenda Zulu

Many Zambian Celtel subscribers failed to communicate with anyone on New Year's Day 2008. The failure also extended to MTN and Zamtel subscribers who wanted to reach people on the Celtel network.

Emelda Yumbe, Coordinator of the Zambia Media Women Association (ZAMWA), said that she failed to reach Celtel subscribers. Yumbe, who is an MTN and Zamtel subscriber, said she failed to communicate with people on the Celtel network.

White Nyerenda of Shalom Investments in Mandevu, a retailer of Celtel Mobile Top Up, complained that he had lost business on New Year's Day, indicating that money was trapped in the Mobile Top Up facility, which had not been operational since Christmas Day.

He said many of his customers were annoyed because they were unable to top up and make phone calls.

Sally Chiwama, a Celtel subscriber, said she was very disappointed and upset because this was not the first time communication had failed. She called on Celtel to improve its services.

Mary Sathula, a Celtel subscriber, asked Celtel to be considerate, especially during times such as the New Year when many people are known to be using the network. "Celtel should be considerate and they should improve their services," said Sathula.

Charles Kachabe, another Celtel subscriber, said he was greatly inconvenienced as he really needed to communicate, and the failure derailed some urgent business. He called on Celtel to apologise to its subscribers, as they were robbed of a precious moment to wish their friends and families a Happy New Year!

Meanwhile, Celtel management has apologized to its subscribers for the network interruption during the festive season that resulted in clients failing to generate calls and short messages (SMS).

Celtel acting public relations manager Patricia Litiya said in a statement in Lusaka yesterday that the network interruption was a result of a micro-link failure in the Mbabala area of Choma District on December 31, 2007, which was rectified on January 1, 2008.

Ms Litiya said engineers could not access the site due to heavy rain, hence the delay in resolving the problem, and that the areas affected by the micro-link failure were the entire Southern and Western Provinces.

She indicated that in Lusaka, Celtel experienced equipment failure on one of its switches, and this affected network performance in Lusaka, Eastern and Central Provinces. The effects could still be felt in some parts of the country until late in the evening of January 1, 2008.

Litiya said Celtel Zambia did everything possible to ensure network availability and that capacity was adequate to cater for the expected traffic during the festive season.
She said calls and SMSs were generally sustained until January 1, when the equipment failure occurred.

Thursday, October 30, 2008

Web 2.0 to increase online learning

By Brenda Zulu
Online learning is here to stay, and the increasing availability of Web 2.0 tools will make the e-learning experience more rewarding, though some challenges exist.

Making a case for online training of journalists in Africa, Kwami Ahiabenu II of the International Institute for ICT Journalism (PenPlusBytes) observed that online training was going to become a dominant means of training and that there was a need to invest time and energy to ensure that it becomes part of capacity building projects and programmes.

He noted that though there are costs associated with online training, in the long run it was relatively cheap and cost effective. One achievement was that online learning provided an opportunity for journalists to learn about new tools and use them in the process of learning.

In overcoming the challenges, Ahiabenu II explained that the course used a group e-mail list as its primary delivery tool in order to ensure low-bandwidth participants were not left out of the course.

Meanwhile, to overcome issues of time and commitment, they encouraged participants to devote more time to the online learning experience in order to derive maximum benefit. They also developed strategies for coping with change and developing an “online” mindset by adapting the course to the learner's environment via flexibility.

“For example, when a participant could not take part in a session because his or her internet was down, we modified our timetable to take this problem into consideration,” said Ahiabenu.

He explained that the tools of the online course were Skype, blogging, Yahoo Groups, Google Groups, wikis, Flickr, del.icio.us, digg.com, YouTube and MySpace, which allowed participants to comment on content on the web.

In order to facilitate group communication in real time, he explained, they asked participants to create a Skype account. Unfortunately, this was not successful because Skype was banned in some of the participants' countries.

He said the three-month online training course, organised with the help of partners, involved research, Web 2.0 tools and knowledge management for newsrooms.

Some of the online topics included chat forums, reporting ICTs, and content management systems for journalists.


“Our lecture notes are designed for a quick read, straight to the point and written in a narrative format. At the end of each set of lecture notes, references are provided as well as mandatory further online reading. Links to additional relevant online resources are also provided,” he said.

In addition to online content, the trainers encouraged participants to buy relevant books by providing them with a list of titles. They also pointed participants to relevant articles in magazines and newspapers as these were published during the course.

About IBM, Web 2.0 and Microsoft. Part II

In the Carestream example, radiologists don't know their instant messaging and voice capabilities are Sametime; they just see new functionality in a familiar application.

It's a message that IBM will build on, Davis said. He said the company needs to accelerate the discussion away from the "product-centric fire-hose flow of information" and direct it to "looking at customer goals and showing [Lotus] can address them with integrated products."

Showing the breadth of its tools and integration across its software is key if Lotus wants to stand out from the pack.

The setup has begun and Lotus Connections is a prime example.

"The social software is where they can claim they are first with an enterprise comprehensive solution, and that is why they are pounding at it, " said Mark Levitt, an analyst at Framingham, Mass.-based market research company IDC. "You are seeing a lot more integration with things like Quickr, Sametime, Symphony."

Experts said the integration story is being driven in part by Microsoft's success with SharePoint Server 2007, which is one of the foundation elements of Microsoft's collaboration and real-time communications strategy, as well as an entry point into social networking tools such as blogs and wikis.

"SharePoint Server 2007 is sweeping through the industry like no other software product that I've seen since the early heydays of Lotus Notes," wrote Burton Group Inc. analyst Mike Gotta.

"IBM is taking it on the chin right now."

He said that IBM should use its arguably superior social networking tools to switch the focus away from SharePoint.

"IBM can use Connections to compete with Microsoft by changing the focus to social computing rather than collaboration and content," Gotta said. "IBM has to do superior and native integration between Connections and Microsoft productivity tools and integrate with SharePoint as well."

Later this year, Lotus will add replication to Connections, a feature that was always described as the crown jewel of Notes.

But Lotus's challenges are a multiheaded monster.

The introduction at Lotusphere of Foundations, appliances bundled with software to support small and midsize businesses, and Bluehouse, a set of services available over the Internet, shows that Lotus is playing catch-up to Microsoft and its strategy of software plus services, which includes Microsoft Small Business Server coupled with Windows Live services.

Also, the company has to find a way to make Sametime stand out. This year, it will release two new versions, including a telephony version slated to ship in the second half of 2008. However, both have a list of features and functionalities comparable to those of the other players including Cisco and Microsoft, which are quickly rising to the top of the unified communications discussion. In addition, some new partnerships with Cisco and Nortel Networks Corp. unveiled at Lotusphere are identical to those Microsoft has made.

"This has become a very competitive space," IDC's Levitt said. "And IBM is looking for places where it can claim leads."

Replication essentials

Replication is the copying and distribution of data and database objects from one database to another, with the goal of maintaining synchronization between the two databases.

The origin database is called the publisher and the destination database the subscriber. Each database has an agent (a SQL Server agent) responsible for capturing changes from the publisher and applying them to the subscriber.

In a distribution implementation you will have to choose between a pull subscription and a push subscription. In a push subscription, the publisher's agent carries the load of moving data over to the subscriber; in a pull subscription, the subscriber's agent is in charge of reading data from the publisher.

SQL Server allows the implementation of three methods of replication:

Snapshot replication

In snapshot replication, a copy of the publisher is sent to the subscriber from time to time; an exact picture of the data or database objects is copied in bulk over to the subscriber.

This method is recommended only if the destination database's functional requirements tolerate a reasonable degree of latency; between snapshots, the database receiving the data will lag behind the origin database.

Snapshot replication is the most commonly implemented method of replication; it is a good fit for a low-bandwidth or low-activity network.
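For orientation, here is a minimal T-SQL sketch of configuring a publication and subscription with the standard replication stored procedures (the database, publication, table and server names are hypothetical, and a configured distributor is assumed):

-- Enable the database for publishing.
EXEC sp_replicationdboption
    @dbname = N'SalesDb',
    @optname = N'publish',
    @value = N'true';

-- Create the publication; @repl_freq = N'snapshot' gives snapshot
-- replication, while N'continuous' gives transactional replication.
EXEC sp_addpublication
    @publication = N'SalesDb_Snapshot',
    @repl_freq = N'snapshot',
    @status = N'active';

-- Add a table (an "article") to the publication.
EXEC sp_addarticle
    @publication = N'SalesDb_Snapshot',
    @article = N'Orders',
    @source_object = N'Orders';

-- Create a push subscription: the publisher side carries the load.
-- @subscription_type = N'pull' would put the subscriber's agent in charge instead.
EXEC sp_addsubscription
    @publication = N'SalesDb_Snapshot',
    @subscriber = N'REPORTSRV',
    @destination_db = N'SalesDbCopy',
    @subscription_type = N'push';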

Transactional replication

The term transactional here refers to the fact that SQL Server captures and applies changes using DML operations.

Each committed change is called a transaction; the difference from snapshot replication here is that changes are sent to the subscriber as they occur.
It is possible to set up the publisher to cache changes and send them at a certain frequency, or to transmit all changes in real time. During the interval, transactions are stored in the log file before transmission to the subscriber.

The initial step in transactional replication is a snapshot replication in which a copy of the publisher is sent to the subscriber.

Transactional replication requires a high-bandwidth and reliable network connection; in case of transmission delays, transactions are cached in the transaction log before transmission. Depending on the size of the data to replicate, the transaction log can grow quickly.

Transactional replication is recommended for data warehouse and reporting databases where fewer transactions occur, and it is recommended to disable replication before a massive load.

Merge replication

Merge replication is designed for systems where servers must have the same data at defined intervals. Think of this as a two-way differential backup where changes from all databases must be mutually applied; data move both ways.

This method of replication obviously presents risks of conflicts and data inconsistencies; it is the most difficult to implement and manage. Implementation includes the creation of additional system tables and requires adding a new column to replicated tables to identify the server.

The replication agent here is called the merge agent and is equally responsible for carrying the replication workload.
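To make that extra column concrete: merge replication needs a uniqueidentifier ROWGUIDCOL on every replicated table so rows can be identified across servers. The snapshot agent adds one automatically if it is missing, or you can add it up front yourself; a small sketch with hypothetical table and constraint names:

-- Give the table a row GUID that the merge agent can use to
-- identify each row uniquely on every server.
ALTER TABLE dbo.Customer
ADD rowguid uniqueidentifier ROWGUIDCOL NOT NULL
    CONSTRAINT DF_Customer_rowguid DEFAULT NEWSEQUENTIALID();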

Update for SQL Server 2008

In SQL Server 2008, you can also synchronize databases by using the Microsoft Sync Framework and Sync Services for ADO.NET.

REAL-TIME TEXT AIMS TO IMPROVE INTERNET ACCESSIBILITY FOR THE DEAF

By Brenda Zulu
The blossoming of multimedia content on the Internet in recent years has
revolutionised personal interactions, business communications, and
other online services. But for millions of Internet users with sensory
disabilities, many of the communication tools remain frustratingly out
of their reach.

According to a press release, Mr Arnoud van Wijk, Disability Projects
Coordinator for the Internet Society (ISOC), who was born deaf, knows
only too well the frustration that Internet users with a disability
experience with many current Internet services.

"During the past few years, the use of the Internet as a modern
replacement for telephony has accelerated," said Mr van Wijk. "The
ability to include more media in calls provides an excellent
opportunity to include people with disabilities in online
conversational services. But too often discriminatory voice telephony
services are simply re-created."

With this motivation, Mr van Wijk and other researchers have
documented a technique for "real-time text", combining existing
Internet Engineering Task Force (IETF) standards to enable text
streaming over Internet Protocol networks.

The technique uses Internet telephony protocols to ensure
compatibility with voice, video, and other multimedia services on the
Internet. It allows text to be sent and received on a character by
character basis, with each character sent and displayed immediately
once typed, giving text the same conversational character as voice
communication.

According to Mr van Wijk, "Internet Telephony is rapidly becoming a
major way of staying in touch. But it breaks the traditional text
telephone, which deaf and hard of hearing people used in the past to
call each other. The real-time text technique addresses this problem
and can be integrated with Internet telephony."

Along with fellow technologist Guido Gybels, Director of New
Technologies at RNID (UK), and with contributions from other experts
in communication and accessibility for people with disabilities, Mr
van Wijk edited and co-authored 'Framework for Real-Time Text over IP
Using the Session Initiation Protocol (SIP)', which the IETF has just
published as an informational document in its 'Request for Comment'
series as RFC 5194.

To further progress work in this field, this week sees the launch of
the 'Real-Time Text task force' (R3TF), an informal forum for
engineers, motivated individuals, experts, companies and
organisations. The R3TF has received incubation support from ISOC, as
part of its "Enabling Access" initiative, under which ISOC promotes a
diverse range of projects aimed at breaking down the barriers to
Internet access.

Michael Burks, Chairman, and Cynthia Waddell, Vice Chairman of ISOC's
Disability & Special Needs Chapter, welcome the announcement of the
new task force.

"Accessibility for persons with disabilities is critical and must be
maintained in the coming convergence," said Ms Waddell, an
Accessibility Expert to the International Telecommunication Union
(ITU), who is hard of hearing herself. "But it is worth pointing out
that, like many disability projects, this effort has the potential to
provide more options and greater usability for all users in many
situations."

Wednesday, October 29, 2008

How to gain storage, and then performance, from data type choices

During the design phase of a database, the choice of the proper data type can have a sizable impact on database size and then on overall performance.

1- Datetime

You may want to choose smalldatetime over datetime for the following reasons:

  • smalldatetime uses half the space (4 bytes) of datetime (8 bytes);
  • datetime stores time down to the millisecond, while smalldatetime only goes down to the minute;
  • smalldatetime covers dates from January 1, 1900 to June 6, 2079, while datetime covers January 1, 1753 to December 31, 9999; use smalldatetime if the narrower range is enough.
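A quick way to see the size difference for yourself; a small sketch you can run in any database:

DECLARE @d datetime, @sd smalldatetime;
SET @d = GETDATE();
SET @sd = GETDATE();
-- DATALENGTH returns the bytes used: 8 for datetime, 4 for smalldatetime.
SELECT DATALENGTH(@d) AS datetime_bytes,
       DATALENGTH(@sd) AS smalldatetime_bytes;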

2- Integer

Most of the time developers choose bigint over int as the data type for the identity key on a transaction detail table.

bigint is probably the single most misused data type.
Here are some reasons to use int over bigint:

  • int uses half the space (4 bytes) of bigint (8 bytes); in a 100 million row table you could save about 400MB of space by choosing int instead of bigint.
  • int can store values from -2,147,483,648 through 2,147,483,647, while bigint can store values from -9,223,372,036,854,775,808 through 9,223,372,036,854,775,807; a huge number.

So it’s clear you should go for bigint only if you need to store values over 2.1 billion; most systems will not grow to this number.
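For instance, a minimal sketch of a detail table keyed with int (the schema is hypothetical):

-- An int identity supports about 2.1 billion rows, plenty for most systems,
-- at half the storage cost of bigint in every row and every index entry.
CREATE TABLE dbo.OrderDetail (
    OrderDetailID int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    OrderID int NOT NULL,
    Quantity int NOT NULL
);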

3- numeric/money

It is highly advisable to choose the money type for currency instead of numeric or decimal. The default precision (the total number of digits) for both numeric and decimal is 18, not far from their maximum of 38 digits; the default scale (the number of digits to the right of the decimal point) is 0.

More often than not, developers leave the default precision of 18 untouched, higher than the roughly 15 digits the money data type offers to the left of the decimal point. The money data type's size is 8 bytes, with a scale of 4 digits, while the decimal and numeric data types use from 5 to 17 bytes.

4- Character, Text

It is recommended to use the char data type over the varchar data type only in very precise circumstances: the data in the column must have the same size and contain no null values. The char data type's size on disk is fixed: the space for null values remains allocated and smaller strings still occupy the defined size. The varchar data type's size on disk is variable, and string data of n characters is stored in n bytes, giving better handling of data size on disk.

The Unicode data types nchar and nvarchar must be chosen carefully as they use twice as much space as the non-Unicode data types char and varchar. While char and varchar hold up to 8,000 characters in 8,000 bytes, nchar and nvarchar store only up to 4,000 characters in the same 8,000 bytes. This means you will potentially store the same data in twice the space, with half the capacity.

It is highly advisable to use Unicode data types only if Unicode data will be stored in the system. Please note that the proper database collation must be selected to match the character set, to avoid string manipulation malfunctions.
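Again, a small runnable sketch makes the doubling visible:

DECLARE @a varchar(50), @u nvarchar(50);
SET @a = 'hello';
SET @u = N'hello';
-- The same five characters take 5 bytes as varchar and 10 bytes as nvarchar.
SELECT DATALENGTH(@a) AS varchar_bytes,
       DATALENGTH(@u) AS nvarchar_bytes;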


How does all this space saving impact overall performance?

A good choice of data types allows data to be stored in less space in the database; smaller rows mean more rows fit on each 8KB data page, so queries read fewer pages, producing smaller result sets and better performance.

Deadlock in OpenXML When Called Twice Within the Same Transaction in each of two Concurrent Instances of a Stored Procedure

THE PROBLEM

We encountered a deadlock in a stored procedure that occurs when two instances of the stored procedure run at the same time in different threads. The stored procedure itself is a fairly simple operation that takes an XML document as a parameter and uses OpenXML to extract two hierarchical levels of content from the XML document. The procedure performs the following tasks in order:

  1. Opens a transaction;
  2. Extracts Order information from the XML document using OpenXml and INSERTS a row into the Order table;
  3. Uses SCOPE_IDENTITY to retrieve the new Primary Key from the insert as @OrderID;
  4. Extracts Order Item information from the XML document in a second OpenXML call and INSERTS 0 or more rows into the OrderItem table.
  5. Commits the transaction.

The procedure deadlocks with another instance of itself in Step 4 if two calls are made at the same time from two different threads (i.e. the calling Web Service receives two orders from two users at the same time.)

Using SQL Server Profiler, we observed that both procedures make it past step 3 and into step 4. There is a foreign key constraint between Order.OrderID (PK) and OrderItem.OrderID (FK). The execution plan shows a Clustered Index INSERT on the child table and a Clustered Index SEEK on the parent table occurring at the same time. By design, SQL Server takes a shared lock on the Order table and an exclusive lock on the OrderItem table; not exactly the recipe for a deadlock situation. Yet there it was, time after time after time.

THE SOLUTION

The SQL statement that deadlocked was a standard INSERT INTO dbo.OrderItem (OrderID, (columnList)) SELECT @OrderID, (columnList) FROM OpenXML WITH (ColumnList), where @OrderID is the new OrderID from Step 3 and (columnList) is the result set of order items from the XML document. To solve the problem, we moved access to the OpenXML document outside of the transaction and stored its result set in a table variable. We did this for both OpenXML accesses (i.e. Step 2 and Step 4). The new step order for the procedure became:

  1. Create two table variables: @OrderTable (OrderColumns) and @OrderItemTable (OrderItemColumns)
  2. INSERT INTO @OrderTable SELECT (orderColumns) FROM OpenXML WITH (OrderColumns)
  3. INSERT INTO @OrderItemTable SELECT (orderItemColumns) FROM OpenXML WITH (OrderItemColumns)
  4. BEGIN TRANSACTION
  5. INSERT INTO dbo.Order(orderColumns) SELECT (orderColumns) FROM @OrderTable
  6. Use SCOPE_IDENTITY to retrieve the new Primary Key from the insert as @OrderID
  7. INSERT INTO dbo.OrderItem (OrderID, orderItemColumns) SELECT @OrderID, (orderItemColumns) FROM @OrderItemTable
  8. COMMIT TRANSACTION.

The OpenXML calls were moved outside of the Transaction scope. This limited the scope of any shared locks on each OpenXml instance to the queries that accessed them.
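Put together, the reworked procedure looks roughly like the sketch below; the Order/OrderItem schema and the XML shape are hypothetical stand-ins for the real ones:

CREATE PROCEDURE dbo.InsertOrder
    @OrderXml xml
AS
BEGIN
    DECLARE @doc int;
    DECLARE @OrderTable TABLE (CustomerID int, OrderDate datetime);
    DECLARE @OrderItemTable TABLE (ProductID int, Quantity int);

    -- Parse the document and read both levels OUTSIDE the transaction.
    EXEC sp_xml_preparedocument @doc OUTPUT, @OrderXml;

    INSERT INTO @OrderTable (CustomerID, OrderDate)
    SELECT CustomerID, OrderDate
    FROM OPENXML(@doc, '/Order', 2)
    WITH (CustomerID int, OrderDate datetime);

    INSERT INTO @OrderItemTable (ProductID, Quantity)
    SELECT ProductID, Quantity
    FROM OPENXML(@doc, '/Order/Item', 2)
    WITH (ProductID int, Quantity int);

    EXEC sp_xml_removedocument @doc;

    -- The transaction now touches only table variables and base tables.
    BEGIN TRANSACTION;

    DECLARE @OrderID int;

    INSERT INTO dbo.[Order] (CustomerID, OrderDate)
    SELECT CustomerID, OrderDate FROM @OrderTable;

    SET @OrderID = SCOPE_IDENTITY(); -- assumes one order per document

    INSERT INTO dbo.OrderItem (OrderID, ProductID, Quantity)
    SELECT @OrderID, ProductID, Quantity FROM @OrderItemTable;

    COMMIT TRANSACTION;
END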

So WHY Did This Fix the Problem?

We're not quite sure. A search of Microsoft Technet did not reveal any articles that described the specific problem. However, it feels as if the two concurrent procedure calls, each running in its own process and accessing its own instance of OpenXML twice from within its own transaction, created a thread deadlock inside the OpenXML component.

In closing, we add that while OpenXML is still supported, Microsoft now considers XQuery to be the preferred method to do what has traditionally been done using OpenXML. We'll have more on XQuery as a replacement for OpenXML in a future post. Stay Tuned.

DAY ZERO fever

By Brenda Zulu

I asked people what they learnt on Day Zero of the Web 2.0 for development conference and what they were going to take home.

Prince Deh
GINKS Assistant Network Coordinator
Ghana


I am planning to host local language videos and translate the text into English, because I realise that many people get my stories and information from the blog. Video blogging has enhanced my knowledge and sharing skills.
As you see, people are attracted by video; they want to see and hear at the same time. I have learnt how to use del.icio.us, wikis and tagging. One thing is that if you don't tag your work, not many people will read about it.


William Eziniwa Nwangwu
Africa Regional Center for Information Science, Lecturer
Nigeria


I have been wondering: why Web 2.0? Isn't it a new word for an old thing?
I have discovered how it is being used, and sometimes it worries me as an academician. Some universities in the USA have banned students from citing wikis as a source of researched information. In wikis, who is the author? Is the information peer reviewed? In my institution I limit the referencing of wiki copies.


Makelesi Gonelevu, Anju Mangal - Secretariat of the Pacific Community (SPC), Fiji


We learnt about the potential of RSS feeds, wikis, tagging and blogging, and the most interesting of all was the farmer blogging, where farmers shared agricultural knowledge among themselves and with experts. In the Pacific, we at SPC are trying to get farmers to access online agricultural information and have experts answer queries from the farmers.
Web 2.0 will enable us to implement various ways of information sharing and collaboration between farmers and stakeholders. In the Pacific, we have to deal with accessibility issues, and one of the main issues that we face is connectivity. Web 2.0 is an amazing tool; however, it may not work in the Pacific if we have accessibility issues. Low bandwidth is an issue, and using vlogging can be a problem in terms of accessing online videos. We discovered a lot on Day Zero and we hope to fully utilise Web 2.0.

www.spc.int/lrd - SPC Land Resources Division website.

Mirjam Schaap, Wageningen International
I learnt about the real, relevant use of Web 2.0 tools for grassroots people (farmers, traders etc.), and the potential of bridging gaps between farmers and researchers. It was also interesting to get the impression that some people are concerned about a threat to the 'professionals' from the use of Web 2.0 tools by 'amateurs' (concerns about taxonomies, about traditional video professionals etc.). I also learnt that some of the names I have come across online are actually real people, and that they exist outside cyberspace. Very nice to meet people in real life.
I loved the ‘online presentation’ and desktop sharing from Wageningen, and I got to know some handy tools which are used by others (desktop sharing, good cheap hosts, wordpress plugins etc.)
I really liked the energy during the day, and the willingness of everyone to share experiences.
I am still looking for tips on how to infect my colleagues with the Web 2.0 virus, how to convince my friends to share info using Web 2.0 tools, and how to convince my colleagues to communicate not just with text but also with video and audio; in short, how to mainstream the use of Web 2.0 tools in my organisation and among our partners.
But we'll have another three days of working together, sharing and learning.

The Future for Africa is Mobile - Media

By Brenda Zulu
The future for Africa is mobile, as the technology has been embraced by more than 200 million people on the continent.

Africa Interactive, publisher of Africa News (www.africanews.com), a worldwide interactive multimedia platform focused on Africa, is piloting a new project called Voice of Africa, in which journalists use mobile phones to send video clips to report the news.

Elles Van Gelder told the Digital Citizen Indaba (DCI) that the project was launched by the Dutch, who felt that Western media does not fairly represent Africa, and that it was set up to show more balanced images of the continent.

She explained that sending video clips using the mobile phone was a new way of creating content. She said journalists who are part of the project are trained to become innovative reporters and to use the cell phones.

Elles explained that they also looked at the technical side and provided the journalists with small keyboards, because the cell phone keys were too small to enable journalists to do their work fast.

She observed that the media focus was on Africa and that this was a revolution, as these journalists would be reporting live at events such as the elections in Kenya.

Peter Verweij of the University of Utrecht, Netherlands, observed that mobile technology brings journalists back to the streets, meaning one does not need to get back to the newsroom to send a news report.

With the GPS facility, editors in the newsrooms will also be in a position to supervise their reporters because they will be able to know where the reporters are and what they are doing.

Verweij said mobile phones will enable journalists to report from anywhere for web pages and blogs. The content can range from text to video, and he noted that for the first time anyone could be a reporter.

He also observed that the main challenge for Africa is the level of internet connectivity, even as the work of journalists is set to improve dramatically with innovations in mobile GPS technology.

In the same vein, Ndesanjo Macha, a Sub-Saharan Africa Global Voices editor, said in his keynote address at the DCI that the future was mobile. He said text messaging has been delivering news.

He said SMS was also used for social networking, as much of the news now is known through SMS before the mainstream media makes its reports.

The coming of new technologies has thus led to fear of adapting to new ways of doing journalism.

Capturing the essence of convergence, Arrie Rossouw, the editorial director of Media24, said there was a need for people to stop talking about crises and insecurity and instead strive to move toward integrated newsrooms.

The discussion noted that convergence in African newsrooms remains largely unrecorded. Some newsrooms are marching forward, podcasting news items and music programmes and sending texts to cell phones; others are experimenting with video, sending sports clips and news to wireless services.

Matthew Buckland, the Mail & Guardian Online manager, told delegates attending the Highway Africa Conference 2007 that Web 2.0 software was an important development for smaller role players.

Buckland noted that media companies need to develop strategies for using Web 2.0 software for social networking and to attract advertising.

He pointed out that Web 2.0 has played an important role in the improvement of technology and is also less expensive.

Web 2.0 is a term often applied to the perceived ongoing transition of the World Wide Web from a collection of web sites to a full-fledged computing platform serving web applications.

Tuesday, October 28, 2008

ZAMPOST “Stay Connected”

By Brenda Zulu

The government has urged the nation of Zambia to stay connected by utilizing the modern postal services that are earmarked for launch soon.

Minister of Communications and Transport Dora Siliya, in a speech at the launch of the 2008 World Post Day, reminded the nation of the vital link to people's lives that Zampost provides, especially in rural areas.

Siliya said the World Post Day celebrations theme was “Stay Connected”, to emphasise the pivotal role that communication plays in the social, economic and structural development of the world economy.

“We live in a society where access to information lies at the heart of most human activity. Information about goods and services is increasingly driving global business, and the postal system finds itself providing that most important link to the global market as long as we stay connected,” said Siliya.

She observed that some postal enterprises have realized the need to reform by using new technologies and have embarked on implementing measures designed to improve the quality of service and to expand the product and service portfolio.

She informed Zambians that Zampost was implementing this year's theme and had partnered with the Zambia Telecommunications Corporation Limited (ZAMTEL) to install a Wide Area Network (WAN) linking all post offices throughout the country. She said implementation of the project, at an estimated cost of USD 700,000, had already started and was expected to be completed by the end of this year.

Siliya observed that the Wide Area Network will bring many benefits not only to Zampost but to the general populace, through improved connectivity, wider internet access, automation of counter operations, operational efficiency through better financial monitoring and control, cost reductions and new business opportunities.

She said Zampost was acquiring more computers to ensure that counter services in all post offices were efficient and effective.
“This initiative is expected to greatly improve service delivery and will minimize the long queues we are accustomed to seeing at post offices,” said Siliya.

She added that Zampost was taking full advantage of the WAN, comprising 119 fully fledged post offices, 46 sub post offices and 58 postal agencies, to expand its product set with new value-added products and services such as track and trace facilities, hybrid mail, post shops, automated teller machines, point of sale devices, banking services and Electronic Post (e-post).

She said the services combine both hardcopy and electronic express mail delivery, and that customers will be able to send messages via e-mail for physical delivery to post boxes and physical addresses.

She said her ministry was mandated to ensure that information and communication technology related services, such as telecommunications and internet services, were available to all Zambian people, including those in rural areas.

She observed that, in this regard, post offices have been identified as key institutions which will play a major role in achieving the national Vision 2030.

“Despite predictions of their demise not so long ago, postal services are alive and well and now more relevant than ever. E-mail and the Internet have not replaced them, but have instead created new opportunities. We live at a time when the different means of communication complement each other. The postal sector provides a precious service, not only to anyone who buys or sells online, but also to people not yet able to make full use of the new technologies and for whom the mail is a vital link. For national and inter-regional trade, the development gap between postal services around the world needs to be narrowed so that businesses and individuals can benefit more from them,” said Edouard Dayan, Director General of the UPU.

Digital Asset Management Tools Run-Down

Digital Asset Management is, at heart, a way of cataloguing your assets, and managing them ineffectively is all too easy. To get it right you can use this general run-down of available cataloguing, or Digital Asset Management, tools. The field is crowded and the product you choose needs to be tailored to your needs and budget. In rough order of power, features and cost:

iPhoto - Comes bundled with the Apple iLife suite. The '08 version can archive thousands of pics and has basic editing functions. Category: You're Gonna Need a Bigger Boat. Cost: $80 (you also get iMovie, iWeb, GarageBand and iDVD). Can't beat it for the price, but like I say: "If you have an $80 photo library, get an $80 tool to manage it."

MS Expression - The Borg takeover of this software leaves me feeling very uneasy, but it was a great product as iView Media Pro. Category: Nervously Pessimistic. Cost: $300. As is typical of Microsoft, there are about 50 different versions to choose from. Best of luck.

Extensis Portfolio - Good brand name, scalable product to allow all levels to enjoy. RAW support and a 30-day free trial. Category: Too Good to be True? Cost: $200

MediaDex - The Cumulus single-user product. Don't know a thing about it, but Cumulus is a very powerful DAM that's not limited to photos (or, you might say, not specializing in photos). Category: Dark Horse for Starters. Cost: $80

Cumulus: A real heavy hitter, probably overkill for all but the largest companies. Category: Granddaddy. Cost: Not Sure

Aperture 2: What a laugher V.1 of this was. Don't we expect better from Apple? Well, how about this? Not only powerful image cataloguing but also a mid-range adjustment function (Photoshop Lite?). Maybe trying to do too much, and there's no server option. Anyone try it yet? Category: Getting There? There? Anyone? Cost: $200

Montala - I say Montala, you say HUH?! How about I then say 'free'? Yeah, this may just be the future, but you're going to need a code monkey to start it up (easy) and maintain it. But you can also add in functionality to customize it. All web-based, so there's no install and you can use it on any machine. Category: The Future is Now. Cost: You Heard Me

As is evident, there are plenty of solutions out there at really reasonable costs. If you have specific experience with any of the above (and indeed any I may have omitted), please post in the comments. There is a lot to learn here, and I think for the photographer of any level it's second only to having a proper backup strategy.

Geof Bowie
www.thinkfixed.com

ISOC FELLOWSHIPS TO THE IETF BUILD TECHNICAL LEADERSHIP

The Internet
Society (ISOC) has announced the names of those selected as Fellows to
attend the next two Internet Engineering Task Force (IETF) meetings.

As part of its long tradition of helping build technical capacity,
ISOC provides a Fellowship program that enables technologists from
developing regions to attend the IETF, while also pairing them with an
experienced mentor to integrate their participation rapidly.

This fellowship round attracted 70 applications resulting in nine
fellowships awarded. The Fellows come from nine different countries,
including Ethiopia, Pakistan, Fiji, Tuvalu, Congo, Chile, Costa Rica,
India, and Venezuela. Four Fellows will attend IETF 72 in Dublin,
Ireland, from 27 July - 1 August 2008, and five Fellows will attend
IETF 73, in Minneapolis, USA, from 16 - 21 November 2008.

"The ISOC Fellowships to the IETF are a key part of our work to help
build technical leadership and participation in less developed
countries," says Karen Rose, ISOC's Director of Education and
Programs. "The next billion users of the Internet will come
predominantly from the developing world. The Fellowship program helps
ensure that technical experts in these regions have the knowledge and
experience needed to more fully participate in global Internet
standards development."

"ISOC is very pleased to acknowledge Afilias, Google, Intel, and
Microsoft for their investment," notes Drew Dvorshak, Senior Manager
for Organisation Members. "The Fellowship is an important opportunity
for business leaders to benefit from ISOC's global resources by
funding a unique and effective effort to develop the next generation
of technologists. We are actively seeking additional Fellowship
sponsors as the potential for this program is enormous and a key part
of enabling the emergence of 'the next billion' users."

The selected ISOC IETF Fellows are:

IETF 72, Dublin, Ireland

* Tamrat Bayle, Ph.D. (Ethiopia) is an Assistant Professor at the
College of Telecommunications & Information Technology, where he has
been using IETF protocols in his varied research projects.

* Hugo Salgado (Chile) is an application developer at NIC Chile (.CL)
and is interested in Domain Name System Security Extensions and IPv6
issues after having previously followed the Cross Registry Information
Service Protocol mailing lists.

* Alejandro Acosta (Venezuela) is an Internetworking Coordinator for
British Telecom. He has been following the IETF Discussion list for
many years and is interested in the IPv6 Maintenance and TCP
Maintenance & Minor Extensions working groups.

* Kumar Saurabh (India) is currently a Senior Technical Leader at
Sonus Networks. He has specific interest in the Session Initiation
Protocol working group and had contributed to the Media Gateway
Control working group for over four years.


IETF 73, Minneapolis, USA

* Jean Philemon Kissangou (Congo) is currently employed by DRTVnet and
has been following the IETF IPv6 working groups discussions for some
time.

* Hamid Mukhtar (Pakistan) is a graduate student at Ajou University
(South Korea) and has co-authored an Internet Draft for the IPv6 over
Low Power WPAN working group and also follows the Mobility Extensions
for IPv6 working group mailing list.

* Terry Rupeni (Fiji) is the Network Analyst at the University of the
South Pacific where his work is closely aligned with the IETF working
groups in the Internet and Routing areas.

* Tenanoia Veronica Simona (Tuvalu) is employed by Tuvalu
Telecommunication Corporation as an IT Manager. Her interests include
the IP over Cable Data Network, Mobility Extensions for IPv6, and IP
Telephony working groups.

* Ing. Carlos A. Watson Carazo (Costa Rica) is interested in Domain
Name System Operations, Multicast Security, and IPv6 Maintenance
working groups as they directly impact his work at NIC Costa Rica (.CR).


ISOC is very pleased to acknowledge the corporate sponsorship from
Afilias, Google, Intel, and Microsoft in support of these IETF
Fellowships.

Deploy SSRS Reports in SharePoint Integration Mode

This blog addresses deployment of SSRS Reports and Report Models to a SharePoint 3.0 Site. This information can be gleaned from the following MS TechNet article, Deploying Models and Shared Data Sources to a SharePoint Site. The intent is to (hopefully) make the necessary deployment settings for SharePoint Integration Mode more explicit and straightforward.

Assumptions:


  • Report Server has been configured for SharePoint Integration Mode.
  • SharePoint Report Server Integration Feature is activated.
  • Appropriate Content Types (Report Builder, Report Builder Report, Report Data Source) have been added to a Document Library within a site.


For further information on these configuration topics, see Deploying SSRS with SharePoint Integration.

Assume the following server/site/library/folder names:

  • Server Name = MyServer
  • SP Site = SiteABCReports
  • Doc Library = SSRS Reports
  • Data Sources Folder (optional) = Data Sources
  • Report Models Folder (optional) = Models
  • Reports Folder (optional) = Reports


Deploy Report Designer Reports

From within a SQL Server Business Intelligence Studio (BIDS) Report Server Project, go to the project’s Property Page (right-click project in Solution Explorer and select Properties).
Apply the following deployment settings:

  • OverwriteExistingDataSources = True
  • TargetDataSourceFolder = http://MyServer/SSRS%20Reports/[Data%20Sources]
  • TargetReportFolder = http://MyServer/SSRS%20Reports/[Reports]
  • TargetServerURL = http://MyServer

Where [xxx] denotes an optional folder.

Things to note:


  • A TargetReportFolder must be specified. The folder can be the document library or a folder within the document library.
  • The TargetDataSourceFolder is optional. If one is not specified, the Data Source will be deployed to the TargetReportFolder.
  • For both target properties, if the folder within the document library does not already exist, it will be created upon deployment.
  • Relative paths are not valid.
  • Note the replacement of the space character with %20 in the URLs.

Deploy Report Models

From within a SQL Server Business Intelligence Studio (BIDS) Report Model Project, go to the project’s Property Page (right-click project in Solution Explorer and select Properties).
Apply the following deployment settings:


  • OverwriteExistingDataSources = True
  • TargetDataSourceFolder = http://MyServer/SSRS%20Reports/[Data%20Sources]
  • TargetModelFolder = http://MyServer/SSRS%20Reports/[Models]
  • TargetServerURL = http://MyServer

Note: The TargetModelFolder and TargetDataSourceFolder properties must be set to the document library or folders within the document library.

Throughout this blog, where a folder within a document library is specified, this folder is optional. Folders at this level merely serve to organize the different types of report support files.

Monday, October 27, 2008

Using data label expressions to create SSRS 2005 pie charts with intuitive and helpful pie slice labels

by Joe Toscano, Senior Software Engineer, RDA Corporation

If you’ve created pie charts in SQL Server Reporting Services, you’ve noticed that you are able to drag and drop category, series and data fields. The setup is pretty straightforward, but by working a bit with the data label values you are able to display more than a single column value in each slice of the pie. This may be helpful in allowing your reports to contain more useful information.


In our example we are looking at the total sales for bike products from the AdventureWorks database. We are looking at 3 categories: Touring Bikes, Road Bikes and Mountain Bikes. Each bike category has both a Total Sales and a Percentage of Total Sales that is part of the report's data set. Most pie chart reports display a single value in each slice of the pie; however, in our report we are displaying both values. Below is a snapshot of this sample report:

How did we improve our pie slice labels?

This blog entry focuses on how we are able to display both the total sales and the percentage of total in each pie slice. The answer lies in modifying the data label for the Total Sales column that we dropped to the Data Field area while in Report Layout / Design Mode.
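For context, both values can come straight from the report's dataset query. A hypothetical sketch against the AdventureWorks sample database (object names assumed), using SUM ... OVER () to compute each slice's share of the grand total:

SELECT
    psc.Name AS BikeCategory,
    SUM(sod.LineTotal) AS LineItemTotal,
    -- Each subcategory's percentage of the grand total over all rows.
    100.0 * SUM(sod.LineTotal) / SUM(SUM(sod.LineTotal)) OVER () AS PercentageOfTotal
FROM Sales.SalesOrderDetail AS sod
JOIN Production.Product AS p
    ON p.ProductID = sod.ProductID
JOIN Production.ProductSubcategory AS psc
    ON psc.ProductSubcategoryID = p.ProductSubcategoryID
JOIN Production.ProductCategory AS pc
    ON pc.ProductCategoryID = psc.ProductCategoryID
WHERE pc.Name = 'Bikes'
GROUP BY psc.Name;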

The screen snapshot below was taken from the Report Design / Report Layout mode. Notice that we dragged the LineItemTotal column to the Data Field area. Once there, we right-clicked on the data label and chose Properties.

Let’s focus on the Point Labels tab. Notice the Data Label dropdown and the little “Fx” to its right. This tells us that this value can be much more than a simple column in your dataset. The Data Label value can actually be the result of an expression! In our case, we used several built-in functions to convert the data types of more than a single column, strip out trailing spaces and concatenate formatted strings. Below is the expression that was used in our data label:

cstr(left(Fields!LineItemTotal.Value / 1000000, 5)) & vbcrlf & " (" & cstr(left(Fields!PercentageOfTotal.Value,5)) & "%)"

In summary, by working with the data label expressions, your pie slices can contain much more information than a single column value. The only limit is really the size of each pie slice, but as was done with our report, you are able to specify that the slice values be placed on the outside of the pie.

Telestream and Nativ Helping Getty Images. Part I

Telestream and Nativ came together to help Getty Images get a handle on its ever-growing library of digital video.

Getty Images creates and distributes the world’s broadest image collection, making stock images available to customers for use in news, sports, entertainment, and archiving. Getty was founded in 1995 with the goal of modernizing the stock photography industry, and was the first company to license imagery via the web. Today, gettyimages.com serves an average of 4 million unique users in addition to an average of 175 million page views each month, delivering nearly 100% of the company’s visual content digitally.

Business Challenge

Getty Images’ Media Management Services (MMS) group offers solutions for customers in the media and corporate markets to enable users to store, manage, and share their digital assets in a secure, searchable, password-protected environment. Getty Images' MMS group serves more than 190 clients worldwide, including American Film Institute, BSkyB, Discovery Communications, and General Motors. A growing number of these customers were requesting support of video assets in addition to their still image collections and turned to Getty for an integrated solution. While Getty’s MMS solution offered rudimentary support for video, the MMS team knew a more sophisticated interface would be required as video usage by their customers continued to climb.

Nativ is a UK-based consulting, technology, and outsourcing company specializing in the design, delivery, and support of video-centric products and services. Founded in 2001, the company delivers videocentric strategy and technology solutions for clients including MTV Live, Sony, and BBC News. Their flagship product is Mio, a media management and workflow platform designed to automate the process of ingesting, validating, cataloging, re-purposing, and distributing digital media across any platform.

Telestream provides encoding-based media workflow solutions that allow customers like A&E Television Networks and Google to conveniently transform digital media on the desktop and across the enterprise. Founded in 1998, Telestream offers plug-in components, craft encoding applications, and workflow automation applications that enable content owners, creators, service providers, and enterprises to streamline their operations.

The Problem In Depth

Getty Images’ MMS team provides its corporate and media customers hosted solutions for managing and distributing digital image assets through two services, Media Manager and Image.net. Media Manager is a digital asset management service targeted at creative and marketing teams, enabling web-based access from across the enterprise to a company’s marketing materials to ensure consistency, security, and efficiency. Image.net is a subscription service used by PR departments to promote digital publicity and marketing materials to the media.

According to Tim Claxton, senior product manager for the MMS group, “MMS clients use our services to manage images licensed through Getty.com, but also to organize and share their own images. We noticed that clients had growing amounts of video as part of their digital assets; it’s being used increasingly in their marketing campaigns.”

While MMS had the capability to handle video assets, its functionality was limited. “In the old system we could store video, but it had no intelligence,” says Claxton. “There was no thumbnail view, no streaming preview, and no metadata, so it didn’t give users browsing the system a good video search experience.” With the expectation that MMS customers would continue to enlarge their digital video asset base, Getty knew it had to get serious about upgrading support for video in both Media Manager and Image.net.

Claxton says that Getty very quickly turned to Nativ to help them design their solution. “We had a connection with Nativ because our development team is based in London and knew them already. There are just not a lot of consultants out there with specialist knowledge in video solutions, so it was a good fit.”

Nativ consultant Jon Folland says that the consulting company recognized that distributors like Getty would need help facing video distribution challenges that had traditionally been the purview of broadcasters. “Getty was reaching a tipping point. They had been using the internet as a distribution mode, but increasingly for their customers, the internet was becoming a viewing channel.”

Read Case Study Part II

Using SSIS to Transfer Data from an SSAS Cube

Scenario
A process needs to be built that takes the aggregate data from your SSAS cube and delivers it as a source for a third-party application. The process must be schedulable and also capable of executing on demand.

Solution
SSIS is a good fit for these requirements: you can build the transfer process once, schedule it as required, and also run the package on demand.

With the tool established, the question becomes how to get SSIS to connect to the cube and then work with the results.

The following walks through a step-by-step approach: making the initial connection to the SSAS cube, a sample MDX query against the cube, and the cleanup steps to perform within SSIS so that the data is usable downstream.
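
Since the requirement calls for both scheduled and on-demand execution, one common approach is to wrap the finished package in a SQL Server Agent job, which can then run on a schedule or be started manually. A minimal sketch, assuming a hypothetical job name and package path:

-- Create the job and a step that runs the package (names and path are hypothetical).
EXEC msdb.dbo.sp_add_job @job_name = N'Load Cube Extract';
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Load Cube Extract',
    @step_name = N'Run SSIS package',
    @subsystem = N'SSIS',
    @command = N'/FILE "C:\Packages\CubeExtract.dtsx"';
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Load Cube Extract';

-- Schedule it nightly at 01:00.
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'Load Cube Extract',
    @name = N'Nightly',
    @freq_type = 4,          -- daily
    @freq_interval = 1,
    @active_start_time = 10000;  -- 01:00:00

-- Or run it on demand:
EXEC msdb.dbo.sp_start_job @job_name = N'Load Cube Extract';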

Data Connection
The connection to set up is an ADO.NET connection object to the SSAS database. Using the OLE DB connector runs into repeated warnings and, in my experience, will often fail to retrieve the data at all, either erroring out or simply hanging.

With the same connection credentials, however, the ADO.NET connection will connect to the SSAS cube and retrieve the data.

Data Source
The DataReader data source is used to pull the data from the SSAS cube over the ADO.NET connection. The property to note within the DataReader is the SQL Command on the Component Properties tab; the SQL Command will be an MDX query that retrieves the data.

A sample MDX query to retrieve data from two dimensions (product and region) across a measure group could be the following:
SELECT [MEASURES].[RETAIL SALES AMOUNT] ON COLUMNS,
NONEMPTY(CROSSJOIN ([DIM PRODUCT].[PRODUCT CATEGORY].CHILDREN, [DIM REGION].[REGION NAME].CHILDREN)) ON ROWS
FROM [SSAS CUBE]

As a result of this MDX query, when you look at the Metadata of the DataReader, you will see the following names for each of the columns being returned:
[[MEASURES].[RETAIL SALES AMOUNT]]
[[DIM PRODUCT].[PRODUCT CATEGORY].[PRODUCT CATEGORY].[MEMBER CAPTION]]
[[DIM REGION].[REGION NAME].[REGION NAME].[MEMBER CAPTION]]

Also, each column will be of type DT_NTEXT, which becomes an issue when the destination is a third-party table and you start receiving data type mismatch errors.

Clean Up Process
To solve the data type issue, two methods can be used.

The first uses a Data Conversion transformation to change the data type. However, the data will not convert directly from DT_NTEXT to DT_STR or DT_NUMERIC as you might hope. Instead, convert each field to DT_WSTR, then add a second Data Conversion transformation to translate from DT_WSTR to DT_STR or DT_NUMERIC.

The problem with this method is that nine columns are created from the original three, since each conversion produces a new column under a different output name. Unfortunately, the Data Conversion task will not replace the original column.

The second method uses a Copy Column transformation to get an appropriately named field, then a Derived Column task to perform the cast in one step from DT_NTEXT through DT_WSTR to DT_STR, with an expression similar to the following:

(DT_STR, 50, 1252)(DT_WSTR, 50)[Region Name]

where Region Name is the output name you gave in the previous Copy Column step.

Why is the Copy Column needed at all? Unfortunately, the Derived Column task will not accept the [[DIM REGION].[REGION NAME].[REGION NAME].[MEMBER CAPTION]] output name as part of an expression, so the Copy Column transformation creates a new column with a name the Derived Column transformation can handle. This still doubles up the column list, but it saves creating one extra group of columns compared with the first method.
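
As an aside, if a linked server to the Analysis Services instance is available, the same cleanup can be pushed into T-SQL instead of SSIS transformations: OPENQUERY returns the MDX columns as ntext, which can be renamed and double-cast to the target types in a single SELECT. A minimal sketch, assuming a hypothetical linked server named SSAS_LINKED; note that the flattened column captions the provider returns can differ slightly from the SSIS metadata names shown above, so adjust them to match:

-- One-time setup of a linked server to Analysis Services (names are hypothetical).
EXEC sp_addlinkedserver
    @server = N'SSAS_LINKED',
    @srvproduct = N'',
    @provider = N'MSOLAP',
    @datasrc = N'localhost',
    @catalog = N'SSAS CUBE';

-- Rename and retype in one pass; ntext cannot be cast straight to numeric,
-- so go through nvarchar first.
SELECT
    CAST("[Dim Region].[Region Name].[Region Name].[MEMBER_CAPTION]" AS nvarchar(50)) AS RegionName,
    CAST(CAST("[Measures].[Retail Sales Amount]" AS nvarchar(50)) AS numeric(18, 2)) AS RetailSalesAmount
FROM OPENQUERY(SSAS_LINKED,
    'SELECT [MEASURES].[RETAIL SALES AMOUNT] ON COLUMNS,
     NONEMPTY(CROSSJOIN([DIM PRODUCT].[PRODUCT CATEGORY].CHILDREN, [DIM REGION].[REGION NAME].CHILDREN)) ON ROWS
     FROM [SSAS CUBE]');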

Conclusion
At this point, the package has successfully extracted the data from the cube, assigned an appropriate name to each column, and set up a correct data type for each field in the result. The package can now carry on processing the data as the requirement dictates.

Digital Asset Management Buzz Revolution

HONG KONG & PHUKET, Thailand & BARODA, India (Business Wire) Buzz Media (Buzz Technologies Inc.) brings together the talents of experienced professionals in the information technology, entertainment, and legal fields. Its Dontpirateme.com (www.dontpirateme.com) delivers not only piracy protection but also online revenue and exposure maximization, with one-of-a-kind proprietary software and systems, to make the most of your digital assets.

Digital Asset Acquisition & Protection

The acquisition and protection of digital assets such as URLs, images, multimedia, and performances.

Digital Asset Optimization

Have your content everywhere it should be and easily accessible by your fan base.

Maximize Online Royalties

Maximize royalties with an avalanche of royalty-paying content, while Buzz Media services mitigate the impact of illegal content with our own proprietary content tracker.

Managing Copyright in a Digital Era

The world has changed dramatically in the last 10 years. It is impossible for any individual to review copyright and deal with all the content other people post about your assets online, so Buzz Media has built its own software to deal with this issue. If there is a photo or material online that has not been properly secured, our software will pick it up and copyright it on your behalf.

Social Media Marketing

MySpace, Facebook, Google OpenSocial, Facebook apps: with an audience in the billions, the revenue from this market cannot be ignored. By working with Buzz Media you will be able to maximize your return from this new phenomenon.

Content Creation

Websites, fan sites, social media content, multimedia content, pay-per-download, pay-per-view, mobile content: Buzz Media is a master at getting these services online and maximizing their revenue potential. This can be a paid service or a revenue-share deal.

Digital Battlefield

Unfortunately, lawyers take a very lengthy and expensive approach to litigating illegal online content. At Don't Pirate Me, we take a very direct and speedy approach to protecting your digital assets. Online crime is an insurmountable issue to most; Don't Pirate Me, however, resides in that cyber-world and has developed highly effective means of neutralizing these offenders.

Internet Famous

Whether for up-and-coming stars or established household names, it is essential to be able to rein in the monster that the Internet has become. Malicious content is rampant and only serves as an obstacle to your services. Through a combination of flood optimization and Internet wizardry, we can bury the undesirable content for you and harness the power of the World Wide Web.

Maximum Revenue for Content

Buzz Media can deliver your content to the new market in a simple, cost-effective way, keeping one step ahead of the pirates.

Watchdog

Deploy your own Internet watchdog. Our proprietary software system scans text, images, video, and all forms of illegal download, and reports with a monthly summary, from as little as $500 per month.

The Equalizer

Problems with Pirates? Bury them in an avalanche of legal content. Fill their servers with electronic cease and desist orders. Negate their presence on the web. Starting from $1500 per month.

The Drummer

Need someone to beat your drum? Buzz Media services will not miss a beat. You will be everywhere you need to be, with every cent of potential online revenue and exposure captured, starting from $1500 per month.

The Geek

Content creation for mobile downloads, download bundles, subscribers, on-demand services, pay-per-view services, search engine optimization, flash content: every geek and nerdy service imaginable at unbeatable prices.

About Buzz:

Buzz Technologies, Inc. is a convergent media company with operations ranging from infrastructure development to online retail.

The foregoing press release contains forward-looking statements based on the Company's beliefs as well as assumptions made by and information currently available to the Company, including statements regarding the timing of the introduction of certain products. These forward-looking statements are based largely on the Company's expectations and are subject to a number of risks and uncertainties, which are identified and described in the Company's registration statements and periodic reports on file with the SEC, some of which are beyond the Company's control. Actual results could differ materially from these forward-looking statements as a result of a variety of factors including, among others, issues related to the travel and transportation industries and prevailing economic conditions in general. In light of these risks and uncertainties, or should underlying assumptions prove incorrect, there can be no assurance that the forward-looking statements contained in this release will prove to be accurate.

Sunday, October 26, 2008

How to Find the Worst-Performing Queries in a Database Server

A common complaint among database users is the sporadic occurrence of errors like this:

Timeout expired. The timeout period elapsed
prior to completion of the operation
or the server is not responding.
Exception type: SqlException
Source: .Net SqlClient Data Provider

The errors may come from a variety of procedures or queries. While occasionally the query itself may need to be optimized, the problem may stem from another unrelated operation that is bogging the system down. At that point the problem becomes a general tuning issue that may involve queries that have nothing to do with the “offending” query.

There is a simple way to gain insight into which query or queries deserve attention. Using SQL Profiler, you can develop a list of queries sorted by average reads, descending. This result set becomes your “hit list” of query optimizations that will yield the greatest return on time invested.

WHY QUERIES DO EXCESSIVE READS

(1) There is no index available to support a seek-based lookup. This may occur when a query has no WHERE CLAUSE. Your best defense here is to avoid this situation altogether. If you must query without a WHERE CLAUSE, keep the result set very narrow and select it as infrequently as possible.

(2) There is no index that supports the WHERE CLAUSE. A WHERE CLAUSE across joined data (i.e. multiple filter expressions on different source tables) can be particularly problematic, as it is usually not possible for the query optimizer to figure out which index to use for the clause. You may improve performance in this case by joining two subqueries, each on its own table with its own WHERE CLAUSE. (A sketch of an index that supports a WHERE CLAUSE follows the note below.)

(3) The WHERE CLAUSE is applied to a view. Avoid using views in queries that will be subsequently filtered. Selecting even a single row, based on its primary key, from a view will cause a scan of the underlying table.

Note that INSERT and UPDATE queries are not immune to these problems, as many of these use filters and embedded joins to do their work.
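
As a sketch of the index fix for case (2), assuming a hypothetical Orders table filtered on CustomerId and OrderDate, a composite index aligned with the WHERE CLAUSE turns the scan into a seek:

-- Hypothetical table and query; align the index key with the WHERE CLAUSE columns.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId_OrderDate
    ON dbo.Orders (CustomerId, OrderDate)
    INCLUDE (TotalAmount);  -- covering the SELECT list avoids key lookups

-- The query this index supports:
SELECT TotalAmount
FROM dbo.Orders
WHERE CustomerId = 42
  AND OrderDate >= '20080101';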

HOW TO USE SQL PROFILER TO FIND THE WORST OFFENDERS

Almost any query in a database can be optimized, but optimizing most queries is a waste of time and effort. This simple procedure will enable you to identify the worst offenders on your server and target your efforts accordingly.

(1) Start SQL Profiler. Open a new trace on the server that you wish to observe.

(2) In the Trace Properties screen (Events Selection tab), check “Show All Events.” Find the “Stored Procedures” events and choose the “SP:StmtCompleted” event. This will enable queries within a stored procedure to be separately reported.

(3) The default set of Column Filters will work well, so no changes are needed there. You can, however, keep the trace volume down by filtering reads for value >= 100. Queries with reads < 100 are not of interest to us in this context.

(4) In the Trace Properties screen (General tab), check “Save To Table.” Connect to the database where the trace will be saved (probably on your own computer, so that you can keep it as evidence), then specify a database, an owner, and a table. The database and owner should exist; the table should NOT. Press RUN to activate the trace. This will cause the table to be created (or overwritten if you reuse an existing name).

(5) As the trace runs, put the server through its paces. The idea is to create the environment where the problem manifests itself. You can concurrently use Management Studio to query the status of the trace table. Assuming you called the table Trace1, this query will get you started.

SELECT TOP 10 CAST(TextData AS varchar(255)) AS Query
    , COUNT(*) AS TimesCalled
    , AVG(Reads) AS MeanReads
FROM [dbo].[Trace1] WITH (NOLOCK)
GROUP BY CAST(TextData AS varchar(255))
ORDER BY MeanReads DESC

The result set will contain three columns. The Query column will produce recognizable SQL statements from within stored procedures. The TimesCalled column will reveal the prevalence of the call. And the MeanReads column will reveal the average reads done by the query each time it is called. This result set will produce powerful evidence of where the problems are in a database server, enabling you to clean up your own mess(es) before you escalate the issue within the department.

(6) As you repeat the test, use a different table name each time. This will allow you to chart progress as you fix stored procedures.
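
One variant worth keeping on hand: MeanReads alone can hide a cheap query that is called thousands of times. Sorting by total reads (calls multiplied by average) surfaces those queries as well. This sketch assumes the same Trace1 table created in step (4):

SELECT TOP 10 CAST(TextData AS varchar(255)) AS Query
    , COUNT(*) AS TimesCalled
    , AVG(Reads) AS MeanReads
    , SUM(Reads) AS TotalReads  -- total load contributed by this query
FROM [dbo].[Trace1] WITH (NOLOCK)
GROUP BY CAST(TextData AS varchar(255))
ORDER BY TotalReads DESC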

Celtel Subscribers Demand Improved Service

By Brenda Zulu
Celtel subscribers would like to see the mobile service provider improve its services, especially in rural areas, and also help local artists who are not yet established become known, as the company launches a new advertising campaign.
Celtel consumers were commenting on the advertisement for the campaign, in which Celtel International announced a simultaneous launch of its new advertising campaign in 14 countries of operation. The campaign aims to bring the values and beliefs of Africa closer together through one common thread: music.
A Celtel subscriber based in Mongu acknowledged the company's presence in the 72 districts, adding that, as Zambia is part of the information society, it was also a human right for the people of Zambia to communicate.
Mukela Liwena said Celtel needs to empower local musicians based in the rural districts, noting that it has so far sponsored only already-established musicians.
“There is so much talent in the rural areas, and Celtel should localize its messages in Zambia's local languages using our own traditional instruments. There is the Chikuni Musical Festival, which brings together musicians based in the Southern Province through their songs,” said Liwena, who called on Celtel to take advantage of the festival.
In terms of posters, Liwena asked Celtel to use faces that are known to Zambians.
The new campaign, dubbed “Rock Your World,” gives a fresh and interactive feel to the Celtel brand while preserving the company's devotion to Africa by highlighting the richness, diversity, and warmth of the continent and its people. Through this campaign, Celtel will set out locally, regionally, and internationally to find and promote the best musical talent in Africa.
Meanwhile, some Celtel consumers are wondering why Celtel is so particular about service branding rather than service provision, arguing that the new campaign should be synonymous with the service. “What good is a glamorous outlook for a mobile service if there is no better service?” asked Reginald Ntomba.
Rose Kalwani sees the campaign as an opportunity for Celtel to improve its service, observing that “many times, when I am with my mother in the same room and I try to call her, I will be told that I am out of the coverage area, which is not true.” She pointed out that these were issues Celtel should try to correct as soon as possible.
Gershom Musonda complained about the mobile top-up facility, which he said was off most of the time. Musonda also observed that Celtel was offloading cheap phones.
“Imagine a handset fully connected at 100 pin only. They also used to charge us for checking the balance until they were challenged by other providers. Not long ago they sold us weak phone models like the Nokia 3310, which dropped in price from K800,000 to K100,000 within five years. This was dumping of the worst order,” he said.

The original and creative advertising campaign follows Celtel's initial branding, which painted the continent red and gave birth to the popular slogan “Making life better” almost three years ago, in January 2004. Since the company unveiled its new face, the brand has worked hard to empower Africans while creating a common synergy throughout the continent. From the warm, sunshine-filled beaches of East Africa to the rough waters of West Africa, Celtel has joined the lives of Africans like no other brand to date.
Speaking about the new advertising campaign, Zain Group Chief Officer Tito Alai commented, “With this new phase of our brand roll-out we have taken a fresh start and committed ourselves to promoting and investing in music across the continent. As Africans, music is our life blood; it is in our veins and souls, and it brings us together as we live our everyday lives. Whether it is contemporary jams blasting out of cars in the bustling city or the soul-stirring rhythm of traditional music in the village, music is core to African lives.”
Alai observed that the company tagline “Making life better” continues to stand behind the Celtel brand, independent of the phases it enters throughout its life cycle. Ultimately, “Celtel strives to deliver the means for people to be better connected, to communicate, to stay closer in touch, to share emotions, to get more done, to transact business, and to celebrate successes,” said Alai.
“As a company we also go beyond impacting only our individual consumers; we strive to achieve synergies across the African continent by creating jobs, enhancing skills, and developing infrastructure. We are already involved extensively in sponsoring various music initiatives.”
Celtel International connects over 24 million subscribers, covering over 40% of the African population. The group's strategic aim is to become the leading pan-African mobile communications group, providing innovative and integrated communications services across the continent.

Telestream and Nativ Helping Getty Images. Part II

The Solution

The Nativ team worked with Getty to develop system requirements and identify potential vendors. The recommendation that came forward was for Nativ to develop a customized solution called the Video Asset Pipeline Processor (VAPP). VAPP would draw on Nativ’s existing Mio Core System for cataloging, validating, filtering, and repurposing video content, and on Telestream’s FlipFactory for transcoding media and automating video workflows.

According to Folland, “Mio works as a video analysis tool by telling both the end user and Getty everything they need to know about the file being uploaded, from file format to pixel aspect ratio to language.” The system also serves an important gateway function by validating whether the files being uploaded are in an acceptable format before kicking off the upload process.

As for the choice of vendor for the transcoding component, Claxton says, “Telestream is recognized as a market leader in transcoding, and we were already familiar with the FlipFactory product.” One of the features of FlipFactory that Getty and Nativ particularly liked was its ability to support a broad range of file formats. Says Claxton, “We needed something that spanned as broad a range of file types as possible” because MMS customers need to choose the format in which they store their video assets.

Starting in February 2007, Getty’s London development team worked closely with the Nativ consultants on developing VAPP and integrating Mio, while Nativ and Telestream collaborated on the FlipFactory implementation customized for Getty’s needs. George Boath, Telestream’s European regional manager, says that “the Software Developer Toolkit for FlipFactory provides a lot of power and flexibility to use predefined code, customized code, or a combination of the two.” He recalls that the Getty implementation used mostly predefined code but Nativ did some customization to address Getty’s requirements. “The Nativ team got up to speed on FlipFactory very quickly,” Boath recalls. “It helped that we have a support team in London who could work closely with them.”

Claxton and Boath both mention that the project required Getty to appreciate the difference in complexity between video and still image processing. “With most of our customers, 25-50% of the project issues that our technical guys are called in to resolve are on storage capacity and network infrastructure,” says Telestream’s Boath. “Oftentimes the IT staff doesn’t anticipate the load that video processing will put on their infrastructure.”

It took about three months for the FlipFactory component of the Getty solution to be completed, and about six months total for the VAPP solution comprising Nativ and Telestream platforms to be finalized, in time for an early August 2007 rollout. Claxton says that Getty will continue to use Nativ in a consulting capacity to tweak and improve the video management solution in future upgrades.

The Outcome

While it is still too early to gauge customer response to Getty Images’ new video management capabilities, the solution powered by Nativ and Telestream offers some marked advantages over the old system. Claxton points to the presence of thumbnail screens and automated metadata extraction as two key advantages that make it easier for end users to browse their corporate video assets.

Another improvement is the availability of secure streaming previews of videos. “Our customers are concerned about their video image asset security; they want to provide a low-resolution streaming preview but prevent downloads of high resolution versions of the file,” says Claxton. Implementation of Akamai’s streaming preview technology as part of their solution addresses both the discovery and security concerns of Getty’s customers. “Using Akamai also took some of the burden off of our technical environment to support that video streaming.”

Nativ’s Folland points out that Getty’s MMS model provides key benefits for its customers, who may be addressing video asset management in earnest for the first time. “Media asset management solutions have historically been very expensive. But under Getty’s ‘software as service’ model, customers pay for only the video management they need. They don’t have to buy, install, or host software because MMS services are entirely web-based. And they offer a massive breadth of support for video formats, which have yet to be standardized.”

The ability of Getty Images to offer a robust video processing system to its MMS clients is a major competitive advantage. “Customers think of video as just another asset type, the next step in digital assets, but it’s about ten times as complicated to store and manage,” Claxton says, looking at Getty’s own experience. “Images are in standardized formats, but video can span a huge range of formats and codecs. By offering video support, we shield our customers from the complexity of doing it themselves.”

Read Case Study Part I

Web2fordev Way Forward Vox Pops

By Brenda Zulu
I asked people what the way forward is and what they were going to take back home.

Wycliffe Ochieng Arua, Agriculture Commodity Exchange, Kenya
It has been a great conference. The way forward, especially, would be finding a way we could work together to blend the existing Web 2.0 tools used in Africa, keeping in mind how our users, the farmers, would like to access these applications. I am looking at mobile telephony, interactive voice responses, and the local FM stations in Kenya.
I have taken home a lot of knowledge and am much better informed now about blogs and wikis; I would like to set up a blog in our organisation, especially for when we are contributing to a proposal.
I also think we should use existing networks to reach the people CSOs are working with at the grassroots level.

Alec Singh, Chief Technologist, ACP Secretariat, Brussels
The way forward for me on a personal basis is to take time to delve further into some of the tools that we have come across over the past few days. At the end of the day, the main theme is using the web to encourage participation in development issues. Naturally, there are other means of involving participants in the development process, such as phone-ins over the local radios, etc. We need to make sure that we do not only see the technology as an end in itself, but the means to that end - which is the development of billions of humans throughout the world.

We have seen that the technology changes very rapidly, and no doubt what is "hip" today will be "passé" in a couple of years' time. Thus, we have to be willing to adapt, not just for the sake of adapting, but to communicate more effectively.
There is certainly an issue of information overload, and perhaps of trying to get all the various tools and platforms out there to talk to each other, so that they do not duplicate what has already been said on another platform.

Among the lessons I will take home is that I need to spread the word to my colleagues working in the field of development: they should try to use more of these tools to communicate more widely and effectively.

Louise Clark, NR International
The next step should be to look out for practical applications in our own locations. It is great to know about, and important for us to be aware of, the new technologies. There is a challenge, however: I had never heard of Web 2.0, and yet I was already using the tools. Our experience in Bolivia is that we did not know the difference between Web 1.0 and Web 2.0, even though we had been uploading files to the web using Web 2.0 applications.
It was good to learn about these tools and also to hear about other people's experiences.

Ethan Zuckerman, Global Voices
I think these international gatherings are good, although I feel that we also need regional gatherings. There is so much innovation on the African continent, and it would be nice to have regional stories. It is expensive to come to Rome, where people are torn between seeing the Colosseum and attending the conference. And though at the end everyone says “we are going to do it,” I feel there is a need to stay in touch as a small group.

Anup Kumar Das, Centre for Studies in Science Policy

I would like to develop some blogs related to our areas of activity, and will create tags and RSS feeds. Personally, I am already using Skype, and students will introduce learning models on Skype so that users can carry on using it.
On blogging, we would like to get feedback from our policy makers on policy-related issues, alongside the other learning tools we can use in our centre.

Tim Kulchyski, Hul’qumi’num
I think, to start with, there were a number of things that focused on language, but there is also a need for more time for face-to-face interaction on the particular subject areas people are working on.
I saw many different skills, and I was particularly impressed with Ethan Zuckerman's presentation. As he said, we can benefit from links, which I think are effective in blogs: the more other people link to you, the easier you are to find through a search engine.

Enrico Bertacchini, Creative Commons
I think this conference is, more or less, about what people in reality should join in: they have to start networking using the tools that this conference suggests. It is a natural revolution; all the people are willing to use and adopt these tools.
I am taking home the idea that Web 2.0 is an emerging concept that many organisations need to embrace in their synergies. What we need along the road is trust, since these are workable tools; 80% of the success of these tools is up to human beings sharing ideas.

Saturday, October 25, 2008

Passing .NET data types in Parameters to Stored Procedure

Structured Query Language is very forgiving in its automatic type conversion, sometimes too much so. Recently I encountered a stored procedure, written by a co-worker, that passed a GUID cast as a string to a varchar(255) stored procedure parameter, which was then directly inserted into a table that expected a uniqueidentifier (SQL for GUID). While the team quickly agreed this sort of type coercion was something best avoided, they also asked me for a mapping guide for .NET data types to SQL Server data types. This blog post is the answer to that request.
Generally speaking, most .NET data types have a counterpart in the SQL world and vice versa, although there are some differences and caveats. For example, there is no Unsigned integer in SQL, and SQL DateTimes have different range limitations from their .NET counterparts. More on that in a little while.
Here is the list of .NET data types and their SQL counterparts. The list is adapted from SQL Server documentation and is presented in a .NET-centric way. It is not necessarily inclusive of all defined SQL data types; rather, the intention is to provide a SQL data type for each .NET data type. (A short parameter-typing example follows the list.)

  • Boolean: pass to a bit. Note that SQL Server packs up to 8 bit columns from a given table row into each byte of data on disk, so storing 8 boolean flags has the same storage cost as a single byte field.
  • Byte[]: pass to a binary(n) if the size is always the same or a varbinary(n) if the size will vary and not exceed 8000 bytes. For Byte[] buffers > 8000 bytes, pass to a varbinary(max).
  • byte: pass to a TinyInt.
  • Byte[1]: pass to a binary(1) or varbinary(1).
  • char, char[1]: pass to a nchar(1) or nvarchar(1).
  • char[], string: pass to a nchar(n) if the size is always the same or a nvarchar(n) if the size will vary and not exceed 8000 bytes. For char[] strings > 8000 bytes long, pass to a nvarchar(max).
  • datetime: pass to a DateTime or a SmallDateTime, depending on the precision or time range to be stored. The DateTime type stores date and time data from January 1, 1753, to December 31, 9999, with an accuracy of one three-hundredth of a second (3.33 milliseconds); values are rounded to increments of .000, .003, or .007 seconds. The SmallDateTime type stores date and time data from January 1, 1900, to December 31, 2079, with an accuracy of one minute. Note that attempting to store the .NET value DateTime.MinValue in a SQL Server database will cause an out-of-range error for both the SQL DateTime and SmallDateTime data types.
  • Decimal: can pass to several data types, depending on the need. Using these types presumes a high degree of a priori knowledge about the values to be stored. If that is not the case, consider a more general-purpose mapping instead, such as Single to real or Double to float.
    The decimal(p,s) data type allows the precision and scale of a decimal number to be specified. The storage size varies depending on the precision and scale chosen.
    The money data type holds values from -2^63/10000 (-922,337,203,685,477.5808) through (2^63/10000)-1 (922,337,203,685,477.5807), with accuracy to 1/10000 of a monetary unit. Storage size is 8 bytes.
    The numeric(p,s) data type holds fixed-precision and fixed-scale numeric data from -10^38+1 through 10^38-1. The p variable specifies precision and can vary between 1 and 38; the s variable specifies scale and can vary between 0 and p. Storage size varies from 5 to 17 bytes depending on the precision.
    The smallmoney data type holds values from -2^31/10000 (-214,748.3648) through (2^31/10000)-1 (214,748.3647), with accuracy to 1/10000 of a monetary unit. Storage size is 4 bytes.
  • Double: pass to a float (double-precision floating-point number).
  • guid: pass to a uniqueidentifier.
  • Int16: pass to a SmallInt.
  • Int32: pass to an Int.
  • Int64: pass to a Bigint.
  • Single: pass to a real (single-precision floating-point number).
  • Object: pass to an sql_variant.
  • Unsigned: there is no SQL Server support for Unsigned integers, except for Unsigned char, which can be stored as a tinyint. Larger Unsigned values may be stored in binary(n) fields.
    For Unsigned Char or UInt8, use tinyint or binary(1).
    For Unsigned Short or UInt16, use binary(2).
    For Unsigned Int or UInt32, use binary(4).
    For Unsigned Long or UInt64, use binary(8).
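
As a quick illustration of why the mapping matters, here is a minimal sketch of the GUID case from the opening anecdote; the table and procedure names are hypothetical. Declaring the parameter as uniqueidentifier rather than varchar(255) keeps the type intact end to end and lets SQL Server reject malformed values at the boundary.

-- Hypothetical table keyed by a GUID.
CREATE TABLE dbo.Customer (
    CustomerId uniqueidentifier NOT NULL PRIMARY KEY,
    Name nvarchar(100) NOT NULL
);
GO

-- Declare the parameter with the real type; no string round-trip needed.
CREATE PROCEDURE dbo.InsertCustomer
    @CustomerId uniqueidentifier,
    @Name nvarchar(100)
AS
BEGIN
    INSERT INTO dbo.Customer (CustomerId, Name)
    VALUES (@CustomerId, @Name);
END
GO

-- On the .NET side, pass a System.Guid with SqlDbType.UniqueIdentifier
-- rather than Guid.ToString() into a varchar parameter.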

For more information, start in SQL Server help at:
ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/denet9/html/89b43ee9-b9ad-4281-a4bf-c7c8d116daa2.htm