Wednesday, November 17, 2010
Auto-populate Document Properties in Cells of an Excel Sheet
Here is a tip to automate it:
Open the Excel workbook --> Press Alt + F11 --> Insert New Module --> Paste the functions mentioned below (one per module).
Module1
--------------------------------------------------------------
Function CusProps(prop As String)
    'Returns the value of a custom document property, or the #VALUE! error if it does not exist
    Application.Volatile
    On Error GoTo err_value
    CusProps = ActiveWorkbook.CustomDocumentProperties(prop)
    Exit Function
err_value:
    CusProps = CVErr(xlErrValue)
End Function
Module2
--------------------------------------------------------------
Function BinProps(prop As String)
    'Returns the value of a built-in document property (e.g. Author), or the #VALUE! error
    Application.Volatile
    On Error GoTo err_value
    BinProps = ActiveWorkbook.BuiltinDocumentProperties(prop)
    Exit Function
err_value:
    BinProps = CVErr(xlErrValue)
End Function
Now use the functions in worksheet cells.
Formula to insert the file name:
=MID(CELL("filename",A1),FIND("[",CELL("filename",A1))+1,FIND("]",CELL("filename",A1))-FIND("[",CELL("filename",A1))-1)
Formula to insert the RCSRevision custom property:
=CusProps("RCSRevision")
Formula to insert the Author:
=BinProps("Author")
Note: a custom property such as RCSRevision must already be defined in the workbook (Workbook Properties --> Advanced Properties --> Custom tab); otherwise CusProps returns the #VALUE! error.
Tuesday, September 21, 2010
Encrypted Phone Calls & Skype Security
Sunday, September 19, 2010
We Crib, We Cry but Why? (Part 2)
- 1. The signed, printed copy of the complaint form submitted above.
- 2. A postal order of Rs 10/- in favour of the Office of the Provident Fund Commissioner, New Delhi.
- 3. Copies of all the communications and responses received from the Regional Provident Fund offices.
- 4. A written application to the PF Commissioner (template enclosed in this article).
14, Bhikaiji Cama Place,
New Delhi – 110 066.
Saturday, September 18, 2010
Speed up Linux Disk I/O
By default, many drives run with 16-bit I/O support; running hdparm /dev/hdc shows, for example:
/dev/hdc:
I/O support = 0 (default 16-bit)
Use the following command to test your disk's speed:
hdparm -Tt /dev/hdc (substitute your drive's name for /dev/hdc)
Then enable 32-bit I/O support and DMA:
hdparm -c 1 -d 1 /dev/hdc (use your drive's device name)
If the command succeeds, you'll see a message like this:
/dev/hdc:
setting 32-bit I/O support flag to 1
setting using_dma to 1 (on)
I/O support = 1 (32-bit)
using_dma = 1 (on)
Try the hdparm -Tt /dev/hdc command to see how much improvement you've obtained. If you're happy with the result, repeat this command for additional drives, if any.
To commit the successful settings, use the same command with the -k option, as in the following example:
hdparm -c 1 -d 1 -k 1 /dev/hdc
Because these settings are lost when Linux reboots, we may wish to put this command into a system initialization script, such as /etc/rc.d/rc.local. If you modify this script, be careful that you don't erase any of the existing code!
I am still searching for the setting to enable 64-bit I/O, but I do not have a 64-bit edition of Linux to experiment with. Once I am able to figure out the setting I will update this post. If you are aware of it, please let me know or post it in a comment.
Thursday, September 16, 2010
Grilling the grl’s
Monday, September 13, 2010
Cloud Computing: A New Era in IT Transformation
- First, it will strip the encapsulation of the data. With existing technology we keep wrapping data again and again to make it compliant with the latest technology, and we unnecessarily increase the data overhead: if you look at the communication protocols in use today, they spend 20-50% additional bytes (encryption, addressing, CRC or anything else) to ensure the original data stays intact and reaches the right part of the system. Now is the time to offload all these burdens from the actual data set.
- Now split the original data set into a number of pieces, which go to different storage service providers (from 2 to N, depending upon the criticality and sensitivity of the data).
- Encrypt the split data sets with encryption keys.
- Exchange the encryption keys and data sets.
- Store the keys and data with the service providers.
The same steps apply to data held in database tables:
- Strip the data set from its encapsulation cocoon.
- Split the data set to be stored in the database tables of the service providers, based on some logic (3 or more pieces).
- Encrypt the data sets with encryption keys.
- Exchange the encryption keys.
- Store the keys and data with the service providers (a small illustrative sketch follows this list).
After HIPAA, is the HITECH Act now really bothering healthcare professionals?
Wow! Isn't it quite a good bonus? Let's figure out what it is.
1. Implement a data classification policy approved and communicated by management.
2. Implement a process to detect any potential data breaches and initiate timely incident response activities.
3. Implement a risk assessment and analysis method to identify the significance of the risk (financial, reputational, or any other harm to individuals from potential breaches).
4. Implement a notification process.
5. Implement policies, processes and procedures for filing complaints to ensure compliance.
6. Last but not least, encrypt data at rest and in transit, in any form.
For more details, refer to the Federal Register, Part 2, Department of Health and Human Services.
Friday, July 09, 2010
Why are things cyclic in this universe?
Monday, April 26, 2010
Disabling the Camera and Video Recorder on a BlackBerry Bold
For the last few days we have been struggling to disable the BlackBerry Bold camera feature. I searched lots of forums and discussion boards, and most of them talk about using the enterprise service. But what if your organization's BlackBerry Enterprise Server management team works like a government agency, so bureaucratic that it would take 3-4 months to have this feature disabled via the enterprise server? Still, being a good corporate citizen and wanting to set an example, you are interested in having it disabled.
Yes, you will get suggestions like putting a drop of epoxy on the lens or physically removing the camera lens, but taking such steps will void your device warranty. After a small three-day research effort I found an easy and affordable way to disable the camera: use the following three commands and your device's camera and video recorder will be gone. But remember, if a user upgrades the device software by connecting it to a computer, you have to repeat these three steps again.
JAVALOADER -u Erase -f net_rim_bb_camera.cod
After running this command your device will reboot. Once it's up and running, run the following command, then repeat the process a third time with the last command.
JAVALOADER -u Erase -f net_rim_bb_videorecorder.cod
JAVALOADER -u Erase -f net_rim_bb_mediarecorder.cod
Now be happy: the camera and video recorder features are disabled. javaloader is a command-line tool you can get with the BlackBerry Java developer tools.
Thursday, February 25, 2010
10 Steps to Achieve a Successful DLP Implementation
A DLP implementation usually has 9-12 process steps. Some of the steps are sequential and some can be completed in parallel. They are as follows:
1) Identify what type of solution we require. (1-3 months, based on enterprise size and partner agreements.)
There are many different types of products on the market that promise to solve DLP, such as hard drive encryption products or endpoint port control solutions. While they may address one of the ways that data loss can occur, they do not address the issue the way a content-aware DLP solution will. Content-aware DLP solutions focus on controlling the content or data itself. Some of them are already in use and some are in the deployment phase (data in motion / at rest / at the endpoint, single channel or enterprise wide, etc.).
2) Identify the information we want to protect. (Usually the most expensive and time-consuming step in the entire deployment, varying between 6 months and 2 years. Success stories and case studies show that R&D and intellectual property protection takes 5-8 months, financial data protection takes almost 1-3 years, and healthcare and PII protection 2-5 years.) This step has three sub-steps: identification, discovery and classification.
Data Identification
DLP solutions include a number of techniques for identifying confidential or sensitive information, based on metadata or signature scanning; metadata scanning is the most common enforcement technique in deployments. People are sometimes confused by the term discovery: data identification is the process by which organizations use a DLP technology to determine what to look for (in motion, at rest, or in use). A DLP solution uses multiple methods for deep content analysis, ranging from keywords, dictionaries, and regular expressions to partial document matching and fingerprinting. The strength of the analysis engine directly correlates with its accuracy, and the accuracy of DLP identification is important for lowering or avoiding false positives and negatives. Examples include data sheets used in the HR payroll department, customer data sheets used by operations, company balance sheets, new business work order agreement procedure documents, and so on.
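As a minimal illustration of the keyword and regular-expression style of content analysis mentioned above (not any particular vendor's engine; the pattern, keyword list and sample text are made up for the example), consider this Python sketch:
--------------------------------------------------------------
# Illustrative content-analysis sketch: the kind of keyword and regex matching a
# content-aware DLP engine applies to data in motion, at rest, or in use.
import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){15}\d\b")             # simplistic 16-digit card pattern
KEYWORDS = ("confidential", "payroll", "balance sheet")          # hypothetical keyword dictionary


def matched_rules(text):
    """Return the detection rules that this snippet of content triggers."""
    hits = []
    if CARD_PATTERN.search(text):
        hits.append("possible card number")
    hits.extend(kw for kw in KEYWORDS if kw in text.lower())
    return hits


sample = "Confidential: payroll sheet with card 4111 1111 1111 1111"
print(matched_rules(sample))    # ['possible card number', 'confidential', 'payroll']
--------------------------------------------------------------
Real DLP engines layer dictionaries, partial document matching and fingerprinting on top of this, which is what drives the accuracy discussed above.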
Data Discovery
Meaning: identify where the data sleeps.
Discovering where sensitive data lives is most important when dealing with unstructured data. If data has structure, then locating that data is only necessary for risk assessment: if the data can be detected using structured patterns on a server, file system, document repository, or other system, then that information can be discovered through a loss vector. With unstructured data, however, the information must be located first so it can be identified when it leaks. One particular challenge is file servers. The use of file servers and shares always starts with the best intentions of keeping things organized, but unless the data users are diligent in placing files in the appropriate shares, it will be difficult to identify which shares need to be protected and which can be ignored. Ideally, each group in an organization will have shares assigned around job functions and data classifications.
Document management systems are less of a challenge, since they impose a certain degree of organization on their content by virtue of their structure. Browsing through the structure of a data repository and identifying the administrators for the various sections should allow us to quickly discover which documents are sensitive and which are not. Discovery of data poses far more political challenges than technical ones. While defining data discovery, the following points should be considered (a small discovery-scan sketch follows these points).
Understand what is practically achievable. Rather than perfection, aim for what is achievable; for example, rather than discovering and classifying every piece of potentially sensitive data, we might focus on high-risk data like credit card information and customer data.
Involve key players early. Involving key stakeholders early in the process increases the likelihood that they will support it during implementation.
Strictly restrict metadata removal tools on our devices. As DLP usage increases, more and more metadata removal tools are popping up on the internet. Deploy strict controls to restrict the usage of metadata removal tools.
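As a rough idea of what a discovery pass over a file share could look like, here is a small Python sketch; the mount point, file extensions and pattern are hypothetical, and a real DLP discovery module would be far more capable:
--------------------------------------------------------------
# Illustrative discovery sketch: walk a locally mounted file share and flag files whose
# readable text matches a simple sensitive-data pattern, to help decide which shares matter.
import os
import re

SENSITIVE = re.compile(r"\b(?:\d[ -]?){15}\d\b")    # same toy card-number pattern as earlier
TEXT_EXTENSIONS = {".txt", ".csv", ".log"}           # hypothetical: plain-text files only


def discover(share_root):
    """Return the paths under share_root whose contents match the sensitive-data pattern."""
    flagged = []
    for folder, _subdirs, files in os.walk(share_root):
        for name in files:
            if os.path.splitext(name)[1].lower() not in TEXT_EXTENSIONS:
                continue
            path = os.path.join(folder, name)
            try:
                with open(path, "r", errors="ignore") as handle:
                    if SENSITIVE.search(handle.read()):
                        flagged.append(path)
            except OSError:
                continue                             # skip unreadable files
    return flagged


print(discover("/mnt/hr_share"))                     # hypothetical mount point for an HR share
--------------------------------------------------------------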
Data Classification
Data classification plays a very critical role in the success of a DLP implementation. Most organizations think that defining the classification labels is enough, but they should also consider classification tools that attach metadata (also called meta tags) to all the files used in the organization. Once the classification meta tags are observed by the DLP controller, it starts executing the preventive and informative policies; either deny or allow rules can be defined to reduce the processing load on the system. The organization should identify and appoint a designated document management officer who can address concerns regarding the classification criteria and vouch for ambiguous document classifications.
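A tiny Python sketch of how classification meta tags could drive allow/deny decisions (the labels and actions are hypothetical, not any product's policy format):
--------------------------------------------------------------
# Illustrative classification sketch: map classification meta tags to actions so that
# cheap allow/deny rules can run before any deep content inspection is needed.
CLASSIFICATION_RULES = {
    "public": "allow",          # no inspection needed
    "internal": "monitor",      # log the transfer, but do not block it
    "confidential": "deny",     # block egress and raise an incident
}


def action_for(meta_tag):
    """Look up the action for a file's classification tag; unknown or missing tags fail closed."""
    return CLASSIFICATION_RULES.get((meta_tag or "").lower(), "deny")


for tag in ("Public", "Confidential", None):
    print(tag, "->", action_for(tag))
--------------------------------------------------------------
Applying the tag lookup before content inspection is what reduces the processing load mentioned above.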
Now, some readers might be interested to know which products are available in the market to accomplish the above three. With my limited knowledge, I have seen the Websense Data Security solutions available in the Websense Data Security Suite*, which comprises four fully-integrated modules that can be deployed based on customer need:
Websense Data Discover – Discovers and classifies data distributed throughout the enterprise.
3) Establish why the content needs to be protected. (About a month of good discussion is required between the different stakeholders; this is the second major involvement of the management group.) Here we have to define why the identified data should be protected, for example: is it for compliance reasons or for protection of intellectual property? This could change not only how the content is identified but also how it is reported on. For compliance, we have to ensure that we meet not only the data coverage required, like credit card numbers and other personally identifiable information (PII) as required for PCI DSS compliance, but also the reporting requirements for the auditing process. This is going to be a critical step in the success of our DLP solution, so we need to give it the time it deserves.
4) Identify how data is currently lost. (Again, a proper FMEA and risk analysis gives better results; consider a month for this process step too. The major involvement here is from technical staff.) This will help us determine the type of product to use. Is it through email? Is it being uploaded to websites such as web email or blog sites? Is it the use of USB sticks on our endpoints? The most important advice here is not to try to solve every possibility for data loss that we can think of. We have to remember that what we are trying to stop is the accidental loss of data. Stopping the deliberate loss of data is significantly more difficult and will quite definitely have a serious impact on our business; if a user is resourceful and knowledgeable enough, they will find a way. An audience that many companies forget about is remote users and the devices they use off-site. People are more bold and daring when they are not in their organization's office.
5) Technical DLP policy creation. (Usually 2-8 months; market research shows that a consulting firm like Accenture accomplishes this in 8 weeks, while organizations like DuPont and Ranbaxy are able to define all their DLP policies in a 16-18 week time frame with their DLP experts and partners.) This is where we get down to the implementation. Once the solution is installed, we look at how to create policy that recognizes the actual content we want to control and then at how it will be controlled. The steps we have gone through above will tell us what should be in the policy and how we can prevent the information from leaking out of our organization.
6) Testing. (Two months of regression testing is sufficient.) Like any other IT implementation, testing is a major factor in ensuring success.
7) Policy communication. A step many miss, but one I would consider crucial to success in our organization. Employees need to be brought into the project to guarantee success. It will impact their day-to-day functions, so we need to be certain they understand why these controls are in place and support their use. This can be as simple as explaining why we are implementing such a control and what could happen if we didn't. Obtain their feedback on the controls and on how we might minimise the impact on their work.
8) DLP system policy enforcement. (Two months, sequential after testing.) Now that we have created the policy, tested it and communicated it, the time has come to throw the big switch from just monitoring controls to actively enforcing them. Don't turn them all on at once: prioritise them and release the most important and critical ones first. Ensure we have plenty of coverage to rectify any issues not found in testing as they arise, as this will impact the employees who are trying to do their jobs. If we are not helpful or responsive, our employees' support will vanish.
9) Future-proofing for the organization. (Ongoing.) We have taken the first steps here, but don't assume our job is done. Look for better ways of classifying content, or for where different types of content are saved. When new applications or systems are installed, consider how we can implement them to simplify the DLP controls required. Also continue to pay attention to the evolution of our DLP product; keep it up to date, as there will be newer and better ways of implementing the controls we have in place.
Sunday, January 24, 2010
Absolutely Brilliant Interview Answers of a Job Hopper!!!
Some, rather most, organizations reject his CV today because he has changed jobs frequently (10 in 14 years). My friend, the ‘job hopper’ (referred to here as Mr. JH), does not mind it... well, he does not need to mind it at all. Having worked full-time with 10 employer companies in just 14 years gives Mr. JH the relaxing edge that most of the ‘company loyal’ employees are struggling for today. Today, Mr. JH too is laid off, like some other 14-15 year experienced guys – the difference being that the latter have worked in just 2-3 organizations in the same number of years. Here are the excerpts of an interview with Mr. JH:
Q : Why have you changed 10 jobs in 14 years?
A : To get financially sound and stable before getting laid off the second time.
Q : So you knew you would be laid off in the year 2009?
A : Well, I was laid off the first time in the year 2002 due to the first global economic slowdown. I had not got a full-time job before January 2003, when the economy started looking up; so I had struggled for almost a year without a job and with compromises.
Q : Which number of job was that?
A : That was my third job.
Q : So from Jan 2003 to Jan 2009, in 6 years, you have changed 8 jobs to make the count as 10 jobs in 14 years?
A : I had no other option. In my first 8 years of professional life, I had worked for only 2 organizations, thinking that jobs are deserved after a lot of hard work and that one should stay with an employer company to justify the saying ‘employer loyalty’. But I was an idiot.
Q : Why do you say so?
A : My salary in the first 8 years went up only marginally. I could not save enough and also, I had thought that I had a ‘permanent’ job, so I need not worry about ‘what will I do if I lose my job’. I could never imagine losing a job because of economic slowdown and not because of my performance. That was January 2002.
Q : Can you brief us on what happened between January 2003 and 2009?
A : Well, I had learnt my lessons of being ‘company loyal’ and not ‘money earning and saving loyal’. But then you can save enough only when you earn enough. So I shifted my loyalty towards money making and saving – I changed 8 jobs in 6 years assuring all my interviewers about my stability.
Q : So you lied to your interviewers; you had already planned to change the job for which you were being interviewed on a particular day?
A : Yes, you can change jobs only when the market is up and companies are hiring. You tell me – can I get a job now because of the slowdown? No. So one should change jobs for higher salaries only when the market is up because that is the only time when companies hire and can afford the expected salaries.
Q : What have you gained by doing such things?
A : That's the question I was waiting for. In Jan 2003, I had a fixed salary (without variables) of say Rs. X p.a. In January 2009, my salary was 8X. So assuming my salary was Rs.3 lakh p.a. in Jan 2003, my last drawn salary in Jan 2009 was Rs.24 lakh p.a. (without variable). I never bothered about variable as I had no intention to stay for 1 year and go through the appraisal process to wait for the company to give me a hike.
Q : So you decided on your own hike?
A : Yes, in 2003, I could see the slowdown coming again in future like it had happened in 2001-02. Though I was not sure by when the next slowdown would come, I was pretty sure I wanted a ‘debt-free’ life before being laid off again. So I planned my hike targets on a yearly basis without waiting for the year to complete.
Q : So are you debt-free now?
A : Yes, I earned so much by virtue of job changes for money and spent so little that today I have a loan-free 2 BR flat (1200 sq. feet) plus a loan-free big car, without bothering about any EMIs. I am laid off too, but I do not complain at all. If I have laid off companies for money, it is OK if a company lays me off because of lack of money.
Q : Who is complaining?
A : All those guys who are not getting a job to pay their EMIs off are complaining. They had made fun of me saying I am a job hopper and do not have any company loyalty. Now I ask them what they gained by their company loyalty; they too are laid off like me and pass comments to me – why will you bother about us, you are already debt-free. They were still in the bracket of 12-14 lakh p.a. when they were laid off.
Q : What is your advice to professionals?
A : Like Narayan Murthy said – love your job and not your company, because you never know when your company will stop loving you. Along the same lines, love yourself and your family's needs more than the company's needs. Companies can keep coming and going; family will always remain the same. Make money for yourself first and simultaneously make money for the company, not the other way around.
Q : What is your biggest pain point with companies?
A : When a company does well, its CEO will address the entire company saying, “Well done guys, it is YOUR company, keep up the hard work, I am with you.” But when the slowdown happens and the company does not do so well, the same CEO will say, “It is MY company and to save the company, I have to take tough decisions including asking people to go.” So think about your financial stability first; when you get laid off, your kids will complain to you and not your boss.
Monday, January 11, 2010
Why Valve Replacement Gives Big Bucks to Doctors Rather than to Mechanics
A famous heart surgeon had taken his car to a garage and was standing off to the side, waiting for the service manager to come to take a look at his car.
The mechanic shouted across the garage," Hello Doctor!! Please come over here for a minute."
The famous surgeon, a bit surprised, walked over to the mechanic.
The mechanic straightened up, wiped his hands on a rag and asked argumentatively, "So doctor, look at this. I also open hearts, take valves out, grind 'em, put in new parts, and when I finish this will work as a new one... So how come you get the big money, when you and me is doing basically the same work?"
The doctor leaned over and whispered to the mechanic.....
.
.
.
.
.
.
.
...
..
..
The doctor said: "Try to do it when the engine is RUNNING!"