1. Bad clusters on a SCSI RAID 5 drive.

    Date: 04/13/10 (IT Professionals)    Keywords: asp, sql, security, microsoft

    I should know this.

    Checking file system on C:
    The type of the file system is NTFS.

    A disk check has been scheduled.
    Windows will now check the disk.
    Cleaning up minor inconsistencies on the drive.
    Cleaning up 57 unused index entries from index $SII of file 0x9.
    Cleaning up 57 unused index entries from index $SDH of file 0x9.
    Cleaning up 57 unused security descriptors.
    CHKDSK is verifying Usn Journal...
    Usn Journal verification completed.
    CHKDSK is verifying file data (stage 4 of 5)...
    Windows replaced bad clusters in file 87
    of name \mssql\MSSQL$~1\Data\DISTRI~1.MDF.
    Windows replaced bad clusters in file 7220
    of name \mssql\MSSQL$~1\REPLDATA\unc\INSIGH~1\201004~1\TB5CD1~1.BCP.
    Windows replaced bad clusters in file 26077
    of name \mssql\MSSQL$~1\REPLDATA\unc\INSIGH~1\201004~1\TBLPDF~1.BCP.
    Windows replaced bad clusters in file 32542
    of name \mssql\MSSQL$~1\REPLDATA\unc\INSIGH~1\201003~1\TB5CD1~1.BCP.
    Windows replaced bad clusters in file 34123
    of name \mssql\MSSQL$~1\REPLDATA\unc\INSIGH~1\200802~1\TB50D9~1.BCP.
    Windows replaced bad clusters in file 59114
    of name \mssql\MSSQL$~1\REPLDATA\unc\INSIGH~1\200904~1\TB4CD1~1.BCP.
    Windows replaced bad clusters in file 66747
    of name \mssql\MSSQL$~1\REPLDATA\unc\INSIGH~1\200904~1\TBLPDF~1.BCP.
    Windows replaced bad clusters in file 306249
    of name \mssql\MSSQL$~1\REPLDATA\unc\INSIGH~1\200608~1\TB50D9~1.BCP.
    Windows replaced bad clusters in file 313926
    of name \mssql\MSSQL$~1\REPLDATA\unc\INSIGH~1\200608~2\TB50D9~1.BCP.
    File data verification completed.
    CHKDSK is verifying free space (stage 5 of 5)...
    Free space verification is complete.
    The size specified for the log file is too small.

    213371743 KB total disk space.
    137811912 KB in 82347 files.
    42892 KB in 6088 indexes.
    0 KB in bad sectors.
    962587 KB in use by the system.
    23040 KB occupied by the log file.
    74554352 KB available on disk.

    4096 bytes in each allocation unit.
    53342935 total allocation units on disk.
    18638588 allocation units available on disk.



    Windows has finished checking your disk.
    Please wait while your computer restarts.


    For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    ~~~~

    This is on my domain controller: an HP RAID 5 array consisting of four 72 GB SCSI disks. How can you get bad clusters on a RAID array? And how can I tell which physical drive is failing?

    Did I actually lose any data or get any data corruption?

    I have backups, of course. The problem is that if it's a hardware failure, and since I'm about to migrate from Windows 2003 to Windows 2008 R2, it will still take some time to get things started. Buying a single replacement SCSI drive might be viable, but if I can't identify the failing drive and have to buy four drives and rebuild the array one disk at a time, it would be problematic, not to mention prone to disaster.

    Source: http://itprofessionals.livejournal.com/90386.html

  2. SQL injection .NET

    Date: 03/13/13 (Web Development)    Keywords: sql

    The pentesters told us that the following code in our e-store is vulnerable to SQL injection:

    create procedure dbo.uspBeAfraidBeVeryAfraid ( @p1 varchar(64) )
    AS
    SET NOCOUNT ON
    declare @sql varchar(512)
    set @sql = 'select * from ' + @p1
    exec(@sql)
    GO


    How should I fix the issue?
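One common fix (a sketch, not the only option; the procedure name here is hypothetical): since @p1 is a table name rather than a value, it cannot be bound as a parameter, so validate it against the catalog and escape it with QUOTENAME() before concatenating. If you can enumerate the tables the store actually needs, a hard whitelist is safer still.

```sql
create procedure dbo.uspBeAfraidNoMore ( @p1 sysname )
AS
SET NOCOUNT ON
-- Reject anything that is not an existing user table.
IF OBJECT_ID(@p1, 'U') IS NULL
BEGIN
    RAISERROR('Invalid table name.', 16, 1)
    RETURN
END
declare @sql nvarchar(512)
-- QUOTENAME() brackets the name and escapes embedded ] characters,
-- so @p1 can no longer smuggle in extra statements.
set @sql = N'select * from ' + QUOTENAME(@p1)
exec sp_executesql @sql
GO
```

Note that QUOTENAME() treats its whole argument as a single identifier, so schema-qualified input like dbo.Products would need to be split first (e.g. with PARSENAME()) and each part quoted separately.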

    Source: http://webdev.livejournal.com/583006.html

  3. Converting MS-Access Forms To PHP

    Date: 09/26/11 (PHP Community)    Keywords: php, database, sql, microsoft

    I've got a Microsoft Access application that I'm going to have to convert to PHP and another database (PostGres or SQLite). The db portion I could do by hand if I needed to (the tables are not complicated). It's the forms that I'm worried about.

    Does anyone have any experience with a tool for converting Access forms to PHP code? I found DB Convert, which looks promising, but I'd like to know about anyone's first-hand experiences.

    Source: http://php.livejournal.com/682814.html

  4. ASCII Convert issue

    Date: 04/16/10 (PHP Community)    Keywords: php, database, sql

    I have a person's name with a non-ASCII character in it (š) and am having trouble converting it back in order to retrieve their record.

    The url string contains: mašalaitis
    But PHP reads it as: maÅ¡alaitis

    So, my SQL is basically not matching what is stored in the database (former) against what the querystring contains (latter). I'm trying to use ORD() based on the specific value that š outputs (154) and add &# to the beginning, and ; after.

    But, this is what it shows when I loop through the whole name:

    109 = m
    97 = a
    197 = Å
    161 = ¡
    97 = a
    108 = l
    97 = a
    105 = i
    116 = t
    105 = i
    115 = s

    Any help is greatly appreciated.

    Source: http://php.livejournal.com/677993.html

  5. Any DBAs in the New York area looking for work?

    Date: 07/12/11 (C Sharp)    Keywords: sql, web

    http://jobsattmp.com/new-york/web-development/dba-sql-server-2005_2008-jobs

    Hey all, I am trying to hire a DBA in NYC with a background in MS SQL Server 2005/2008. If you're interested or know someone who might be, please pass a resume along via the link above. Thanks!

    Source: http://csharp.livejournal.com/108669.html

  6. Bundling changes together.

    Date: 02/21/09 (C Sharp)    Keywords: cms, database, sql, web

    First off, I'll state that I'm working with C# 2008 express edition, SQL Server 2008 Professional edition, and .NET 3.5.

    Second, I'll state that one of the ideas behind this project is to do it without .Net's auto-binding. If it can't be done without auto-binding, it's not going to get done at all, for reasons having to do with extreme customization of the data access code that will be happening way, way down the line. So I need an answer in code, not in designer. Thanks.

    Okay, what I'm doing seems to me to be simple and obvious, but either .Net doesn't agree with me or I'm looking in the wrong place. I have a simple form. It contains a datagridview and three buttons. One button works perfectly--it's the one that closes the form, and we don't need to discuss it here. The other two buttons, however, are giving me fits. They are labelled "save changes" and "cancel".

    The datagridview is bound to a DataTable. This is pretty standard code, I think:

    SqlConnection cnErasmus = new SqlConnection();
    //populate the datatable with the data already in the table.
    tblAuthorType = PopulateDataSet(cnErasmus);
    //Attach DataTable to datagrid.
    dgvAuthorType.DataSource = tblAuthorType;


    (note: it used to be a dataset. I forgot to change the name of the routine.)

    In case you want/need to see the actual filling of the DataTable, I'll put it here:

        private DataTable PopulateDataSet(SqlConnection cnErasmus)
        {
            string strAuthorTypeSelectQuery = "SELECT AuthType FROM AuthorType";
            using (cnErasmus)
            {
                using (SqlCommand cmSelectCommand = new SqlCommand(strAuthorTypeSelectQuery, cnErasmus))
                {
                    CreateConnectionString(cnErasmus);
                    OpenConnection(cnErasmus);
                    using (SqlDataReader theReader = cmSelectCommand.ExecuteReader())
                    {
                        tblAuthorType.Load(theReader);
                        CloseConnection(cnErasmus);
                    }
                }
            }
            return tblAuthorType;
        }



    All of this is completely normal. At least, I think it is. Now, the tough part seems to be logging changes. The obvious way to do so, to me (I'm a database guy more than a programmer) is through a transaction object:

    /******************SqlTransaction trnchangeAuthorTypeData = 
           cnErasmus.BeginTransaction("Changes");
                //Must assign both transaction object and connection
                //to Command object for a pending local transaction.
                SqlCommand cmTransactionCommand = cnErasmus.CreateCommand();
                cmTransactionCommand.Connection = cnErasmus;
                cmTransactionCommand.Transaction = trnchangeAuthorTypeData;
                //Whether transaction is committed or rolled back depends
                //on which button the user presses...********************/


    These are the lines just after setting the dgv's data source.

    I've seen this done at least a dozen times in various places on the web, but in every case, the example has some sort of hardcoded INSERT or DELETE statement immediately following, and then a try/catch block with the appropriate transaction.Commit() or transaction.Rollback() statements. The important bit for me is that in every example I've found, all the statements follow one another. It's all very procedural. I want the Commit() or Rollback() to be fired based on which of those buttons gets pressed; the Cancel button will cause a Rollback() and the Save button will cause a Commit(). To me, this seems perfectly logical. My problem is that, unfortunately, trnchangeAuthorTypeData goes out of scope the second we hit the end of that block of code, and so is nowhere to be found when I get to btnSave_Click or btnCancel_Click. Setting it the normal way (trnchangeAuthorTypeData = new SqlTransaction()) raises an error based on the protection level of SqlTransaction, so I can't declare it the way I declare every other variable. (While no resource I've consulted specifically explains this odd behavior, they all confirm that this is by design.)

    So, after all that, my question is: how do I separate BeginTransaction(), Commit(), and Rollback() into three separate routines? If that's not possible, as I have spent the last twelve hours ascertaining, is there some way other than transactions to make sure that, when the Cancel button is pressed, the system will roll back all changes since either (a) the form was opened or (b) the Save button was last pressed, and that can be done in that manner (with the transaction-analogue starting in the same routine where the datagridview is bound, and finishing in one of the button-press routines)?

    Thanks.

    Source: http://csharp.livejournal.com/101781.html

  7. Newb question about Struts2 & JSP / Tomcat

    Date: 08/01/10 (Apache)    Keywords: mysql, database, sql, jsp, web, linux

    I'm involved in migrating an existing, functional JSP/Struts2 app from Windows to Linux.

    The former Windows environment was a Tomcat/Struts/Eclipse setup. The new environment is a standalone installation of Tomcat (which is already configured and serving several other applications).

    The app in question, "MyJSPWebsite", was copied to the Linux/Tomcat webapps folder and correct permissions assigned. The database (mysql) was also copied over with user permissions established.

    The site now opens, but none of the Struts-enabled content is functioning. For example, a drop-down list of data is not being populated. I'm not seeing any SQL error messages in catalina.out, and the username/password and query work fine from the command line.

    Are there separate, core struts files that have to be installed outside of those already included in the webapps/MyJSPWebsite folder?

    Source: http://apache.livejournal.com/44268.html

  8. Creating a "SQL Job Launch Shell" for lower-privileged users

    Date: 02/11/13 (SQL Server)    Keywords: sql

    This is in response to my question from 2/4/2013, for SQL Server 2000 (it should work in subsequent versions if you follow my comments)

    Design:
    User Table Created w/ Trigger
      CREATE TABLE [dbo].[prod_support_job_queue]  (
        [job_name]     sysname NOT NULL,
        [step_id]      int NOT NULL CONSTRAINT [DF__prod_supp__step___4959E263]  DEFAULT (1),
        [action]       nvarchar(6) NOT NULL, -- must be either START, CANCEL, or STOP
        [ntlogin]      nvarchar(32) NULL, --used to log who made the request
        [log_date]     datetime NULL,
        [processed]    char(1) NOT NULL CONSTRAINT [DF_prod_support_job_queue_processed]  DEFAULT ('N')
        )
    ON [PRIMARY]

    CREATE TRIGGER [dbo].[ti_job_queue] on [dbo].[prod_support_job_queue]
    for insert
    as
       set nocount on

       if ( update(job_name) )
       begin
          declare @username varchar(30)
          declare @log_date datetime
          declare @job_name sysname

          -- Get the user's attributes.
          select @username = loginame
          from master..sysprocesses
          where spid = @@spid

          select @log_date = getdate()
          -- note: assumes single-row inserts
          select @job_name = job_name from inserted

          update prod_support_job_queue
          set log_date = @log_date,
              ntlogin  = @username
          where processed = 'N'
            and job_name = @job_name
       end


    Procedures:

    • check_job_queue - fires off via scheduled SQL job.  It reads from the prod_support_job_queue table
    • make_job_request - procedure exposed to the production support team.  This helps them insert records into the prod_support_job_queue table
    • sp_isJobRunning - (Modified this procedure from THIS publicly available code in order for it to run on SQL 2000)
    Logic:
    1. The user makes his request via the make_job_request stored procedure.  He is required to enter a valid job name and an action (START, STOP, or CANCEL).
    2. check_job_queue runs every 10 minutes to check for new actions in the prod_support_job_queue table.  It utilizes system stored procedures in msdb to start and stop jobs.  For the CANCEL command, a simple update statement is issued against the processed field to exclude the request from further processing checks.
    3. sp_IsJobRunning is called by check_job_queue in order to see if the requested job is already running before issuing any commands

    I am adding fine-tuning to the check_job_queue procedure.  Once that is done, I'll post the code for the two custom procedures check_job_queue and make_job_request

    Source: http://sqlserver.livejournal.com/77452.html

  9. SQL Job Administrators - SQL 2008 R2

    Date: 02/04/13 (SQL Server)    Keywords: asp, sql, microsoft

    I'm thinking about doing this because our number of ad-hoc requests to run jobs has increased to an annoying level.  Does anyone out there have experience putting this into practice?

    How to: Configure a User to Create and Manage SQL Server Agent Jobs
    (SQL Server Management Studio)

    http://msdn.microsoft.com/en-us/library/ms187901%28v=sql.105%29.aspx
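For context, the mechanism that article describes comes down to a few statements (a sketch; the login name is hypothetical, but the three fixed roles are the documented msdb Agent roles):

```sql
USE msdb;
GO
-- Give the login a user in msdb, where Agent metadata lives.
CREATE USER [DOMAIN\JobRunner] FOR LOGIN [DOMAIN\JobRunner];
GO
-- SQLAgentUserRole: manage only jobs the user owns.
-- SQLAgentReaderRole: additionally view (but not run) all jobs.
-- SQLAgentOperatorRole: additionally start/stop any local job.
EXEC sp_addrolemember N'SQLAgentOperatorRole', N'DOMAIN\JobRunner';
GO
```

Whether this actually cuts down the ad-hoc requests depends on the role you pick; SQLAgentOperatorRole is the only one that lets users run jobs they don't own.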

    Source: http://sqlserver.livejournal.com/77287.html

  10. Question regarding Collations

    Date: 06/08/12 (SQL Server)    Keywords: database, sql

    Does anyone in this group have experience working with Unicode, double-byte, case-sensitive data in SQL 2008 R2?

    I would like to select a collation for my database that allows case-sensitive sorting/comparisons with Unicode data that could contain Japanese characters.  Whew...that's hard to say.
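For what it's worth, a case-sensitive Japanese collation can be set at the database or column level (a sketch; the names here are hypothetical, and for Japanese data you may also want kana- and width-sensitivity, i.e. the _KS and _WS suffixes):

```sql
-- Database-level default collation, case- and accent-sensitive:
CREATE DATABASE SalesJP COLLATE Japanese_CS_AS;
GO
-- Or per column; nvarchar stores Unicode regardless of collation,
-- the collation only governs comparison and sort order:
CREATE TABLE dbo.Customers (
    CustomerName nvarchar(100) COLLATE Japanese_CS_AS NOT NULL
);
```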

    Source: http://sqlserver.livejournal.com/76773.html

  11. SQL Server SP2

    Date: 02/28/12 (SQL Server)    Keywords: sql

    Hi,
    I have multiple instances of SQL Server 2008. We are planning to install SP2 on only one instance. What impact will this have on the rest of the instances, and especially on the shared components? Thank you!

    Source: http://sqlserver.livejournal.com/76475.html

  12. Query fun and games

    Date: 02/23/12 (SQL Server)    Keywords: xml, sql

    I've found in general for SQL that there is more than one way to solve (almost) any problem. I've been playing around with query building today and decided to see how many ways I could solve a problem that recurs fairly frequently in my work, flattening subrecords into a single row.

    This is my current standard solution, using the PIVOT function. It's quite fast, but limits you to a specific number of subrecords--it can be a high number, but you still have to decide on a maximum.

    WITH cte AS (SELECT Person.contactid AS 'ID' , Person.FullName AS 'Name'
    , 'Activity' = Activity.a422_rel_activityvalueidname
    , 'Row' = ROW_NUMBER() OVER (PARTITION BY Person.contactid, Person.FullName ORDER BY Activity.a422_rel_activityvalueidname)
    FROM Contact AS Person
    INNER JOIN Task AS Activity ON Person.contactid = Activity.regardingobjectid)
    SELECT ID, Name
    , 'Activity1' = [1], 'Activity2' = [2], 'Activity3' = [3], 'Activity4' = [4], 'Activity5' = [5]
    FROM cte
    PIVOT (MAX(cte.Activity) FOR cte.[Row] IN ([1], [2], [3], [4], [5])) AS pvt


    This is a new solution I found while surfing some SQL Server blogs, using FOR XML PATH to create a CSV list of values. It will include an indefinite number of subrecords, but only includes one field from the subrecords. It's significantly slower than the first example, by at least an order of magnitude.
    SELECT DISTINCT p.contactid AS 'ID' , p.FullName AS 'Name'
    , SUBSTRING((SELECT ', ' + Activity.a422_rel_activityvalueidname
    FROM task AS Activity
    WHERE Activity.regardingobjectid = p.contactid
    FOR XML PATH('')), 2, 4000) AS 'Activities'
    FROM Contact AS p
    INNER JOIN Task AS t ON p.contactid = t.regardingobjectid
    ORDER BY p.contactid
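A common tweak to this pattern (the same idea, sketched with STUFF() instead of SUBSTRING()) strips the leading separator without the arbitrary 4000-character cap, and makes the outer join and DISTINCT unnecessary:

```sql
SELECT p.contactid AS 'ID', p.FullName AS 'Name'
, STUFF((SELECT ', ' + Activity.a422_rel_activityvalueidname
         FROM task AS Activity
         WHERE Activity.regardingobjectid = p.contactid
         FOR XML PATH('')), 1, 2, '') AS 'Activities'
FROM Contact AS p
ORDER BY p.contactid
```

One behavioral difference: without the INNER JOIN, contacts that have no tasks now appear with a NULL Activities column.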


    This ugly looking creature is what I used to use before PIVOT came along, using many, many multiple self-joins. I'm pretty sure I had a slightly more elegant (and faster!) version of this, but it's been a long time since I've had to create one of these things (fortunately). The performance is...not as bad as you might expect.
    SELECT 'ID' = p.contactid, 'Name' = p.fullname
    , 'Activity1' = a1.a422_rel_activityvalueidname
    , 'ActivityDate1' = a1.actualend
    , 'Activity2' = a2.a422_rel_activityvalueidname
    , 'ActivityDate2' = a2.actualend
    , 'Activity3' = a3.a422_rel_activityvalueidname
    , 'ActivityDate3' = a3.actualend
    , 'Activity4' = a4.a422_rel_activityvalueidname
    , 'ActivityDate4' = a4.actualend
    , 'Activity5' = a5.a422_rel_activityvalueidname
    , 'ActivityDate5' = a5.actualend
    FROM Contact AS p
    INNER JOIN Task AS a1
    ON p.contactid = a1.regardingobjectid
    LEFT JOIN Task AS not1
    ON p.contactid = not1.regardingobjectid
    AND not1.activityid < a1.activityid
    LEFT JOIN Task AS a2
    ON p.contactid = a2.regardingobjectid
    AND a2.activityid > a1.activityid
    LEFT JOIN Task AS not2
    ON p.contactid = not2.regardingobjectid
    AND not2.activityid > a1.activityid
    AND not2.activityid < a2.activityid
    LEFT JOIN Task AS a3
    ON p.contactid = a3.regardingobjectid
    AND a3.activityid > a2.activityid
    LEFT JOIN Task AS not3
    ON p.contactid = not3.regardingobjectid
    AND not3.activityid > a2.activityid
    AND not3.activityid < a3.activityid
    LEFT JOIN Task AS a4
    ON p.contactid = a4.regardingobjectid
    AND a4.activityid > a3.activityid
    LEFT JOIN Task AS not4
    ON p.contactid = not4.regardingobjectid
    AND not4.activityid > a3.activityid
    AND not4.activityid < a4.activityid
    LEFT JOIN Task AS a5
    ON p.contactid = a5.regardingobjectid
    AND a5.activityid > a4.activityid
    LEFT JOIN Task AS not5
    ON p.contactid = not5.regardingobjectid
    AND not5.activityid > a4.activityid
    AND not5.activityid < a5.activityid
    WHERE not1.regardingobjectid Is Null
    AND not2.regardingobjectid Is Null
    AND not3.regardingobjectid Is Null
    AND not4.regardingobjectid Is Null
    AND not5.regardingobjectid Is Null


    Using a recursive CTE almost works, except that for each main record it gives a row with one subrecord, another row with two subrecords, a row with three subrecords, and so on for as many subrecords as are available for that main record. It seems like there has to be a way around that, so if you have any ideas, let me know. Performance is not good, not horrible.
    WITH cte AS (SELECT a1.regardingobjectid, a1.activityid
    , 'Activities' = CONVERT(nvarchar(1000), a1.createdon, 113)
    FROM Task AS a1
    INNER JOIN Contact AS p
    ON a1.regardingobjectid = p.contactid
    LEFT JOIN Task AS not1
    ON a1.regardingobjectid = not1.regardingobjectid
    AND a1.activityid > not1.activityid
    WHERE not1.activityid Is Null
    UNION ALL
    SELECT cte.regardingobjectid, a1.activityid
    , 'Activities' = CONVERT(nvarchar(1000), (cte.Activities + N', ' + CONVERT(nvarchar, a1.createdon, 113)))
    FROM cte
    INNER JOIN Task AS a1
    ON cte.regardingobjectid = a1.regardingobjectid
    AND cte.activityid < a1.activityid
    WHERE NOT EXISTS (SELECT *
    FROM Task AS not1
    WHERE cte.regardingobjectid = not1.regardingobjectid
    AND not1.activityid > cte.activityid
    AND not1.activityid < a1.activityid)
    )
    SELECT 'ID' = p.contactid, 'Name' = p.fullname
    , cte.Activities
    FROM cte
    INNER JOIN Contact AS p
    ON cte.regardingobjectid = p.contactid
    ORDER BY p.fullname


    Creating a custom aggregate function in CLR is another solution, but playing with that will have to be another day.

    Source: http://sqlserver.livejournal.com/76055.html

  13. Tracking Database Growth

    Date: 10/10/11 (SQL Server)    Keywords: database, sql

    I came across this article when doing some more research on documenting database growth over time.  It worked really well for me.

    Thank you vyaskn@hotmail.com!

    In this article I am going to explain how to track file growths, especially of the database files. First of all, why is it important to track database file growth? Tracking file growth helps you understand the rate at which your database is growing, so that you can plan ahead for your future storage needs. It is better to plan ahead, instead of running around when you run out of disk space, isn't it? So, how can we track file growths? There are a couple of ways.

    The first approach:
    SQL Server's BACKUP and RESTORE commands store backup and restore history in the msdb database. In this approach, I am going to use the tables backupset and backupfile from msdb to calculate the file growth percentages. Whenever you back up a database, the BACKUP command inserts a row in the backupset table, and one row for every file in the backed-up database in the backupfile table, along with the size of each file. I am going to use these file sizes recorded by the BACKUP command, compare them with the previous sizes, and come up with the percentage of file growth. This approach assumes that you do full database backups periodically, at regular intervals.
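The idea above can be sketched as a direct query against msdb (backupset and backupfile are the documented history tables; the database name is a placeholder):

```sql
-- Data-file size recorded by each full backup; diff consecutive
-- rows per logical file to get the growth rate.
SELECT bs.database_name,
       bf.logical_name,
       bs.backup_finish_date,
       bf.file_size / 1048576.0 AS size_mb
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupfile AS bf
  ON bf.backup_set_id = bs.backup_set_id
WHERE bs.database_name = 'MyDatabase' -- placeholder
  AND bs.type = 'D'      -- full database backups only
  AND bf.file_type = 'D' -- data files, not logs
ORDER BY bf.logical_name, bs.backup_finish_date;
```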


    Click here to download the procedure sp_track_db_growth. ...

    **Please use free code responsibly.  Test and verify before deploying to production!


    Source: http://sqlserver.livejournal.com/75793.html

  14. DBA Position Open in North Texas

    Date: 07/20/11 (SQL Server)    Keywords: technology, database, sql


    Requirements:
    • 1-2 years of total database experience required (preferably Oracle, Sybase, or SQL Server)
    • Experience with Windows Server operating systems
    • Experience with creating SQL scripts and setting up typical database maintenance jobs
    • Experience working with development teams

    The Database Administrator I will participate in departmental projects by assisting in the development of project plans and documentation, and performing project tasks. In support of the other DBAs, he/she will perform daily maintenance tasks and participate in DBA on-call duty. The DBA I position will be responsible for installing and maintaining database technology in a multi-platform, mission-critical environment.

    CONTACT:
    Jennifer Toal
    Research Analyst
    COMTEK-Group
    972-792-1045 Office
    972-467-2901 Mobile
    972-644-6602 Fax
    Source: http://sqlserver.livejournal.com/75585.html

  15. Production SQL DBA Opening in North Texas

    Date: 06/02/11 (SQL Server)    Keywords: database, asp, sql, security, microsoft

    Passing this along for a friend...If you know anyone looking, please let me know.  Pay terms seem to be a little higher than normal for that many years of experience.  

    Responsibilities:

    • Installation, configuration, customization, maintenance and performance tuning of SQL Server 2005 & 2008 including SSIS, SSAS and SSRS.
    • SQL version migration, patching and security management.
    • Monitor database server capacity/performance and make infrastructure and architecture recommendations to management for necessary changes/updates.
    • Perform database optimization, administration and maintenance (partitioning tables, partitioning indexes, indexing, normalization, synchronization, job monitoring, etc).
    • Manage all aspects of database operations including implementation of database monitoring tools, event monitoring, diagnostic analysis, performance optimization routines and top-tier support for resolving support issues.
    • Work with internal IT operations teams to troubleshoot network and server issues and optimize the database environment.
    • Establish and enforce database change management standards including pushes from development to QA, on to production, etc;
    • Proactively stay current with latest technologies and industry best practices associated to the position and responsibilities.
    • Provide development and production support to troubleshoot day-to-day database or related application issues.
    • Develop, implement and verify processes for system monitoring, storage management, backup and recovery.
    • Develop, implement and verify database backup and disaster recovery strategies.
    • Design and implement all database security to ensure integrity and consistency among the various database regions
    • Develop and maintain documentation of the production environment.
    • Manage SLAs and strict adherence to production controls - Sarbanes-Oxley (SOX) monitored via external audits
    Necessary Qualifications:
    • Must have experience on SQL Server 2005.
    • Good exposure on Installation, Configuration of database Clusters, Replication, Log shipping and Mirroring
    • Expertise in Troubleshooting and performance monitoring SQL Server Database server (Query Tuning, Server Tuning, Disk Performance Monitoring, Memory Pressure, CPU bottleneck etc.)
    • Expertise in T-SQL and writing efficient and highly performing SQL Statements.
    • Expertise in SQL Server Internals, wait events, profiler, windows events etc
    • Must have understanding of key infrastructure technologies such as Clustering, SAN Storage, Virtualization, Cloud services etc.

    Other nice to have experience:
    • System administration fundamentals including Installation, Configuration & Security setups.
    • Experience with SQL 2008 a plus.
    • Experienced in architecting high availability, business resumption and disaster recovery solutions
    • Microsoft SQL Server DBA Certification
    • Experience with SCOM/SCCM/SCSM is a plus
    • Extremely self motivated and ability to work within a globally dispersed team.
    Desired Skills:
    • Data Warehouse experience
    • VLDB experience highly desired
    • Experience with databases > 5 TB, processing 2 million + rows of data daily
    • MS SQL Server 2005 Transact-SQL (T-SQL)
    • Stored Procedure Development Communication Skills, work well with the team, and within team processes
    • Database and file size and space forecasting ability
    • Ability to manage a complex database system and assist the client with Database Integration for Future Business Intelligence efforts
    • Confio Ignite Performance
    Education & Work Experience:
    • Bachelor's degree in Computer Science, Business Administration or other
    • 10+ years experience as a Database Administrator 

    Source: http://sqlserver.livejournal.com/75423.html

  16. Datafile Growth in SQL Server - Getting the Statistics Part II

    Date: 03/11/11 (SQL Server)    Keywords: sql

    In our last entry we talked about getting datafile usage in SQL Server. Today, we'll implement sp_file_space in another stored procedure that combines it with the extended stored procedure xp_fixeddrives to calculate the free space, and store the data in two standard tables.

    CREATE PROCEDURE [dbo].[sp_log_spaceused]
    as
    create table #freespace
    (
    drive char(1) null,
    MBfreespace bigint null
    )


    set nocount on

    delete from #freespace
    -- log this servers current space used
    insert into file_space_log exec master.dbo.sp_file_space

    -- log the freespace
    insert into #freespace
    (
    drive,
    MBfreespace
    )
    exec master.dbo.xp_fixeddrives

    -- server_drive_space insert
    insert into free_space_log
    select
    drive,
    MBfreespace
    from #freespace


    GO


    ** Please be responsible with free code.  Test and check before implementing in a production environment
     

    Source: http://sqlserver.livejournal.com/73862.html

  17. How do you track datafile growth?

    Date: 03/09/11 (SQL Server)    Keywords: database, sql

    Here's a good question for data environments today. What methods do you employ to track datafile growth in your SQL Server databases? Do you use a 3rd-party tool, or do you have a home-brew method? I'll share my method once we read about others' ideas.

    Source: http://sqlserver.livejournal.com/73430.html

  18. How do you promote scripts?

    Date: 02/15/11 (SQL Server)    Keywords: sql

     It looks like we haven't had much discussion here in quite a while, so as the community owner, I will try to stir some discussion.  

    How do you promote your SQL scripts throughout your development, test, and prod environments?

    Source: http://sqlserver.livejournal.com/73157.html

  19. How to make an Oracle Linked server on SQL 2000

    Date: 09/28/10 (SQL Server)    Keywords: html, sql

    Posting for the benefit of all after personally pulling my hair out on this one...

    1) Installed the Oracle Instant Client following these wonderful directions:
    http://www.dbatoolz.com/t/installing-oracle-instantclient-basic-and-instantclient-sqlplus-on-win32.html

    2) Restarted my SQL server to re-load the ODBC drivers

    3) Created linked server

    USE master
    go
    EXEC sp_addlinkedserver @server=N'ORCL_SRVR', @srvproduct=N'', @provider=N'MSDASQL', @datasrc=N'ORCL_SRVR'
    go
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='rpc', @optvalue='true'
    go
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='collation compatible', @optvalue='false'
    go
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='data access', @optvalue='true'
    go
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='rpc out', @optvalue='false'
    go
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='use remote collation', @optvalue='true'
    go
    EXEC sp_addlinkedsrvlogin @rmtsrvname=N'ORCL_SRVR', @useself='FALSE', @rmtuser=N'system', @rmtpassword=N'my password'
    go
    IF EXISTS (SELECT * FROM master.dbo.sysservers WHERE srvname=N'ORCL_SRVR')
    PRINT N'<<< CREATED LINKED SERVER ORCL_SRVR >>>'
    ELSE
    PRINT N'<<< FAILED CREATING LINKED SERVER ORCL_SRVR >>>'
    go

    Source: http://sqlserver.livejournal.com/72406.html

  20. SQL Server 2005 - Implement account or IP validation using LOGON TRIGGER

    Date: 11/18/09 (SQL Server)    Keywords: asp, sql, security, web, microsoft

    http://technet.microsoft.com/en-us/sqlserver/dd353197.aspx

    Has anyone implemented security using the LOGON TRIGGER that came out with SQL Server 2005 SP2?

    I'm just curious if anyone has set up this feature to protect their SQL Server from attack through their web servers.
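For context, the technique looks roughly like this (a sketch only; the login name and address list are hypothetical, and a buggy logon trigger can lock out every login, so test carefully and keep the dedicated admin connection available):

```sql
-- Refuse the web app's SQL login unless it connects from an
-- approved web-server address.
CREATE TRIGGER trg_limit_webapp_login
ON ALL SERVER WITH EXECUTE AS 'sa'
FOR LOGON
AS
BEGIN
    IF ORIGINAL_LOGIN() = N'WebAppLogin'
       AND (SELECT client_net_address
            FROM sys.dm_exec_connections
            WHERE session_id = @@SPID) NOT IN ('10.0.0.5', '10.0.0.6')
    BEGIN
        ROLLBACK; -- rolls back the login event, refusing the connection
    END
END;
GO
```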

    Source: http://sqlserver.livejournal.com/71849.html
