1. Data warehouse community

    Date: 12/05/15     Keywords: no keywords

    Dear community members,
    Can anybody help me to find the data warehouse community?

    Source: http://sqlserver.livejournal.com/77637.html

  2. Creating a "SQL Job Launch Shell" for lower-privileged users

    Date: 02/11/13     Keywords: sql

    This is in response to my question of 2/4/2013, for SQL Server 2000 (it should work in subsequent versions if you follow my comments).

    User table, created with a trigger:

      CREATE TABLE [dbo].[prod_support_job_queue]  (
        [job_name]     sysname NOT NULL,
        [step_id]      int NOT NULL CONSTRAINT [DF__prod_supp__step___4959E263]  DEFAULT (1),
        [action]       nvarchar(6) NOT NULL,   -- must be either START, CANCEL, or STOP
        [ntlogin]      nvarchar(32) NULL,      -- used to log who made the request
        [log_date]     datetime NULL,
        [processed]    char(1) NOT NULL CONSTRAINT [DF_prod_support_job_queue_processed]  DEFAULT ('N')
      )

    CREATE TRIGGER [dbo].[ti_job_queue] on [dbo].[prod_support_job_queue]
    for insert
    as
    begin
       set nocount on

       declare @username varchar(30)
       declare @log_date datetime
       declare @job_name sysname

       -- Get the user's attributes.
       select @username = loginame
       from master.dbo.sysprocesses
       where spid = @@spid

       select @log_date = getdate()
       select @job_name = job_name from inserted

       -- Stamp the new request with who made it and when.
       update prod_support_job_queue
       set ntlogin = @username,
           log_date = @log_date,
           processed = 'N'
       where job_name = @job_name
    end


    • check_job_queue - fires off via scheduled SQL job.  It reads from the prod_support_job_queue table
    • make_job_request - procedure exposed to the production support team.  This helps them insert records into the prod_support_job_queue table
    • sp_isJobRunning - (modified this procedure from THIS publicly available code in order for it to run on SQL 2000)
    1. The user makes his request via the make_job_request stored procedure.  He is required to enter a valid job name and an action (either START, STOP, or CANCEL).
    2. check_job_queue runs every 10 minutes to check for new actions in the prod_support_job_queue table.  It utilizes system stored procedures in msdb to start and stop jobs.  For the CANCEL command, a simple update statement is issued to the processed field to exclude the request from further processing checks.
    3. sp_IsJobRunning is called by check_job_queue in order to see if the requested job is already running before issuing any commands
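
    A minimal sketch of what the inner loop of such a check_job_queue procedure might look like, assuming the table above and the msdb job procedures; this is a guess at the flow the steps describe, not the author's forthcoming code (which also consults sp_IsJobRunning first):

    ```sql
    -- Hypothetical core of check_job_queue: walk unprocessed requests
    -- and hand them to the msdb job procedures.
    declare @job_name sysname, @action nvarchar(6)

    declare job_cur cursor local fast_forward for
        select job_name, action
        from prod_support_job_queue
        where processed = 'N'

    open job_cur
    fetch next from job_cur into @job_name, @action
    while @@fetch_status = 0
    begin
        if @action = N'START'
            exec msdb.dbo.sp_start_job @job_name = @job_name
        else if @action = N'STOP'
            exec msdb.dbo.sp_stop_job @job_name = @job_name
        -- CANCEL falls through: marking it processed below is enough
        -- to exclude it from further processing checks.

        update prod_support_job_queue
        set processed = 'Y'
        where job_name = @job_name and processed = 'N'

        fetch next from job_cur into @job_name, @action
    end
    close job_cur
    deallocate job_cur
    ```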

    I am still fine-tuning the check_job_queue procedure.  Once that is done, I'll post the code for the two custom procedures, check_job_queue and make_job_request.

    Source: http://sqlserver.livejournal.com/77452.html

  3. SQL Job Administrators - SQL 2008 R2

    Date: 02/04/13     Keywords: asp, sql, microsoft

    I'm thinking about doing this because our number of ad-hoc requests to run jobs has increased to an annoying level.  Does anyone out there have experience putting this into practice?

    How to: Configure a User to Create and Manage SQL Server Agent Jobs
    (SQL Server Management Studio)


    Source: http://sqlserver.livejournal.com/77287.html

  4. Question regarding Collations

    Date: 06/08/12     Keywords: database, sql

    Does anyone in this group have experience working with Unicode, double-byte, case-sensitive data in SQL 2008 R2?

    I would like to select a collation for my database that allows case-sensitive sorting/comparisons with Unicode data that could contain Japanese characters.  Whew...that's hard to say.
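
    For reference, a collation along those lines can be applied at database creation; Japanese_CS_AS (or the newer Japanese_90_CS_AS on 2008 R2) gives case- and accent-sensitive comparisons over Unicode data. The database and column names below are just placeholders:

    ```sql
    -- Placeholder database name; Japanese_CS_AS = Japanese rules,
    -- case-sensitive (CS), accent-sensitive (AS).
    CREATE DATABASE SampleDb COLLATE Japanese_CS_AS

    -- The same collation can also be set per column:
    CREATE TABLE dbo.Customers (
        name nvarchar(100) COLLATE Japanese_CS_AS NOT NULL
    )
    ```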

    Source: http://sqlserver.livejournal.com/76773.html

  5. SQL Server SP2

    Date: 02/28/12     Keywords: sql

    I have multiple instances of SQL Server 2008. We are planning to install SP2 on only one instance. What impact will there be on the rest of the instances, especially on the shared components? Thank you!!!

    Source: http://sqlserver.livejournal.com/76475.html

  6. Query fun and games

    Date: 02/23/12     Keywords: xml, sql

    I've found in general for SQL that there is more than one way to solve (almost) any problem. I've been playing around with query building today and decided to see how many ways I could solve a problem that recurs fairly frequently in my work, flattening subrecords into a single row.

    This is my current standard solution, using the PIVOT function. It's quite fast, but limits you to a specific number of subrecords--it can be a high number, but you still have to decide on a maximum.

    WITH cte AS (SELECT Person.contactid AS 'ID' , Person.FullName AS 'Name'
    , 'Activity' = Activity.a422_rel_activityvalueidname
    , 'Row' = ROW_NUMBER() OVER (PARTITION BY Person.contactid, Person.FullName ORDER BY Activity.a422_rel_activityvalueidname)
    FROM Contact AS Person
    INNER JOIN Task AS Activity ON Person.contactid = Activity.regardingobjectid)
    SELECT ID, Name
    , 'Activity1' = [1], 'Activity2' = [2], 'Activity3' = [3], 'Activity4' = [4], 'Activity5' = [5]
    FROM cte
    PIVOT (MAX(cte.Activity) FOR cte.[Row] IN ([1], [2], [3], [4], [5])) AS pvt

    This is a new solution I found while surfing some SQL Server blogs, using FOR XML PATH to create a CSV list of values. It will include an indefinite number of subrecords, but only includes one field from each subrecord. It's slower than the first example by at least a factor of ten.
    SELECT DISTINCT p.contactid AS 'ID' , p.FullName AS 'Name'
    , SUBSTRING((SELECT ', ' + Activity.a422_rel_activityvalueidname
    FROM task AS Activity
    WHERE Activity.regardingobjectid = p.contactid
    FOR XML PATH('')), 2, 4000) AS 'Activities'
    FROM Contact AS p
    INNER JOIN Task AS t ON p.contactid = t.regardingobjectid
    ORDER BY p.contactid

    This ugly looking creature is what I used to use before PIVOT came along, using many, many multiple self-joins. I'm pretty sure I had a slightly more elegant (and faster!) version of this, but it's been a long time since I've had to create one of these things (fortunately). The performance is...not as bad as you might expect.
    SELECT 'ID' = p.contactid, 'Name' = p.fullname
    , 'Activity1' = a1.a422_rel_activityvalueidname
    , 'ActivityDate1' = a1.actualend
    , 'Activity2' = a2.a422_rel_activityvalueidname
    , 'ActivityDate2' = a2.actualend
    , 'Activity3' = a3.a422_rel_activityvalueidname
    , 'ActivityDate3' = a3.actualend
    , 'Activity4' = a4.a422_rel_activityvalueidname
    , 'ActivityDate4' = a4.actualend
    , 'Activity5' = a5.a422_rel_activityvalueidname
    , 'ActivityDate5' = a5.actualend
    FROM Contact AS p
    INNER JOIN Task AS a1
    ON p.contactid = a1.regardingobjectid
    LEFT JOIN Task AS not1
    ON p.contactid = not1.regardingobjectid
    AND not1.activityid < a1.activityid
    LEFT JOIN Task AS a2
    ON p.contactid = a2.regardingobjectid
    AND a2.activityid > a1.activityid
    LEFT JOIN Task AS not2
    ON p.contactid = not2.regardingobjectid
    AND not2.activityid > a1.activityid
    AND not2.activityid < a2.activityid
    LEFT JOIN Task AS a3
    ON p.contactid = a3.regardingobjectid
    AND a3.activityid > a2.activityid
    LEFT JOIN Task AS not3
    ON p.contactid = not3.regardingobjectid
    AND not3.activityid > a2.activityid
    AND not3.activityid < a3.activityid
    LEFT JOIN Task AS a4
    ON p.contactid = a4.regardingobjectid
    AND a4.activityid > a3.activityid
    LEFT JOIN Task AS not4
    ON p.contactid = not4.regardingobjectid
    AND not4.activityid > a3.activityid
    AND not4.activityid < a4.activityid
    LEFT JOIN Task AS a5
    ON p.contactid = a5.regardingobjectid
    AND a5.activityid > a4.activityid
    LEFT JOIN Task AS not5
    ON p.contactid = not5.regardingobjectid
    AND not5.activityid > a4.activityid
    AND not5.activityid < a5.activityid
    WHERE not1.regardingobjectid Is Null
    AND not2.regardingobjectid Is Null
    AND not3.regardingobjectid Is Null
    AND not4.regardingobjectid Is Null
    AND not5.regardingobjectid Is Null

    Using a recursive CTE almost works, except that for each main record it gives a row with one subrecord, another row with two subrecords, a row with three subrecords, and so on for as many subrecords as are available for that main record. It seems like there has to be a way around that, so if you have any ideas, let me know. Performance is not good, not horrible.
    WITH cte AS (SELECT a1.regardingobjectid, a1.activityid
    , 'Activities' = CONVERT(nvarchar(1000), a1.createdon, 113)
    FROM Task AS a1
    INNER JOIN Contact AS p
    ON a1.regardingobjectid = p.contactid
    LEFT JOIN Task AS not1
    ON a1.regardingobjectid = not1.regardingobjectid
    AND a1.activityid > not1.activityid
    WHERE not1.activityid Is Null
    UNION ALL
    SELECT cte.regardingobjectid, a1.activityid
    , 'Activities' = CONVERT(nvarchar(1000), (cte.Activities + N', ' + CONVERT(nvarchar(30), a1.createdon, 113)))
    FROM cte
    INNER JOIN Task AS a1
    ON cte.regardingobjectid = a1.regardingobjectid
    AND cte.activityid < a1.activityid
    WHERE NOT EXISTS (SELECT 1 FROM Task AS not1
    WHERE cte.regardingobjectid = not1.regardingobjectid
    AND not1.activityid > cte.activityid
    AND not1.activityid < a1.activityid))
    SELECT 'ID' = p.contactid, 'Name' = p.fullname
    , cte.Activities
    FROM cte
    INNER JOIN Contact AS p
    ON cte.regardingobjectid = p.contactid
    ORDER BY p.fullname
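
    One possible way around the partial rows (a sketch, untested against this schema): keep only the row for each contact whose anchor activity has no later sibling, so the fully accumulated list survives and the shorter prefixes drop out. Wrapping the final SELECT:

    ```sql
    -- Keep only the longest accumulation per contact: the row whose
    -- activityid has no later activity for the same regardingobjectid.
    SELECT 'ID' = p.contactid, 'Name' = p.fullname
    , cte.Activities
    FROM cte
    INNER JOIN Contact AS p
    ON cte.regardingobjectid = p.contactid
    WHERE NOT EXISTS (SELECT 1 FROM Task AS later
        WHERE later.regardingobjectid = cte.regardingobjectid
        AND later.activityid > cte.activityid)
    ORDER BY p.fullname
    ```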

    Creating a custom aggregate function in CLR is another solution, but playing with that will have to be another day.

    Source: http://sqlserver.livejournal.com/76055.html

  7. Tracking Database Growth

    Date: 10/10/11     Keywords: database, sql

    I came across this article when doing some more research on documenting database growth over time.  It worked really well for me.

    Thank you vyaskn@hotmail.com!

    In this article I am going to explain how to track file growths, especially the database files. First of all, why is it important to track database file growth? Tracking file growth helps you understand the rate at which your database is growing, so that you can plan ahead for your future storage needs. It is better to plan ahead, instead of running around when you run out of disk space, isn't it? So, how can we track file growths? There are a couple of ways.

    The first approach:
    SQL Server BACKUP and RESTORE commands store the backup and restore history in the msdb database. In this approach, I am going to use the tables backupset and backupfile from msdb to calculate the file growth percentages. Whenever you back up a database, the BACKUP command inserts a row in the backupset table and one row for every file in the backed-up database in the backupfile table, along with the size of each file. I am going to use these file sizes recorded by the BACKUP command, compare them with the previous sizes, and come up with the percentage of file growth. This approach assumes that you do full database backups periodically, at regular intervals. 
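
    The idea can be seen in a simple query over the two msdb tables (a sketch of the approach, not the downloadable procedure itself):

    ```sql
    -- File sizes recorded by each full backup, oldest first.
    -- Comparing consecutive rows per file gives the growth rate.
    SELECT bs.database_name,
           bf.logical_name,
           bs.backup_start_date,
           file_size_mb = bf.file_size / 1048576   -- file_size is in bytes
    FROM msdb.dbo.backupset AS bs
    INNER JOIN msdb.dbo.backupfile AS bf
        ON bs.backup_set_id = bf.backup_set_id
    WHERE bs.type = 'D'   -- full database backups only
    ORDER BY bs.database_name, bf.logical_name, bs.backup_start_date
    ```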

    Click here to download the procedure sp_track_db_growth. ...

    **Please use free code responsibly.  Test and verify before deploying to production!

    Source: http://sqlserver.livejournal.com/75793.html

  8. DBA Position Open in North Texas

    Date: 07/20/11     Keywords: technology, database, sql

    Requirements:
    • 1-2 years of total database experience required (preferably Oracle, Sybase, or SQL)
    • Experience with Windows Server operating systems
    • Experience with creating SQL scripts and setting up typical database maintenance jobs
    • Experience working with development teams

    The Database Administrator 1 will participate in departmental projects by assisting in the development of project plans, documentation, and performing of project tasks. In support of the other DBAs, he/she will perform daily maintenance tasks and participate in DBA on-call duty. The DBA I position will be responsible for installing and maintaining database technology in a multi-platform mission-critical environment.

    Jennifer Toal
    Research Analyst
    972-792-1045 Office
    972-467-2901 Mobile
    972-644-6602 Fax
    Source: http://sqlserver.livejournal.com/75585.html

  9. Production SQL DBA Opening in North Texas

    Date: 06/02/11     Keywords: database, asp, sql, security, microsoft

    Passing this along for a friend... If you know anyone looking, please let me know.  Pay terms seem to be a little higher than normal for that many years of experience.  


    • Installation, configuration, customization, maintenance and performance tuning of SQL Server 2005 & 2008 including SSIS, SSAS and SSRS.
    • SQL version migration, patching and security management.
    • Monitor database server capacity/performance and make infrastructure and architecture recommendations to management for necessary changes/updates.
    • Perform database optimization, administration and maintenance (partitioning tables, partitioning indexes, indexing, normalization, synchronization, job monitoring, etc).
    • Manage all aspects of database operations including implementation of database monitoring tools, event monitoring, diagnostic analysis, performance optimization routines and top-tier support for resolving support issues.
    • Work with internal IT operations teams to troubleshoot network and server issues and optimize the database environment.
    • Establish and enforce database change management standards including pushes from development to QA, on to production, etc;
    • Proactively stay current with latest technologies and industry best practices associated to the position and responsibilities.
    • Provide development and production support to troubleshoot day-to-day database or related application issues.
    • Develop, implement and verify processes for system monitoring, storage management, backup and recovery.
    • Develop, implement and verify database backup and disaster recovery strategies.
    • Design and implement all database security to ensure integrity and consistency among the various database regions
    • Develop and maintain documentation of the production environment.
    • Manage SLAs and strict adherence to production controls - Sarbanes-Oxley (SOX) monitored via external audits
    Necessary Qualifications:
    • Must have experience on SQL Server 2005.
    • Good exposure on Installation, Configuration of database Clusters, Replication, Log shipping and Mirroring
    • Expertise in Troubleshooting and performance monitoring SQL Server Database server (Query Tuning, Server Tuning, Disk Performance Monitoring, Memory Pressure, CPU bottleneck etc.)
    • Expertise in T-SQL and writing efficient and highly performing SQL Statements.
    • Expertise in SQL Server Internals, wait events, profiler, windows events etc
    • Must have understanding of key infrastructure technologies such as Clustering, SAN Storage, Virtualization, Cloud services etc.

    Other nice to have experience:
    • System administration fundamentals including Installation, Configuration & Security setups.
    • Experience with SQL 2008 a plus.
    • Experienced in architecting high availability, business resumption and disaster recovery solutions
    • Microsoft SQL Server DBA Certification
    • Experience with SCOM/SCCM/SCSM is a plus
    • Extremely self motivated and ability to work within a globally dispersed team.
    Desired Skills:
    • Data Warehouse experience
    • VLDB experience highly desired
    • Experience with databases > 5 TB, processing 2 million + rows of data daily
    • MS SQL Server 2005 Transact-SQL (T-SQL)
    • Stored Procedure Development
    • Communication Skills, work well with the team, and within team processes
    • Database and file size and space forecasting ability
    • Ability to manage a complex database system and assist the client with Database Integration for Future Business Intelligence efforts
    • Confio Ignite Performance
    Education & Work Experience:
    • Bachelor's degree in Computer Science, Business Administration or other
    • 10+ years experience as a Database Administrator 

    Source: http://sqlserver.livejournal.com/75423.html

  10. Microsoft Tech-Ed

    Date: 05/12/11     Keywords: no keywords

     Anyone going to this next week?  Is there anything I can report back to the group about if you are interested in going but can't?

    Microsoft Tech-Ed 2011

    Source: http://sqlserver.livejournal.com/75106.html

  11. Starting with the Specs: Building Solid Code Review Procedure

    Date: 05/06/11     Keywords: database, security

    In our last entry, we introduced the concept of code review procedures.  Our first topic to consider in this life cycle is for the developer to take some time to understand the Business Requirements and Functional Context.  These two critical tasks should, in a perfect world, be understood by all DBAs in the SDLC of database code, but the developer has a unique opportunity to let his/her code communicate these requirements and context through coding best practices and adequate documentation.  Some items a developer, or a peer, can look for in performing these two steps are the following:

    Satisfying Business Requirements & Functional Context

    • Has a knowledgeable user been consulted during the planning/architecture phase of code creation?

    • Did the architect make specifications for future growth and change needs of the application?

    • Has the developer reviewed the business requirements?

    • Do the developer and the business have the same understanding for required performance of the application?

    • Does the reviewer understand the code being reviewed?

    • Does your code adhere to corporate coding specifications? (Yes, this is a business requirement, too.)

    • At what layer in your business environment does the code execute?

    • Does the piece of code functionally achieve the stakeholder's need as documented in the project charter?

    • What is the data size and volume worked with in this code?

    • What are the data archival requirements?

    • Have company security policies been complied with?

    • How will the application or change be installed and configured?

    • By what method will the development team preserve and version the code and objects affected?

    ( Thanks to [info]adina_atl for assisting with the checklist )

    Source: http://sqlserver.livejournal.com/74884.html

  12. Building Solid Code Review Procedures

    Date: 04/18/11     Keywords: security

    Does your organization use any code review procedures when promoting code from Development through to Production?  If you work for a publicly-held corporation, you know this all too well.  If you work for a privately-held company, you can benefit from this procedure as well.  Whether you are a developer wanting to make sure all your i's are dotted and all your t's are crossed, or a DBA in operations getting ready to deploy new code, solid code review procedures can really save your butt from major mistakes that could cost you hours of work.

    First, let's get a general idea of what a code review life cycle will look like and who is generally involved.  

    1. Unit Test - Validate Business Requirements and Functional Context
    2. Unit Test - Validate Syntax, Entry and Exit points
    3. Unit Test & QA - Consistency - Perform Version Control & Transactional Analysis
    4. QA & Prod - Security - Permissions, Back-out and Error Handling
    5. QA & Prod - Performance - Validate Memory Usage, Thread Usage and Control Structure usage
    6. Prod - Documentation - Is there a header, change log, and comments?  Are deployment instructions required and included?

    Source: http://sqlserver.livejournal.com/74700.html

  13. Datafile Growth in SQL Server - Checking the Statistics

    Date: 03/18/11     Keywords: no keywords

    In our last two topics, we discussed gathering the space statistics.

    Now we need to devise a procedure called sp_check_usage that will check our statistics and calculate when we expect to run out of space.  We will also use two new tables, drive_check_log and server_space_report, to record our data for reporting purposes.  For the first day, we will not have any data to report, because we need a second day's worth of statistics to compare against.

    Now the new procedure will make a couple of passes at the stored data, use some temp tables to do a comparison, then will write a report you can send or review.

    CREATE PROCEDURE [dbo].[sp_check_usage]
    as

    set nocount on

    declare @since_date datetime,
    @rc1 INT,
    @files INT,
    @cmd VARCHAR(256),
    @datfile1 VARCHAR(128),
    @subject nvarchar(500)

    set @since_date = dateadd(d,-7,getdate())

    SELECT @datfile1 = '\RPT' + CONVERT(CHAR(8),GETDATE(),112) + '.CSV'

    -- group by days
    select drive = t1.drive,
    datestamp = convert(varchar,datestamp,111),
    MBFreeSpace = sum(MBFreeSpace) / count(*),
    sample_count = count(*)
    into #x1
    from free_space_log t1 inner join drive_check_log t2
    on t1.id = t2.id
    where t2.datestamp >= @since_date
    group by t1.drive, convert(varchar,datestamp,111)

    -- uncomment to debug
    --select * from #x1

    -- get a day's growth
    select drive = t1.drive,
    avg_per_day =
    sum(t2.MBFreeSpace - t1.MBFreeSpace ) / count(*)
    into #x3
    from #x1 t1 inner join #x1 t2 on
    t1.drive = t2.drive and
    t1.datestamp = convert(varchar,dateadd(d,1,t2.datestamp),111)
    group by t1.drive

    -- uncomment to debug
    --select * from #x3

    -- get the latest sample date
    select drive = t1.drive,
    last_sample_id = max(t1.id),
    CurrentMBFree = convert(bigint, 0)
    into #x2
    from free_space_log t1 (nolock) inner join drive_check_log t2 (nolock)
    on t1.id = t2.id
    group by t1.drive

    -- set the current free space
    update #x2 set
    CurrentMBFree = MBFreeSpace
    from free_space_log t1 (nolock)
    where #x2.drive = t1.drive and
    #x2.last_sample_id = t1.id

    -- uncomment to debug
    --select * from #x2

    -- This is where the procedure produces the output

    truncate table server_space_report

    insert server_space_report (drive, days_left)
    select drive = #x2.drive,
    days_left =
    case when #x3.avg_per_day = 0 then 'low or no usage'
    when #x3.avg_per_day < 0 then 'negative usage'
    else 'Out of space in ' +
    replicate(' ', 6 - datalength(convert(varchar, round((#x2.CurrentMBFree / #x3.avg_per_day),0)))) +
    convert(varchar, round((#x2.CurrentMBFree / #x3.avg_per_day),0)) + ' days'
    end
    from #x2
    inner join #x3 on
    #x2.drive = #x3.drive
    order by #x2.drive

    -- uncomment to debug
    -- select * from server_space_report

    set nocount off

    ** Please be responsible with free code. Test and check before implementing in a production environment

    Source: http://sqlserver.livejournal.com/74374.html

  14. Datafile Growth in SQL Server - Getting the Statistics Part II

    Date: 03/11/11     Keywords: sql

     In our last entry we talked about getting datafile usage in SQL Server.  Today, we'll wrap sp_file_space in another stored procedure that combines it with the extended stored procedure xp_fixeddrives to calculate the free space and store the data in two standard tables.

    CREATE PROCEDURE [dbo].[sp_log_spaceused]
    as

    set nocount on

    create table #freespace (
    drive char(1) null,
    MBfreespace bigint null
    )

    -- log this server's current space used
    insert into file_space_log
    exec master.dbo.sp_file_space

    -- log the freespace
    insert into #freespace
    exec master.dbo.xp_fixeddrives

    -- server_drive_space insert
    insert into free_space_log
    select * from #freespace


    ** Please be responsible with free code.  Test and check before implementing in a production environment

    Source: http://sqlserver.livejournal.com/73862.html

  15. Datafile Growth in SQL Server - Getting the Statistics Part I

    Date: 03/10/11     Keywords: database

     We create a database called ADMIN which stores our administrative information such as file space statistics. We use a combination of extended stored procedures and publicly-available code to log these statistics. Here is a sample:

    /* Get current space statistics. You can run this and store the results in a holding table. */

    CREATE PROCEDURE [dbo].[sp_file_space] @server_name sysname, @id int
    as

    declare @dbname sysname
    declare @cmd varchar(700),
    @lname_len int,
    @fname_len int,
    @fgroup_len int

    set nocount on

    select @cmd = 'use [?] select ' + convert(varchar,@id) + ',''' + rtrim(@server_name) + ''',db_name(),logical_name = name,fileid,upper(substring(filename,1,1)),filename,
    filegroup = filegroup_name(groupid),
    size_in_KB = size * 8,
    maxsize_in_KB = (case maxsize when -1 then 0
    else maxsize * 8 end),
    growth = (case status & 0x100000 when 0x100000 then growth else growth * 8 end),
    KB_growth_flag = (case status & 0x100000 when 0x100000 then 0 else 1 end) ,
    usage = (case status & 0x40 when 0x40 then ''Log Only'' else ''Data Only'' end)
    from sysfiles order by fileid'

    exec sp_MSforeachdb @command1=@cmd

    return 0


    ** Please be responsible with free code. Test and check before implementing in a production environment

    Source: http://sqlserver.livejournal.com/73512.html

  16. How do you track datafile growth?

    Date: 03/09/11     Keywords: database, sql

     Here's a good question for data environments today.  What methods do you employ to track datafile growth in your SQL Server databases?  Do you use a 3rd-party tool, or do you have a home-brew method?  I'll share my method once we read about others' ideas.

    Source: http://sqlserver.livejournal.com/73430.html

  17. How do you promote scripts?

    Date: 02/15/11     Keywords: sql

     It looks like we haven't had much discussion here in quite a while, so as the community owner, I will try to stir some discussion.  

    How do you promote your SQL scripts throughout your development, test, and prod environments?

    Source: http://sqlserver.livejournal.com/73157.html

  18. Need a hack for changing default db

    Date: 11/08/10     Keywords: database, security

    I have a user who locked himself out of a database because his default db went into suspect mode.  His security policy was nice enough to bar anyone in the Windows Administrators group from logging in to the db.  He says he can't remember the two passwords for the administrative logins currently assigned to System Administrator  role on the server.  Any hope here?  I think he's screwed, personally.
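
    One avenue worth trying before declaring him screwed: only the automatic "use default db" step is failing, so any account that can still open a session (the login name below is a placeholder) can repoint the stranded login at a healthy database:

    ```sql
    -- Works on SQL 2000 and later (ALTER LOGIN replaces it from 2005 on).
    -- Run from any session that can still connect with admin rights.
    EXEC sp_defaultdb @loginame = N'stranded_login', @defdb = N'master'
    ```

    That still requires one working privileged login, of course; if the policy bars Windows admins and both SQL admin passwords are forgotten, he may indeed be out of luck.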

    Source: http://sqlserver.livejournal.com/72656.html

  19. How to make an Oracle Linked server on SQL 2000

    Date: 09/28/10     Keywords: html, sql

    Posting for the benefit of all after personally pulling my hair out on this one...

    1) Installed the Oracle Instant Client following these wonderful directions:

    2) Restarted my SQL server to re-load the ODBC drivers

    3) Created linked server

    USE master
    EXEC sp_addlinkedserver @server=N'ORCL_SRVR', @srvproduct=N'', @provider=N'MSDASQL', @datasrc=N'ORCL_SRVR'
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='rpc', @optvalue='true'
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='collation compatible', @optvalue='false'
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='data access', @optvalue='true'
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='rpc out', @optvalue='false'
    EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='use remote collation', @optvalue='true'
    EXEC sp_addlinkedsrvlogin @rmtsrvname=N'ORCL_SRVR', @useself='FALSE', @rmtuser=N'system', @rmtpassword=N'my password'
    IF EXISTS (SELECT * FROM master.dbo.sysservers WHERE srvname=N'ORCL_SRVR')
        PRINT N'Linked server ORCL_SRVR created.'  -- quick sanity check
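
    Once created, the link can be smoke-tested with a pass-through query; the DUAL query is the generic Oracle connectivity check (any four-part table name you use instead is your own):

    ```sql
    -- Runs on the Oracle side and proves the link works end to end
    SELECT * FROM OPENQUERY(ORCL_SRVR, 'SELECT sysdate FROM dual')
    ```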

    Source: http://sqlserver.livejournal.com/72406.html

  20. SQL Server 2005 - Implement account or IP validation using LOGON TRIGGER

    Date: 11/18/09     Keywords: asp, sql, security, web, microsoft


    Has anyone implemented security using the LOGON TRIGGER that came out with SQL Server 2005 SP2?

    I'm just curious whether anyone has set up this feature to protect their SQL Server from attack through their web servers.
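
    For anyone who hasn't seen the feature, a minimal example of the shape it takes; the login name and IP are made up, and a bad filter can lock everyone out, so keep a DAC connection open while testing:

    ```sql
    -- Reject a specific app login unless it arrives from the approved host.
    -- EVENTDATA() exposes the client address for LOGON events.
    CREATE TRIGGER deny_web_host
    ON ALL SERVER
    FOR LOGON
    AS
    BEGIN
        IF ORIGINAL_LOGIN() = N'web_app_login'
           AND EVENTDATA().value(
               '(/EVENT_INSTANCE/ClientHost)[1]', 'nvarchar(48)')
               <> N'10.0.0.5'
            ROLLBACK;   -- refuses the connection
    END
    ```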

    Source: http://sqlserver.livejournal.com/71849.html
